How Governments Are Trying to Keep Young Children Off Social Media, From Face Scans to ID Checks

Lawmakers in the U.S., Europe propose age-verification tools, sparking debate over digital rights

Governments across the U.S. and Europe are moving to tighten online age restrictions following evidence that companies aren’t effectively enforcing limits on what children see and do on social media.

State and national lawmakers from Salt Lake City to Paris are looking to make services such as Meta Platforms’ Instagram and ByteDance’s TikTok take steps to verify ages when users sign up. Some legislatures also are requiring companies to collect parental consent for users under a certain age.

The momentum portends a sea change in how children and adults access the internet, and could end the days when avoiding online age restrictions required only clicking a button or filling in a birth date.

The new rules also are sparking debate between child-protection advocates, who are concerned about social media’s impact on mental health, and digital-rights groups, who argue that verifying ages creates privacy risks and could discourage access to useful information.

Many of the new rules target pornography websites. But another major driver is evidence of rising preteen use of social media, even though most such platforms say they don’t allow users under 13, in part to comply with U.S. federal law.

A third or more of children ages 8 to 12 say they use social-media and video sites including YouTube, TikTok and Instagram, according to surveys in recent years in the U.S., U.K., France and Ireland conducted for child-protection advocates and regulators.

Those advocates say social-media use opens children up to content and social pressures that are difficult for adults to handle, with surveys showing significant rates of children being bullied and communicating with strangers online.

Instagram says it requires everyone to be at least 13 years old before they can create an account. YouTube and TikTok also require users to be at least 13 before creating profiles on their main services, though both offer alternatives for children.

On Instagram, The Wall Street Journal found dozens of accounts that self-identified as being younger than 13 and remained active until the Journal flagged them to Meta’s communications team.

“Hey guyz I’m kaira welcome to my world I’m 12,” stated one Instagram account.

Instagram makes it hard to report underage accounts: a user flagging such an account must fill out a webpage that asks for the child’s birth date and full name.

“This is a very high burden for reporting,” said Jennifer King, a researcher at the Stanford Institute for Human-Centered Artificial Intelligence who studies how design choices alter user behaviors.

Meta says it invests in AI to detect underage users and trains its moderators to remove them manually in response to user reports.

“We don’t allow people under 13 to use Instagram, and we have numerous methods to remove underage accounts,” spokesman Andy Stone said, adding that the company is “evaluating new ways to improve reporting to remove underage accounts faster while improving accuracy.”

By Sam Schechner and Jeff Horwitz

https://www.wsj.com/articles/as-preteens-ignore-social-media-age-limits-governments-push-for-better-checks-b21f5ae7?mod=tech_listb_pos1
