Meta says it is expanding the use of artificial intelligence to identify underage users on Facebook and Instagram as part of a broader effort to improve child safety protections, while the company faces growing legal and political scrutiny.
As more teenagers spend time on social media platforms, one ongoing challenge for technology companies is determining whether users are actually the age they claim to be. While most major platforms require users to be at least 13 years old, verifying age online has long been difficult because users typically enter their own birthdate when creating an account.
To address that problem, Meta said it is developing AI-powered tools designed to identify accounts that may belong to underage users, even if those users entered a false age.
“We want to make sure that people on our platforms are over the age of 13, and so now with AI, we’ve been able to use new technology to really look beyond things like just someone admitting I’m 12 years old,” Jennifer Hanley, the head of Safety Policy North America for Meta, told KIRO Newsradio. “Instead, we’re able to look at a whole variety of different signals so that we can help detect underage accounts.”
Hanley told KIRO Newsradio the company’s technology analyzes a wide range of signals to estimate a user’s age: not only profile details, but also activity on Instagram Live and Reels, Facebook groups, comments, captions, and bios, in order to identify and remove accounts opened by underage users. Meta is also using data from photos and other digital images.
“We’re not recognizing you with your facial recognition. We’re looking for signals that someone’s bone structure or height might indicate that they’re a child and not an adult, for example,” Hanley explained. “They might be misrepresenting that they’re an adult, but actually be someone who’s like 10 or 11. We’re trying to find those accounts so that we can deactivate them.”
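To illustrate the general approach Hanley describes, here is a minimal, hypothetical sketch of how several independent signals might be combined into a single age estimate and compared against the age a user stated at signup. The signal names, weights, and threshold below are assumptions for illustration only and do not reflect Meta’s actual system.

```python
# Hypothetical sketch of multi-signal age estimation.
# Signal names, weights, and thresholds are illustrative assumptions,
# not a description of Meta's real models.
from dataclasses import dataclass


@dataclass
class AgeSignals:
    stated_age: int               # age the user entered at signup
    text_age_estimate: float      # hypothetical model score from bios, comments, captions
    image_age_estimate: float     # hypothetical model score from photos (no identity recognition)
    activity_age_estimate: float  # hypothetical score from groups, Reels, and Live activity


def estimate_age(signals: AgeSignals) -> float:
    """Combine the separate signals into one weighted age estimate (illustrative weights)."""
    return (0.4 * signals.text_age_estimate
            + 0.3 * signals.image_age_estimate
            + 0.3 * signals.activity_age_estimate)


def flag_for_review(signals: AgeSignals, minimum_age: int = 13) -> bool:
    """Flag an account when the combined estimate contradicts the stated age."""
    return signals.stated_age >= minimum_age and estimate_age(signals) < minimum_age


if __name__ == "__main__":
    # Example: a user who claims to be 18 but whose signals suggest a younger age.
    account = AgeSignals(stated_age=18, text_age_estimate=11.5,
                         image_age_estimate=12.0, activity_age_estimate=13.0)
    print(flag_for_review(account))  # True -> would be queued for review or deactivation
```

In practice, a real system would rely on learned models rather than fixed weights, but the basic pattern of aggregating many weak signals into one decision is what distinguishes this approach from simply trusting a self-reported birthdate.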
Privacy concerns arise
Privacy advocates, however, warn that the technology could create new concerns about how much information companies collect and analyze about users. In response to Meta’s announcement, Suzanne Bernstein, policy counsel at the Electronic Privacy Information Center (EPIC), said that AI-based age-verification systems may raise privacy concerns if platforms rely heavily on behavioral or biometric data to estimate someone’s age.
“I think it’s definitely jarring to see the extent to which AI can be used in a way that really scrutinizes a lot of information,” Bernstein said. “All the information that we post or that we provide to Meta using the platform, you know, we don’t have much control over how that information is used, and we don’t have as many privacy protections as you might think.”
Meta’s expanded safety push comes as more than a dozen countries around the globe, including 12 in Europe, are passing laws restricting teenagers’ access to social media apps. Australia led that charge in 2024, passing the Online Safety Amendment (Social Media Minimum Age) Act, which bars children under 16 from platforms including Facebook, Instagram, TikTok, Snapchat, X, Reddit, and others. Companies that fail to block underage accounts could face fines of up to roughly A$50 million.
Age-based social media laws in the U.S.
In the U.S., the Children’s Online Privacy Protection Act (COPPA) prohibits companies from collecting personal information from children under 13 without parental consent, though critics argue American regulations remain less restrictive than those in many other countries. Many legal and technology experts believe that could soon change as individual states adopt laws similar to those abroad.
Meta told KIRO Newsradio it would welcome federal guidelines and controls that involve both children and their parents or guardians before an app is downloaded. The company also characterized regulation as a shared responsibility among all social media platforms.
“We’re one company. There are so many different platforms out there that teens use, and so we actually think that the government has an important role to play, because understanding age is an industry-wide issue,” Hanley said. “Everyone is trying to figure it out, and we think that governments can actually create a framework by passing legislation that allows for age verification and parental approval at the app store level, kind of having one centralized place for families, making it much easier for parents.”
Meta also faces increasing legal pressure over allegations involving children’s safety and mental health. In March, a jury in New Mexico found Meta liable in a case alleging the company failed to adequately protect children from online exploitation. Separately, jurors in California found Meta and YouTube liable for platform features that plaintiffs argued contributed to addiction and mental health harms among young users.
Seattle attorney Matt Bergman, who has represented families in litigation, said courts are increasingly examining whether platforms intentionally design features that keep children engaged online for extended periods of time.
“Anyone who has kids experiences the consequences of social media addiction and some of the parents experience horrific consequences,” Bergman said.
Meta declined to comment on the ongoing litigation and is appealing the verdicts. However, the company says it is reimagining its experiences for teenage users.
“With today’s age announcement, we also have notifications that will go to parents to help them have a conversation with their teen about why it’s so important to have that accurate age, and for parents to understand how to look for what age their teen may be putting in place,” Hanley said. “We’re really trying to make things as easy as possible and get them educated and informed to help them.”
The debate underscores the growing tension between improving child safety online, protecting user privacy, and defining how much responsibility technology companies should bear for the experiences of young users on social media platforms.
“We do feel a strong responsibility. We’re very committed to doing this,” Hanley said. “That’s why we have been investing in tools and resources and safety for, you know, more than a decade and a few years ago.”
This story was originally posted on MyNorthwest.com
©2026 Cox Media Group