Instagram introduced updates Tuesday designed to give parents more oversight of what their teenagers are doing online, but the changes did little to assuage critics who say the social media network is compromising adolescents’ well-being.
“We are changing the experience for millions of teens on our app,” Antigone Davis, Meta’s global head of safety, said in an interview. “We’re reimagining that parent-child relationship online in response to what we heard from parents about how they parent, or want to be able to parent.”
But the new teen accounts were largely panned Tuesday by regulators and activists pushing the social media giant to take bolder steps to protect young people from harmful content and from becoming hooked on the app — and to face accountability when those measures fail.
“It’s actually not the parents’ job to make sure that the content for kids is safe. I think it is Instagram’s job,” said Arturo Béjar, a former consultant for Meta who testified before Congress last year about his failed efforts to push the company to embrace stronger youth protections after his then-teenage daughter experienced abusive comments on Instagram.
Instagram’s new tools were announced a day before a key House committee is scheduled to weigh amendments to the Kids Online Safety Act and other child privacy legislation that the Senate passed in July; if the House follows suit, the bills would amount to the most significant update to relevant federal law in decades. The legislation reflects rising concerns among parents and political leaders of both parties that social media sites are contributing to a youth mental health crisis in the United States, and that tech companies prioritize keeping younger users engaged over their safety and well-being.
Last year, 41 states and D.C. sued Meta, alleging that the company harms children by building addictive features into Instagram and Facebook while exposing them to harmful content. School districts and hundreds of families have also sued Meta over how its services have affected young people.
Connecticut Attorney General William Tong (D), one of the dozens of state law enforcement officials who have sued Meta over its child safety practices, said the moves were “not nearly enough.” Meta, he said, “is doing the very least it can do to look like it is doing something.”
Previn Warren, a Motley Rice lawyer who represents families who have sued Meta over how it has handled youth safety issues, welcomed the measures but said that “they should — and could — have been implemented years ago, before the hundreds of families that we represent experienced the harms of social media addiction.”
Meta’s new default protections for teens are meant to help address long-standing allegations that Instagram is intentionally designed to keep teens addicted to its services while hurting their well-being. But some experts say teens may find ways around the new standards, which they say leave some safety issues unaddressed. Some critics earlier suggested that tougher rules for teens on Instagram could infringe on their privacy and free speech.
“All this is better than it was before,” said Zvika Krieger, a former director of Meta’s responsible innovation team who now works as a consultant for technology companies. “I don’t want to say that it’s worthless or cosmetic, but I do think that it doesn’t solve all the problems.”
Some child safety advocates criticized Instagram’s changes as an effort to ward off more stringent regulation.
“We hope lawmakers will not be fooled by this attempt to forestall legislation,” Josh Golin, executive director of the advocacy group Fairplay, said in a statement Tuesday.
While action in Congress has stalled, dozens of states have passed their own laws expanding safety and privacy guardrails for children and teens online. But many of those laws have been challenged and halted in court after industry groups argued that they ran afoul of users’ First Amendment free-speech protections.
Meta and other social media giants have long struggled to gain traction for their parental control tools. Child safety advocates have argued that the companies impose weak default settings for teens while forcing parents to do the heavy lifting to protect their children. Meta’s own internal research has demonstrated that parents often lack the time and technological understanding to properly supervise their children’s online activities. By the end of 2022, fewer than 10 percent of teens on Instagram had enabled the parental supervision setting, The Washington Post has reported. Meta has repeatedly declined to offer statistics on how many teens are being supervised by their parents on Instagram.
Other companies’ parental control adoption rates are also low. Earlier this year, Snapchat CEO Evan Spiegel told Congress that 20 million teens in the United States use the messaging app but that only 400,000 have linked their accounts to their parents, representing a 2 percent adoption rate.
Under Meta’s new approach, the company will force both new sign-ups and existing users under 18 into the new “teen accounts” with stricter safety default standards. Meta, which has often shied away from changing the default settings of existing teen users, expects that millions of teenagers will end up in the new accounts.
All “teen accounts” will be set to private by default. Users who are 16 and 17 years old can make those accounts public, but users under 16 won’t be able to do so without parental approval. By requiring younger teens to obtain supervision before changing any of their safety settings, the company expects that more teens’ accounts will be supervised by their parents.
“What we heard in talking to parents is how their parenting … and involvement shifts as a teen matures,” Davis said. “This is really designed to reflect that shift of the parent-child relationship.”
The new rules won’t immediately apply to Meta’s original social network, Facebook, which is larger than Instagram but less popular among American teenagers.
The company plans to place teens in the new restricted Instagram accounts within 60 days in the United States, Britain, Canada and Australia, and in other regions in January. Meta, which also owns WhatsApp, said it would put other apps under similar new rules in the coming months.
Meta also said it plans to use artificial intelligence to proactively find teens it suspects of lying about their age. As part of that effort, the company says it will more often ask suspected teenagers to verify their identity through an outside contractor, Yoti, one of several companies that ask users to take video selfies or hold up government IDs to verify their ages. Critics have expressed concerns about privacy lapses and fairness issues for young people who might not have an ID.
For users who end up in Instagram’s new teen accounts, the company will stop notifications between the hours of 10 p.m. and 7 a.m. That feature, called “sleep mode,” replaces earlier features that reminded teens to close Instagram at night. That change might not quell parents’ concerns that teenagers will simply keep checking the app when they should be sleeping, Krieger said.
The company will also start reminding teens to get off the app after an hour of use each day, streamlining its existing approach to time-limit reminders. Parents will be allowed to block their teens’ Instagram use after a certain amount of time, or within certain windows of time.
The company is also asking teens to proactively choose the topics — such as the arts or sports — they favor for recommended content. Supervising parents will be able to see what topics their teens have chosen to see.
Parents will now also be able to see whom their child has recently been messaging, but not the messages themselves. Previously, they could see only lists of whom their children follow, who follows them and which accounts are blocked.
Meta’s new teen accounts largely streamline and build on its other teen safety efforts. In recent years, the company has tightened its standards to show teens less “sensitive content,” such as posts that are violent or sexually suggestive. The company also restricted adults’ ability to contact teens and made private accounts the pre-checked option for new teen sign-ups.