
Meta says it will restrict content for teens, as complaints mount about harmful effects on youth

Key Points
  • Meta said it will default teenage Facebook and Instagram users to the most restrictive content settings.
  • The company said it expects to complete the update over the coming weeks for teens under age 18.
  • In October, a bipartisan group of 42 attorneys general announced they're suing Meta, alleging that the company's products are harming teenagers.

Mark Zuckerberg, CEO of Meta, attends a U.S. Senate bipartisan Artificial Intelligence Insight Forum at the U.S. Capitol in Washington, D.C., Sept. 13, 2023.
Stefani Reynolds | AFP | Getty Images

Meta said Tuesday it will limit the type of content that teenagers on Facebook and Instagram are able to see, as the company faces mounting claims that its products are addictive and harmful to the mental well-being of younger users.

In a blog post, Meta said the new protections are designed "to give teens more age-appropriate experiences on our apps." The updates will default teenage users to the most restrictive settings, prevent those users from searching about certain topics and prompt them to update their Instagram privacy settings, the company said.

Meta said it expects to complete the update over the coming weeks. The change will keep teens under age 18 from seeing "content that discusses struggles with self-harm and eating disorders, or that includes restricted goods or nudity," even when it is shared by someone they follow.

The change comes after a bipartisan group of 42 attorneys general announced in October that they're suing Meta, alleging that the company's products are hurting teenagers and contributing to mental health problems, including body dysmorphia and eating disorders.

"Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame," New York Attorney General Letitia James said in a statement announcing the lawsuits. "Meta has profited from children's pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem."

In Senate subcommittee testimony in November, Meta whistleblower Arturo Bejar told lawmakers that the company was aware of the harm its products cause young users but failed to take appropriate action to remedy the problems.

Similar complaints have dogged the company since 2021, before it changed its name from Facebook to Meta. In September of that year, an explosive Wall Street Journal report, based on documents shared by whistleblower Frances Haugen, showed that Facebook had repeatedly found its social media platform Instagram to be harmful to many teenagers. Haugen later testified to a Senate panel that Facebook consistently puts its own profits over users' health and safety, largely because of algorithms that steer users toward high-engagement posts.

Amid the uproar, Facebook paused work on a version of Instagram for kids, which was being developed for children ages 10 to 12. The company hasn't provided an update on those plans since.

Meta didn't say what prompted the latest policy change, but said in Tuesday's blog post that it regularly consults "with experts in adolescent development, psychology and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens."
