KEY POINTS
  • An investigation published Wednesday described how Instagram's algorithms connect and promote accounts that facilitate and sell child sexual abuse material.
  • Alex Stamos, one of the paper's authors, said researchers focused on Instagram because its "position as the most popular platform for teenagers globally makes it a critical part of this ecosystem."
  • The investigation was conducted by The Wall Street Journal and researchers at Stanford University's Internet Observatory Cyber Policy Center and the University of Massachusetts Amherst.


Instagram's recommendation algorithms have been connecting and promoting accounts that facilitate and sell child sexual abuse content, according to an investigation published Wednesday.

Meta's photo-sharing service stands out from other social media platforms and "appears to have a particularly severe problem" with accounts showing self-generated child sexual abuse material, or SG-CSAM, Stanford University researchers wrote in an accompanying study. Such accounts purport to be operated by minors.