Instagram rolls out sensitivity screens in the wake of teen suicide

The head of Instagram, Adam Mosseri, announced on Tuesday that the company is implementing changes it hopes will stop younger users from being exposed to inappropriate content related to suicide and self-harm.

These changes include "sensitivity screens," which will blur out images of cutting until the viewer opts in. Images that depict self-harm will also no longer appear in search, hashtag or account recommendations.

In an op-ed in The Telegraph, Mosseri revealed that he was inspired to make these changes in the wake of the 2017 suicide of British 14-year-old Molly Russell, who followed multiple self-harm and suicide accounts.

Instagram parent company Facebook has also been under government pressure. In January, UK Health Secretary Matt Hancock demanded that tech giants like Facebook, Twitter and Google do a better job of protecting children from harmful content or face more forceful legislation.

In his letter, Hancock said, "I was inspired by the bravery of Molly's father, who spoke out about the role of social media in this tragedy – and moved by the sense that there is much more we all need to do to stop a tragedy like this from happening again."