Facebook, YouTube and Twitter are struggling to remove New Zealand mosque shooting videos

  • A suspected shooter livestreamed video of an attack on a mosque in New Zealand on Friday. It was one of two mosque attacks that day that killed at least 49 people.
  • Facebook, Twitter and YouTube said they removed the original videos, but hours later people reported finding versions on the platforms.
  • These companies have previously taken steps to prioritize authoritative information in the wake of breaking news events and prevent harmful content from being streamed live.

The suspected shooter in at least one of the two mosque attacks in New Zealand on Friday used social media to stream his deadly rampage live.

Shortly after, tech giants scrambled to remove his accounts, but versions of the video remained on some sites hours after the shootings, which killed at least 49 people.

Facebook, Twitter and Google's YouTube all said they removed the original video following the attack. But hours later, people still reported online that they were able to find versions of the video on the platforms.

Twitter removed the original video and suspended the account that posted it, but is still working to remove copies that have been posted from other accounts. Twitter said that both the account and video violated its policies.

"We are deeply saddened by the shootings in Christchurch today," a Twitter spokesperson said in a statement. "Twitter has rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this. We also cooperate with law enforcement to facilitate their investigations as required."

Facebook also removed the stream and has been working to remove content praising the attack.

"Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video," said Mia Garlick of Facebook's New Zealand office. "We're also removing any praise or support for the crime and the shooter or shooters as soon as we're aware. We will continue working directly with New Zealand police as their response and investigation continues."

Later on Friday afternoon, Garlick said in a separate statement that Facebook has been adding videos that violate its policies to an "internal data base which enables us to detect and automatically remove copies of the videos when uploaded again."

Facebook has previously experienced abuse of its livestream function and has taken steps to detect problematic streams in real time. In 2017, the company added additional measures to detect live videos where people express thoughts of suicide, including using artificial intelligence to streamline reporting, and adding live chat with crisis support organizations. These policies followed a series of suicides that were reportedly livestreamed on Facebook's platform.

Several people tweeted that they had been able to find repostings of videos of the attack on YouTube more than 12 hours after it happened, even though YouTube said it took down the original video, which violated its policies. A straightforward search on YouTube would generally yield legitimate reports from news organizations, but graphic videos could still be easily found if a user filtered results by upload date.

YouTube has taken steps to ensure legitimate news reports are prioritized when searching for a trending event, rather than other videos that have the potential for spreading misinformation. In July, YouTube said in a blog post that its Top News section would highlight videos from news organizations and it would link to news articles immediately in the wake of a breaking news event.

Those moves can prevent videos from bubbling up at the top of search results or appearing in YouTube's trending section, but that doesn't necessarily stop them from being uploaded to the site.

A YouTube spokesperson said in a statement: "Shocking, violent and graphic content has no place on our platforms, and is removed as soon as we become aware of it. As with any major tragedy, we will work cooperatively with the authorities."

The video also appeared in a Reddit forum dedicated to violent videos, where users discussed and commented on the images. By Friday afternoon, Reddit had banned the forum for violating its policy against "glorifying or encouraging violence," but earlier in the day, it was accessible to visitors who acknowledged a disturbing content warning. Reddit removed the video and similar links Friday morning at the request of New Zealand police, according to the Redditor who first posted the video. But users who found the video elsewhere online claimed to have downloaded copies and were offering to share the files in direct messages.

"We are actively monitoring the situation in Christchurch, New Zealand," a Reddit spokesperson said. "Any content containing links to the video stream are being removed in accordance with our site-wide policy."

— CNBC's Sara Salinas contributed to this report.
