Facebook says video of New Zealand mosque shootings was viewed 4,000 times before being removed

  • Facebook said the video of the man suspected of killing 50 people during a shooting rampage at two mosques in New Zealand last week was viewed 4,000 times before it was removed.
  • The suspected gunman apparently livestreamed his attack on Muslim worshipers on Facebook.
  • The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended, Facebook said.

Facebook said a live streamed video apparently showing last week's attack on two mosques in New Zealand was viewed 4,000 times before it was removed.

The suspected gunman, an Australian named Brenton Harrison Tarrant, allegedly killed 50 people after opening fire on Muslim worshipers at the two mosques, livestreaming the whole ordeal on Facebook. Tarrant has been charged with murder.

Facebook said the video was viewed fewer than 200 times during the livestream, and that no users reported it while the broadcast was live. Such broadcasts remain on Facebook even once they have ended. In total, the video was viewed 4,000 times before it was removed, the American social media giant said.

The first user report on the original video came in 29 minutes after it was first posted, and 12 minutes after the live broadcast ended, Facebook said.

Social media firms have faced intense criticism for not doing enough to police content on their platforms, particularly the kind allegedly created by Tarrant. Facebook said that in addition to taking down the live video, it removed the suspect's accounts from Facebook and Instagram.

The video was, however, shared in different formats across Facebook. The company said it was able to detect videos visually similar to the original and automatically remove them from Facebook and Instagram.

Facebook said on Saturday that it removed 1.5 million videos of the attack in the first 24 hours after it was originally livestreamed. Facebook said 1.2 million of those videos "were blocked at upload."

Reddit, Twitter and Google's YouTube also moved quickly to try to remove content related to the shooting.