
Zuckerberg resists pressure to alter Facebook's livestream feature after Christchurch attacks

Key Points
  • Mark Zuckerberg has expressed reluctance to alter the Facebook feature that hosted video footage of the deadly terror attack on two mosques in Christchurch, New Zealand.
  • The tech giant came under criticism after video of the attack, which killed 50 people, was livestreamed and widely circulated on its platform.
Facebook founder and CEO Mark Zuckerberg speaks at the Viva Technology show at Parc des Expositions Porte de Versailles in Paris, France, on May 24, 2018. (Chesnot | Getty Images)

Mark Zuckerberg has expressed reluctance to alter the Facebook feature that hosted video footage of the deadly terror attack on two mosques in Christchurch, New Zealand.

The tech giant came under criticism after video of the attack, which killed 50 people, was livestreamed and widely circulated on its platform.

Speaking to ABC News' George Stephanopoulos on Thursday, the Facebook founder and CEO admitted that the artificial intelligence the company uses to filter harmful content failed to flag the video, adding that the video's reach might have been limited if livestreams were subject to a broadcast delay.

"But it would also fundamentally break what livestreaming is for people. Most people are livestreaming a birthday party or hanging out with friends when they can't be together," he said. "One of the things that's magical about livestreaming is that it's bi-directional, … you're not just broadcasting, you're communicating, and people are commenting back. So if you had a delay [it] would break that."

Although reluctant to introduce a delay on Facebook's livestream feature, Zuckerberg accepted that the company needed to work harder to "mitigate and remove as much of the negative [content] as possible."

He also told ABC that the way the company was run had significantly changed in recent years, with the policing of harmful content among the major issues Facebook was focusing on.

"Ninety-nine percent of the ISIS and Al-Qaeda content that we take down are AI systems identifying the move before any person sees it — so that's a good example of being proactive, and I think what we should hold all companies to account [for]," he said.

Footage of last month's mosque massacre was livestreamed on Facebook by the shooter.

Facebook, Twitter and Google rushed to remove the content from their platforms, but users were still able to find versions of the video hours after the tech giants said they had taken it down.

On Thursday, Australian lawmakers passed legislation that could see social media executives face jail time and hefty fines if their platforms fail to remove violent content.