Australia has passed a tough new law that could see big tech executives jailed if their platforms host violent video content.
Attorney General Christian Porter said Thursday that the new penalty regime would target social media firms and clamp down on the circulation of content that broadcasts violent crimes.
It comes after tech giants Facebook, Google and Twitter failed to remove videos of a shooting in a New Zealand mosque last month, where 50 people were killed.
Australia's new legislation makes it a criminal offence for social media platforms not to swiftly remove "abhorrent content."
Executives may be jailed for up to three years if their platforms fail to comply, and companies face fines of up to 10 percent of their annual global turnover.
The law will also require social media firms — regardless of their location — to notify the Australian Federal Police if they become aware that their service is streaming violent conduct that is happening in Australia. Failure to do so will see company executives fined up to 168,000 Australian dollars ($119,300), while corporations face penalties of up to 840,000 Australian dollars.
"The tragedy in Christchurch just over two weeks ago brought this issue to a head," Porter said in a press release Thursday.
"It was clear from our discussions last week with social media companies, particularly Facebook, that there was no recognition of the need for them to act urgently to protect their own users from the horror of the live streaming of the Christchurch massacre and other violent crimes, and so the government has taken action with this legislation."
Under the legislation, Australia's eSafety Commissioner will also have the power to issue notices that bring violent material to companies' attention and demand its removal from their platforms.
In an interview with ABC News' George Stephanopoulos on Thursday, Facebook founder and CEO Mark Zuckerberg admitted that the company's artificial intelligence technology had failed to flag the videos of the Christchurch attack.
He supported calls for greater regulation, telling ABC: "Ninety-nine percent of the ISIS and Al Qaeda content that we take down are AI systems identifying them before any person sees it — so that's a good example of being proactive, and I think what we should hold all companies to account (for)."
However, the digital industry has pushed back against Australia's laws. Before the legislation was passed, lobby group Digi — which represents Facebook, Google, and Amazon — wrote to the Australian government to warn it risked undermining the nation's security alliance with the United States, the Sydney Morning Herald reported.
"The bill does nothing to address hate speech, which was the fundamental motivation for the tragic Christchurch terrorist attacks," the letter reportedly said.