Facebook condemned Saturday's deadly London Bridge attacks while pledging to "aggressively remove terrorist content" from its platform, as British Prime Minister Theresa May raised the specter of imposing new regulations to restrict the dissemination of extremist content.
On Sunday, May said Britain must work with allied democratic governments to tighten internet regulation in order to deny terrorists a tool for planning attacks and spreading extremism.
"We cannot allow this ideology the safe space it needs to breed, yet that is precisely what the internet and the big companies that provide internet-based services provide," she said in a statement outside Downing Street.
May's comments, which added new fuel to the debate about balancing free speech in an age of terrorism, were amplified by Facebook's own response to the London attack.
In a statement, Simon Milner, director of policy at Facebook, said the social network giant wants to "provide a service where people feel safe. That means we do not allow groups or people that engage in terrorist activity, or posts that express support for terrorism. We want Facebook to be a hostile environment for terrorists."
Facebook has faced criticism following a recent string of violent acts broadcast on its social network. Some accuse the company of failing to tackle terrorist recruitment and hate propaganda, as well as the spread of fake news.
Last month, Facebook CEO Mark Zuckerberg announced he would add another 3,000 employees to scrub harmful content from the network.
"Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it," Facebook's Milner said.
"We have long collaborated with policymakers civil society, and others in the tech industry, and we are committed to continuing this important work."