Facebook says it will hire more workers with law-enforcement experience and academics who study terrorism to help fight the spread of extremist content on its services.
The statement from the company comes the same week the leaders of France and the U.K. said they would push for new laws in Europe to fine companies that don't remove such material promptly.
Facebook said it will work to "counter terrorism and extremist content" by expanding its ranks of "terrorism and safety specialists, including academic experts on counter-terrorism and former law enforcement officers."
It will add 3,000 workers to the 4,500 it already has dedicated to that task, boosting the costs of keeping objectionable content off its services, which also include Instagram and the online messenger WhatsApp.
In the same statement, the company said it would give brands more control over the kind of content that appears adjacent to their ads, pledging that "brands can opt-out of Instant Articles, Audience Network and in-stream ads on Facebook."
U.K. Prime Minister Theresa May and French President Emmanuel Macron said in a joint statement this week they would get tough on internet services that don't do more to help fight online radicalization.
Facebook's announcement comes on the heels of a blog post from Facebook policy chief Elliot Schrage, in which the company pledged to "talk more openly about some complex subjects," including how platforms should fight the spread of terrorist "propaganda" online. Other questions Schrage said Facebook would address include when to remove controversial posts and images, how to combat fake news, and the effect of social media on democracy.