A coalition of tech companies is backing new principles put forth by five governments to prevent the spread of online child exploitation.
The Technology Coalition, whose members include Facebook, Apple, Google and Twitter, committed to promoting the new voluntary standards, which outline how the industry should take survivors' needs into account and safeguard features such as livestreaming against abuse.
The 11 principles were announced at a press conference at the Department of Justice on Thursday afternoon. Representatives from the Department of Homeland Security as well as the United Kingdom, Canada, Australia and New Zealand participated in the announcement alongside U.S. Attorney General William Barr.
Barr said representatives from Facebook, Google, Microsoft, Snap, Twitter and Roblox were also in attendance and had helped inform the principles.
The governments said WePROTECT Global Alliance, made up of 97 governments, 25 tech companies and 30 civil society organizations, will promote adoption of the guidelines.
The principles released Thursday include many of the strategies companies such as Facebook already employ to identify abuse on their platforms. The guidelines ask, for example, that companies work to prevent known and new child sexual abuse material from becoming accessible on their services. Large platforms such as Facebook and YouTube already have mechanisms in place to tag and trace images and videos that violate their standards and prevent them from being re-uploaded.
The principles also say companies should combat and report to authorities efforts to groom children for sexual abuse on their platforms, adopt child-specific safety measures, prevent livestreaming services from being used for abusive purposes and more. They also suggest companies work with one another to share useful practices and data to prevent further abuse from spreading online.
"We stand behind these principles and will be working with our members to both spread awareness of them and redouble our efforts to bring industry together to promote transparency, share expertise and accelerate new technologies to combat online child sexual exploitation and abuse," the Technology Coalition said in a statement.
The guidelines don't instruct companies on the specific steps they should take to adhere to the principles, pointing out that every service is different and will have different risk factors.
"When applying these principles, companies will take into account technical limitations, available resources, and legal and privacy considerations," an introduction to the principles says.Â
The principles also steer clear of a key dispute between the tech industry and law enforcement by not directly addressing encryption, a feature favored by privacy advocates that obscures the content of messages to anyone outside of the sender and receiver. Tech leaders and top law enforcement officials have clashed over encryption efforts, most recently involving Facebook's plans to integrate and encrypt its messaging services across WhatsApp, Messenger and Instagram.
Industry representatives argue encryption protects free expression, particularly under governments that suppress speech. But Barr and FBI Director Christopher Wray have said more encryption would gravely hinder law enforcement efforts to track down perpetrators of child sexual exploitation.
Barr and other government representatives did address encryption in their remarks at the press conference announcing the principles.
"Predators' supposed privacy interests should not outweigh our children's privacy and security," Barr said. "There's too much at stake."
"Encryption remains the elephant in the room," said U.K. Minister of State for Security James Brokenshire, calling out Facebook's plans directly. "I've got to say that putting our children at risk for what what I believe are marginal privacy gains is something I really struggle to believe any of us wants."
A New York Times investigation exposing the vast network of child sexual abuse images online said Facebook was responsible for nearly 12 million of the 18.4 million reports of child sexual abuse material worldwide, citing people familiar with the reports. The statistic underscores the vital role Facebook plays in halting the spread of abusive materials and allowing law enforcement to track down criminals.
Barr and other officials fear this evidence will be lost as tech companies create more encrypted messaging and communication products. Facebook has said it will still be able to collect metadata about encrypted messages, such as the time messages were sent, but law enforcement officials say that's not enough to aid their investigations.
Barr and representatives from DHS, the U.K. and Australia wrote to Facebook last year asking the company to postpone its encryption plans until law enforcement could ensure it would not reduce public safety. Barr has advocated for "lawful access" to encrypted services with a court order, but tech companies have said that building a "backdoor" for investigators would jeopardize the security of all users.
Barr has become increasingly vocal about tech companies' responsibility for monitoring their platforms for abuse. He has spearheaded efforts at the DOJ to reconsider protections provided by Section 230 of the Communications Decency Act, which prevents tech platforms from being held legally liable for content posted on their services by third-party users.
Congress has also been weighing ways to rework the law to put greater responsibility on tech platforms. A bill from Sens. Lindsey Graham, R-S.C., and Richard Blumenthal, D-Conn., is expected to be released soon and would tie Section 230's legal protections to compliance with purportedly voluntary standards aimed at combating child sexual exploitation, according to The Washington Post.
— CNBC's Ylan Mui contributed to this report.
