The report, published Tuesday, said a state regulator should ensure tech firms are taking steps to help users identify trustworthy, reliable news on their platforms. It said the regulator would require companies like Facebook and Google to build on initiatives they have already established to weed out fake content.
"This task is too important to leave entirely to the judgment of commercial entities," the report said.
Online platforms like Facebook, Twitter and YouTube have been under fire for allowing fake content to spread. The companies have been investing in security measures to eliminate false accounts and misinformation, but the U.K. government report said these efforts should be enforced by a government agency. It also recommended that the companies sign a "code of conduct" governing their commercial agreements with publishers.
"The experience of the last decade has shown that it is perfectly possible for social media platforms to be immensely profitable while simultaneously carrying a large quantity of fake news," it said.