It's an unusual request from a D.C. lawmaker after Congress has spent the past few years scolding Facebook for its policies on misinformation. The company has struggled to escape the shadow of the 2018 Cambridge Analytica scandal and its role in spreading disinformation by Russian actors during the 2016 U.S. presidential election. But the coronavirus pandemic has given Facebook an opportunity to reclaim its reputation, and at least one lawmaker is taking notice.
Facebook said earlier this month that it would notify users if they had engaged with a post that had been removed for including misinformation about Covid-19 in violation of its policies. The social media company will also direct users to myths debunked by the World Health Organization. That marked a major step for Facebook, which has wrung its hands over other forms of misinformation, most notably in political ads. But even while it has refused to fact-check or remove most political ads that contain false information, Facebook said it would remove any that contain misinformation about the coronavirus.
Schiff, chairman of the House Intelligence Committee that investigated Russian meddling in the 2016 election, asked the chief executives of Google, YouTube and Twitter to consider a similar policy to Facebook's in letters sent Wednesday.
"While taking down harmful misinformation is a crucial step, mitigating the harms from false content that is removed requires also ensuring that those users who accessed it while it was available have as high a likelihood as possible of viewing the facts as well," Schiff wrote to the CEOs.
The three companies all have policies on misinformation but have not explicitly said they would notify users who engaged with false content, as Facebook has. Taking that extra step could be important given the volume of misinformation users post to the platforms, which makes it difficult for content moderators to remove all of it before it reaches other users.
In a statement, a spokesperson for Google-owned YouTube said, "Since early February, we've removed thousands of videos violating our COVID-19 misinformation policies -- such as content that disputes the existence or transmission of COVID-19 as described by local health authorities, or that promotes medically unsubstantiated methods to prevent or cure COVID-19 in place of seeking medical treatment -- and have seen over 20 billion impressions on our information panels for COVID-19 related videos and searches."
A Twitter spokesperson confirmed the company received the letter and is regularly in contact with congressional staff about the issue. A Google spokesperson did not immediately respond to a request for comment.
-- CNBC's Julia Boorstin and Stephen Desaulniers contributed to this report.