Tech

House Republican staff outline principles to reform tech's liability shield

Key Points
  • House Republicans are gearing up to take aim at the legal shield that protects tech platforms from liability over content moderation, with staff outlining key concepts for legislation in a memo obtained by CNBC on Thursday.
  • The E&C Republican staff wrote that the concepts aim to exclusively target "Big Tech companies with an annual revenue of $1 billion," a threshold likely to cover companies such as Amazon, Apple, Google and Facebook.
Rep. Cathy McMorris Rodgers (R-WA) during a House Energy and Commerce Environment and Climate Change Subcommittee hearing on Capitol Hill on April 2, 2019 in Washington, DC.
Zach Gibson | Getty Images

House Republicans are gearing up to take aim at the legal shield that protects tech platforms from liability for the content users post.

On Thursday, Republican staff for the House Energy and Commerce Committee sent a memo suggesting several concepts for reforming Section 230 of the Communications Decency Act, a 1996 law that protects tech platforms from liability for users' posts and for their own moderation practices.

Those concepts include:

  • Limiting the right of tech companies to exclude users based on their viewpoints or political affiliations
  • Requiring "reasonable moderation practices" to address harms like illegal drug sales and child exploitation
  • Narrowing protected moderation to specific types of speech not protected by the First Amendment
  • Removing protection for discriminatory moderation decisions based on viewpoints

Underscoring all of the concepts are three main principles: protecting free speech, balancing the interests of small businesses to preserve competition, and promoting American leadership in tech.

The memo said proposed legislation would exclusively target "Big Tech companies with an annual revenue of $1 billion," suggesting it will focus on giants like Amazon, Apple, Google and Facebook.

The E&C Republican staff sent the memo to the staff of individual Republican members of the committee, as well as other unspecified stakeholders.

Republicans have generally criticized Section 230 protections for allowing tech platforms to make allegedly biased decisions about which posts to take down, while Democrats seek to place greater responsibility on platforms by pushing them to expand content moderation and make their services safer for users.

Other areas of focus

The memo also dives into more specific suggestions, including the following:

Appealing decisions. The memo suggests tech platforms should have a stronger path for users to appeal decisions they feel are unfair. One concept says platforms should be required to maintain a user-friendly appeals process for challenging decisions and to tell users why those decisions were made.

Carving some companies out entirely. The memo suggests carving Big Tech companies out of Section 230 protections so that only smaller businesses and new entrants retain them, and repealing the shield for companies that use targeted behavioral advertising (the latter of which is similar to a bill Democrats have proposed).

Reauthorization every five years. The staff proposed that Section 230 should be reauthorized every five years for the Big Tech companies, incentivizing them to be careful and allowing for iteration as the industry evolves.

Transparency. Another set of principles focuses specifically on transparency around the Big Tech companies' content moderation practices, like requiring them to submit detailed descriptions of their policies to the Federal Trade Commission.

Protecting children. The memo also outlines principles to protect children online, a theme that emerged during the committee's last hearing with several tech CEOs in March. Some of the concepts center around holding the companies accountable for content and ads they show to minors, while others require them to track the way their products impact children's mental health.

Working with law enforcement. A final set of concepts outlines the ways Big Tech should be required to work with law enforcement. Many companies already cooperate with law enforcement and report illicit material, but conflicts have arisen when enforcers have asked for access to encrypted information. Apple, for instance, has said in such cases that it could not create a so-called backdoor for law enforcement without jeopardizing the security of all of its users. The memo does not specifically mention encryption.


WATCH: Section 230 explained
