
Big Tech's favorite law is under fire

Key Points
  • The Justice Department is hosting a workshop to examine the scope of a law known as Section 230 of the Communications Decency Act.
  • The law protects online platforms from liability for their users' posts and allows them to moderate users' content without being treated as publishers.
  • As tech companies have grown in size and power, Congress has also questioned whether Section 230 needs an update.
US Attorney General William Barr testifies before the Senate Judiciary Committee on "The Justice Department's Investigation of Russian Interference with the 2016 Presidential Election" on Capitol Hill in Washington, DC, on May 1, 2019.
Mandel Ngan | AFP | Getty Images

The Justice Department is hosting a forum for academics, nonprofit leaders and industry advocates to discuss the future of a law that has shielded tech companies from legal liability for their users' posts since its enactment in 1996.

For critics of the tech industry, Section 230 of the Communications Decency Act has come to symbolize the exceptional treatment from the government that has fueled the growth of a small number of players.

For tech companies, the law represents the internet's founding values of openness and free expression, while also allowing them to remove the most insidious speech without stumbling into a legal minefield.

Attorney General William Barr aligned himself with the skeptics, telling a gathering of the National Association of Attorneys General in December that the department was "studying Section 230 and its scope."

"Section 230 has been interpreted quite broadly by the courts," Barr said, according to a transcript of his remarks. "Today, many are concerned that Section 230 immunity has been extended far beyond what Congress originally intended. Ironically, Section 230 has enabled platforms to absolve themselves completely of responsibility for policing their platforms, while blocking or removing third-party speech — including political speech — selectively, and with impunity."

Here are the key things to know about this piece of legislation that's the subject of Wednesday's DOJ forum:

What is Section 230 and why was it enacted?

Section 230 was introduced by Sen. Ron Wyden, D-Ore., and former Rep. Chris Cox, R-Calif., as a way of protecting tech companies from becoming legally liable for their users' content if they opted to moderate it.

The law followed a court ruling against the online platform Prodigy.

An investment firm sued Prodigy after one of the platform's anonymous users accused the firm of fraud. Prodigy argued it wasn't responsible for its users' speech, but the court found that because the platform moderated some of its users' posts, it should be treated more like a publisher, which can be held legally liable for misleading or harmful content it publishes.

The ruling galvanized Cox and Wyden to introduce what would become Section 230. The law allows companies to engage in "good Samaritan" moderation of "objectionable" material without being treated like a publisher or speaker under the law.

That's what allows platforms like Twitter, Facebook and Google's YouTube to take down terrorist content or harassing messages while still enjoying other legal protections. It's also been essential for these companies to achieve massive scale — if they were liable for everything users posted, they'd either have to vet every piece of content before it went live, which would dramatically increase expenses and create delays, or give up all moderation, which would make for a worse user experience.

Why do some people want to change the law?

In recent years, Washington has begun to sour on the tech industry after a series of complaints about privacy and the growing power of a few key players. As politicians and the general public have awakened to the vast power of the large tech companies, they've begun to see Section 230 as a key contributor to that power.

Lawmakers on both sides of the aisle have publicly questioned the broad scope of Section 230. Once a way to protect upstart tech firms, the law now provides a legal shield to some of the most valuable companies in the world. Some fear tech companies lack the incentives to combat misinformation on their platforms as technology that makes it easier to fake video and voices becomes more advanced.

Some conservatives believe Section 230 has aided tech companies' ability to censor speech they don't agree with. There's little evidence mainstream tech firms systematically discriminate against certain ideologies, but they have at points removed politically charged posts, sometimes in error, only to apologize and reinstate them later.

Such claims of bias inspired Missouri Republican Sen. Josh Hawley's proposed revision to Section 230 that would tie the law's promise of immunity to a regular audit proving tech companies' algorithms and content-removal practices are "politically neutral."

What do the law's defenders say?

Tech companies have vigorously defended Section 230, testifying to Congress repeatedly about how it allows them to remove the most objectionable content from their platforms and protects start-ups from being sued out of existence.

Wyden still stands by Section 230, writing in a Washington Post op-ed Monday that efforts to repeal it would punish small start-ups rather than giants like Facebook and Google.

Wyden said corporations lobbying for changes to Section 230 are doing so to find "an advantage against big tech companies."

"Whenever laws are passed to put the government in control of speech, the people who get hurt are the least powerful in society," Wyden wrote, referencing SESTA-FOSTA, a 2018 law that made an exception to Section 230 for platforms hosting sex work ads. The law was billed as a way to mitigate sex trafficking, but opponents, including many sex workers, say it made consensual sex work less safe since those engaging can no longer vet their clients in advance and from behind a screen.

How could the law change?

Congress has held several hearings on Section 230 and sought input from academics and tech executives. Lawmakers in both parties have also criticized the Trump administration's push to include a similar liability provision in U.S. trade agreements while Congress continues to debate Section 230's future.

Most critics of Section 230 recognize the importance of maintaining some of its key elements, like moderation protections. Former Vice President Joe Biden revealed himself as a notable exception. In an interview with The New York Times editorial board published earlier this year, the Democratic presidential hopeful said Section 230 "immediately should be revoked" for tech platforms including Facebook, which he said "is propagating falsehoods they know to be false."

Though few others seem to favor a total repeal, lawmakers have expressed interest in scaling back some of the powers of Section 230 or making platforms earn its protections by complying with certain standards.

"On 230, I know there are some that have said just get rid of it," Rep. Jan Schakowsky, D-Ill., told CNBC in an interview in January. Shakowsky had recently hosted a hearing on deepfakes and digital deception in the consumer protection subcommittee, which she chairs.

"Our view is that we want to protect First Amendment rights, there's no question. But right now, we think the balance favors those who want a liability shield, and [it] goes way too far in that sense."

It's still unclear what specific steps lawmakers may take to change Section 230, but they have often invoked the debate as a reminder to tech companies that its protections may not last forever.

At the deepfakes hearing, Rep. Greg Walden, R-Ore., who has advised Congress to revisit Section 230, said, "This hearing should serve as a reminder to all online platforms that we are watching them closely."

WATCH: Facebook lays out details for content oversight board
