Tech industry group sues to block California law designed to protect kids online, citing free speech concerns
- Tech industry group NetChoice is suing the state of California to block its new Age-Appropriate Design Code Act, which it claims violates the First Amendment.
- The group's members include Amazon, Google, Meta, TikTok and Twitter.
- NetChoice previously sued Texas and Florida over their social media laws that seek to poke holes in the tech industry's broad liability shield for content moderation.
NetChoice, a tech industry group that includes Amazon, Google, Meta, TikTok and Twitter, announced Wednesday that it's suing California to block the state's new Age-Appropriate Design Code Act, which it says violates the First Amendment.
Modeled on standards in the U.K., the California law aims to establish rules that make the internet safer for kids. It requires the highest privacy settings to be turned on by default for minors and says that online services targeting kids under 18 must assess the risk that those users could be harmed by potentially harmful messages or exploitation.
The lawsuit adds to a growing slate of court cases involving free expression on the internet. Lawmakers are, in many instances, attempting to weaken the broad liability protections that online platforms enjoy for their content moderation efforts and their users' posts.
Concern over privacy and moderation issues extends across party lines, though Republicans and Democrats still largely disagree on how they should be handled. While the California bill was passed by a majority Democratic legislature, NetChoice has also sued both Texas and Florida over their social media laws passed by majority Republican legislatures. Those bills seek to hold tech platforms accountable for taking down posts on the basis of political views.
In California, NetChoice alleges the new law will harm minors rather than protect them, while also infringing on First Amendment free speech rights by forcing companies to guess at the meaning of "inherently subjective terms" in user content.
"The State is empowered to impose crushing financial penalties" if the companies guess incorrectly, the group said. "The State can also impose such penalties if companies fail to enforce their content moderation standards to the Attorney General's satisfaction."
NetChoice says the law, which is set to take effect in July 2024, will produce "overwhelming pressure to over-moderate content to avoid the law's penalties for content the State deems harmful." That over-moderation, the group says, will "stifle important resources, particularly for vulnerable youth who rely on the Internet for life-saving information."
A representative for California Attorney General Rob Bonta's office defended the law in an emailed statement.
The measure "provides critical new protections over the collection and use of [children's] data and works to address some of the real and demonstrated harms associated with social media and other online products and services," the statement said. "We are reviewing the complaint and look forward to defending this important children's safety law in court."
The language in the lawsuit echoes concerns voiced by a range of civil society groups against a federal bipartisan bill that also seeks to impose certain protections for kids on the internet. Those groups warned of potential harm to the rights of the LGBTQ community, in particular, fearing the parameters of content filters could be influenced by political preferences.
The lawmakers leading the federal legislation sought to address some of those concerns in a new version of the bill released Tuesday night, though some dissatisfaction with the changes remained.
The Florida and Texas laws that NetChoice opposes seek to poke holes in the tech industry's broad liability shield, Section 230 of the Communications Decency Act, which protects the right to moderate content. Republicans have been trying to impose greater restrictions on social media companies for what they see as censorship of conservative views on the most popular sites.
Mainstream platforms have repeatedly denied biased enforcement of their community rules, and independent research has shown conservative viewpoints often dominate online discussions.
The Supreme Court in May blocked Texas' version from taking effect, though it didn't rule on the merits of the case, and Florida's version has so far been blocked by lower courts.
The Supreme Court could still choose to take up the cases against both the state laws. In the meantime, it has announced it will hear two different cases next year that implicate Section 230 protection and could potentially weaken it.