LOS ANGELES—Section 230 of the 1996 Communications Decency Act — the law generally considered to make online free speech possible — has come under regular attack from both sides of the political aisle in recent months. Not only is the bipartisan EARN IT Act moving to the Senate floor for debate after gaining approval in a committee vote earlier this month, but the Justice Department has proposed its own rollbacks to Section 230, as well.
And in May, Donald Trump said he would sign an executive order that could undo the legal protections for online platforms that have existed under Section 230 for nearly a quarter-century.
The law guarantees that platforms, such as social media sites, online message boards, and even internet service providers, cannot be held legally liable for content posted by third parties. But the latest attempt to weaken Section 230 comes in the form of the Platform Accountability and Consumer Transparency (PACT) Act, co-sponsored by South Dakota Republican John Thune and Hawaii Democrat Brian Schatz (pictured above).
The bill will receive its first hearing on July 28, at a meeting of a Senate Commerce subcommittee, according to a report by Axios.
“Section 230 was created to help jumpstart the internet economy, while giving internet companies the responsibility to set and enforce reasonable rules on content. But today, it has become clear that some companies have not taken that responsibility seriously enough,” Schatz said in a statement about the bill, which would require internet platforms to “explain their content moderation practices in an acceptable use policy that is easily accessible to consumers.”
The bill’s intentions appear positive, and have been broadly supported even by digital rights activist groups such as the Electronic Frontier Foundation. The bill would require platforms to disclose “shadowbanning,” the practice of causing certain users to lose their audience by keeping their posts out of, or ranked low in, search results, among other quiet suppression techniques.
But according to an analysis by EFF, “the PACT Act’s implementation of these good ideas, however, is problematic.”
The law would end Section 230’s liability protections for posts by third parties if a platform failed to take steps to remove potentially illegal content after receiving notice that the content had been ruled illegal by a court.
But “the PACT Act poorly defines what qualifies as a court order, fails to provide strong protections, and sets smaller platforms up for even greater litigation risks than their larger competitors,” the EFF analysis says.
The proposed law would place a “thumb on the scale” in favor of suppressing speech, and that would mean that “disempowered communities will be most severely impacted, as they historically have been targeted by claims that their speech is illegal or offensive,” according to EFF.
Sex workers and adult content creators would likely fall into that “targeted” category.
The PACT Act also fails to state the requirements for content to be ruled illegal by a court, according to EFF, meaning that even a preliminary order or “default judgment” — that is, a judgment handed down by a court without being argued by advocates for both sides — could be used to force platforms to remove content.
Photo By Senate Democrats / Wikimedia Commons