Fight Over Big Tech Looms in US Supreme Court

An upcoming U.S. Supreme Court case asking whether tech firms can be held liable for damages related to algorithmically generated content recommendations could “upend the internet,” according to a brief filed by Google this week.

The case, Gonzalez v. Google LLC, is a long-awaited opportunity for the high court to weigh in on interpretations of Section 230 of the Communications Decency Act of 1996. A provision of federal law that has come under fire from across the political spectrum, Section 230 shields technology firms from liability for content published by third parties on their platforms, but also allows those same firms to curate or bar certain content.

The case arises from a complaint by Reynaldo Gonzalez, whose daughter was killed in an attack by members of the terror group ISIS in Paris in 2015. Gonzalez argues that Google helped ISIS recruit members because YouTube, the online video hosting service owned by Google, used a video recommendation algorithm that suggested videos published by ISIS to individuals who displayed interest in the group.

Gonzalez’s complaint argues that by recommending content, YouTube went beyond simply providing a platform for ISIS videos, and should therefore be held accountable for their effects.

Dystopia warning

The case has garnered the attention of a multitude of interested parties, including free speech advocates who want tech firms’ liability shield left largely intact. Others argue that because tech firms take affirmative steps to keep certain content off their platforms, their claims to be simple conduits of information ring hollow, and that they should therefore be liable for the material they publish.

In its brief, Google painted a dire picture of what might happen if the latter interpretation were to prevail, arguing that it “would turn the internet into a dystopia where providers would face legal pressure to censor any objectionable content. Some might comply; others might seek to evade liability by shutting their eyes and leaving up everything, no matter how objectionable.”

Not everyone shares Google’s concern.

“Actually all it would do is make it so that Google and other tech companies have to follow the law just like everybody else,” Megan Iorio, senior counsel for the Electronic Privacy Information Center, told VOA.

“Things are not so great on the internet for certain groups of people right now because of Section 230,” said Iorio, whose organization filed a friend of the court brief in the case. “Section 230 makes it so that tech companies don’t have to respond when somebody tells them that non-consensual pornography has been posted on their site and keeps on proliferating. They don’t have to take down other things that a court has found violate the person’s privacy rights. So you know, to [say] that returning Section 230 to its original understanding is going to create a hellscape is hyperbolic.”

Unpredictable effects

Experts said the Supreme Court might try to chart a narrow course that leaves some protections intact for tech firms but allows liability for recommendations. However, because algorithmic recommendations are the primary means of organizing the dizzying array of content available online, any ruling that affects them could have a significant impact.

“It has pretty profound implications, because with tech regulation and tech law, things can have unintended consequences,” John Villasenor, a professor of engineering and law and director of the UCLA Institute for Technology, Law and Policy, told VOA.

“The challenge is that even a narrow ruling, for example, holding that targeted recommendations are not protected, would have all sorts of very complicated downstream consequences,” Villasenor said. “If it’s the case that targeted recommendations aren’t protected under the liability shield, then is it also true that search results that are in some sense customized to a particular user are also unprotected?”

26 words

The key language in Section 230 has been called “the 26 words that created the internet.” That passage reads as follows:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

At the time the law was drafted in the 1990s, people around the world were flocking to an internet that was still in its infancy. It was an open question whether an internet platform that gave individual third parties the ability to post content on it, such as a bulletin board service, was legally liable for that content.

Recognizing that a patchwork of state-level libel and defamation laws could leave developing internet companies exposed to crippling lawsuits, Congress drafted language meant to shield them. Many credit that protection with enabling U.S. tech firms, particularly those in Silicon Valley, to rise to dominance on the internet in the 21st century.

Because of the global reach of U.S. technology firms, the ruling in Gonzalez v. Google LLC is likely to echo far beyond the United States when it is handed down.

Legal groundwork

The groundwork for the Supreme Court’s decision to take the case was laid in 2020, when Justice Clarence Thomas wrote in response to an appeal that, “in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by internet platforms.”

That statement by Thomas, arguably the court’s most conservative member, heartened many on the right who are concerned that “Big Tech” firms enjoy too much cultural power in the U.S., including the ability to deny a platform to individuals with whose views they disagree.

Gonzalez v. Google LLC is also unusual in how it reached the high court. Many cases make it to the Supreme Court in part because lower courts have issued conflicting decisions, requiring an authoritative ruling from the high court to provide legal clarity.

Gonzalez’s case, however, has been dismissed by two lower courts, both of which held that Section 230 rendered Google immune from the suit.

Conservative concerns

Politicians have been calling for reform of Section 230 for years, with both Republicans and Democrats joining the chorus, though frequently for different reasons.

Former President Donald Trump regularly railed against large technology firms, threatening to use the federal government to rein them in, especially when he believed that they were preventing him or his supporters from getting their messages out to the public.

His concern became particularly intense during the early years of the COVID-19 pandemic, when technology firms began working to limit the spread of social media accounts that featured misinformation about the virus and the safety of vaccinations.

Trump was eventually kicked off Twitter and Facebook after using those platforms to spread false claims about the 2020 presidential election, which he lost, and to help organize a rally that preceded the assault on the U.S. Capitol on January 6, 2021.

Major figures in the Republican Party are active in the Gonzalez case. Missouri Senator Josh Hawley and Texas Senator Ted Cruz have both submitted briefs in the case urging the court to crack down on Google and large tech firms in general.

“Confident in their ability to dodge liability, platforms have not been shy about restricting access and removing content based on the politics of the speaker, an issue that has persistently arisen as Big Tech companies censor and remove content espousing conservative political views,” Cruz writes.

Biden calls for reform

Section 230 criticism has come from both sides of the aisle. On Wednesday, President Joe Biden published an essay in The Wall Street Journal urging “Democrats and Republicans to come together to pass strong bipartisan legislation to hold Big Tech accountable.”

Biden argues for a number of reforms, including improved privacy protections for individuals, especially children, and more robust competition, but he leaves little doubt that he sees a need for Section 230 reform.

“[W]e need Big Tech companies to take responsibility for the content they spread and the algorithms they use,” he writes. “That’s why I’ve long said we must fundamentally reform Section 230 of the Communications Decency Act, which protects tech companies from legal responsibility for content posted on their sites.”

