Supreme Court wrestles with lawsuit shield for social media
WASHINGTON >> In its first case about the federal law that is credited with helping create the modern internet, the Supreme Court seemed unlikely today to side with a family wanting to hold Google liable for the death of their daughter in a terrorist attack.
At the same time, the justices also signaled in arguments lasting two and a half hours that they are wary of Google’s claims that a 1996 law, Section 230 of the Communications Decency Act, affords it, Twitter, Facebook and other companies far-reaching immunity from lawsuits over their targeted recommendations of videos, documents and other content.
The case highlighted the tension between technology policy fashioned a generation ago and the reach of today’s social media platforms, which carry billions of posts each day.
“We really don’t know about these things. You know, these are not like the nine greatest experts on the internet,” Justice Elena Kagan said of herself and her colleagues, several of whom smiled at the description.
Congress, not the court, should make needed changes to a law passed early in the internet age, Kagan said.
Justice Brett Kavanaugh, one of six conservatives, agreed with his liberal colleague in a case that seemed to cut across ideological lines.
“Isn’t it better,” Kavanaugh asked, to keep things the way they are and “put the burden on Congress to change that?”
The case before the court stems from the death of American college student Nohemi Gonzalez in a terrorist attack in Paris in 2015. Members of her family were in the courtroom to listen to arguments about whether they can sue Google-owned YouTube for helping the Islamic State spread its message and attract new recruits, in violation of the Anti-Terrorism Act. Lower courts sided with Google.
The justices used a variety of examples to probe what YouTube does when it uses computer algorithms to recommend videos to viewers, whether content produced by terrorists or cat lovers. Chief Justice John Roberts suggested what YouTube is doing isn’t “pitching something in particular to the person who’s made the request” but just a “21st century version” of what has been taking place for a long time, putting together a group of things the person may want to look at.
Justice Clarence Thomas asked whether YouTube uses the same algorithm to recommend rice pilaf recipes and terrorist content. Yes, he was told.
Kagan noted that “every time anybody looks at anything on the internet, there is an algorithm involved,” whether it’s a Google search, YouTube or Twitter. She asked the Gonzalez family’s lawyer, Eric Schnapper, whether agreeing with him would ultimately make Section 230 meaningless.
Lower courts have broadly interpreted Section 230 to protect the industry, a shield that the companies and their allies say has fueled the meteoric growth of the internet by protecting businesses from lawsuits over users’ posts and by encouraging the removal of harmful content.
But critics argue that the companies have not done nearly enough to police and moderate content and that the law should not block lawsuits over the recommendations that point viewers to more material that interests them and keeps them online longer.
Any narrowing of that immunity could have dramatic consequences across every corner of the internet, because websites rely on algorithms to sort and filter mountains of data.
Lisa Blatt, representing Google, told the court that recommendations are just a way of organizing all that information. YouTube users watch a billion hours of videos daily and upload 500 hours of videos every minute, Blatt said.
Roberts, though, was among several justices who questioned Blatt about whether YouTube should have the same legal protection for its recommendations as for hosting videos.
“They appear pursuant to the algorithms that your clients have. And those algorithms must be targeted to something. And that targeting, I think, is fairly called a recommendation, and that is Google’s. That’s not the provider of the underlying information,” Roberts said.
Reflecting the complexity of the issue and the court’s seeming caution, Justice Neil Gorsuch suggested another factor in recommendations made by YouTube and others, noting that “most algorithms are designed these days to maximize profits.”
Gorsuch suggested the court could send the case back to a lower court without weighing in on the extent of Google’s legal protections. He participated in arguments by phone because he was “a little under the weather,” Roberts said.
Several other justices indicated that arguments in a related case Wednesday might provide an avenue for avoiding the difficult questions raised today.
The court will hear about another terrorist attack, at a nightclub in Istanbul in 2017 that killed 39 people and prompted a lawsuit against Twitter, Facebook and Google.
Separate challenges to social media laws enacted by Republicans in Florida and Texas are pending before the high court, but they would not be argued before the fall or decided until the first half of 2024.
Associated Press writer Jessica Gresko contributed to this report.