The US Supreme Court struggled to determine when social media companies can be held accountable for supporting terrorism as the justices heard the second of two cases poised to shape the legal rules for harmful online material.
In a clash stemming from a 2017 shooting at an Istanbul nightclub, the justices spent more than two hours exploring the limits of a federal counter-terrorism law and debating whether social media platforms resemble banks and restaurants that serve terrorists, or people who give weapons to known criminals.
The justices gave no clear indication of the outcome, although some suggested they are skeptical of a victim’s family’s effort to sue Twitter Inc. and other social media companies for allegedly not doing enough to remove terrorist content.
Justice Clarence Thomas said that under the family’s reasoning, “it would seem that every act of terrorism using this platform would also mean that Twitter is an aider and abettor in those cases.”
The justices pressed attorneys for Twitter, the federal government and the family on the scope of a federal law that lets victims of terrorist attacks seek damages from anyone who “aided and abetted” an attack. They posed a string of hypotheticals to gauge how culpable Twitter might be for letting terrorists remain on its platform, including ones invoking historical figures and corporations.
“Let’s say J. Edgar Hoover tells the Bell Telephone Company that Dutch Schultz is a gangster and is using his phone to conduct mob activities,” Justice Samuel Alito said. “The phone company says, ‘We’re not going to cut people off from service on that basis.’ Does that make them aiders and abettors?”
“Maybe not,” Deputy Solicitor General Edwin Kneedler said.
“Wow, that’s a maybe?” Alito replied.
The case, Twitter v. Taamneh, stems from a 2017 terrorist shooting at an Istanbul nightclub that killed 39 people. A lower court ruled that Twitter, Alphabet Inc.’s Google and Meta Platforms Inc.’s Facebook must face claims that they played a role in the attack by failing to take down Islamic State material and by profiting from it.
Twitter says a federal appeals court improperly expanded the scope of the anti-terrorism law.
Twitter attorney Seth Waxman argued Wednesday that the company should not be held liable under the law because failing to remove harmful posts does not amount to aiding and abetting an act of international terrorism. He said Twitter’s site had been “exploited by terrorists in violation of the company’s enforced anti-terrorism policies.”
The case argued at the Supreme Court on Wednesday is the second this week involving social media liability for terrorism. On Tuesday, the justices heard arguments over whether Google’s YouTube could be held liable for recommending terrorist propaganda to users who did not request that content. That case, Gonzalez v. Google, centers on a foundational internet law known as Section 230, which shields online companies from lawsuits over content posted by their users.
On Wednesday, Justice Clarence Thomas asked hypothetically whether Twitter, by allowing terrorists to use its platform, resembled a person giving a gun to “a friend who was a mugger, a murderer and a burglar.” Thomas asked whether Twitter could be held liable for “aiding and abetting” terrorists given how much it knew about its users.
It’s possible the court could use the Twitter case to sidestep the larger question about the fate of Section 230. During nearly three hours of argument in the Google case, the justices appeared wary of opening internet companies to lawsuits over harmful user posts.
If the Supreme Court rules in the Twitter case that social media companies cannot be held responsible for “aiding and abetting” terrorism, the justices could decline to decide whether Section 230 shields the companies from such lawsuits.
To contact reporters on this story:
Emily Birnbaum in Washington at [email protected];
Greg Stohr in Washington at [email protected]
To contact the editors responsible for this story:
Sara Forden at [email protected]
© 2023 Bloomberg L.P. All rights reserved. Used with permission.