Big Tech has a lot to fear these days: Congressional scrutiny, layoffs, and more. But this week all eyes will be on the US Supreme Court as it hears two cases that have the potential to revolutionize how social media companies operate.
The two cases, Gonzalez v. Google and Twitter v. Taamneh, stem from tragedies caused by terrorist attacks. Victims' families are asking the judiciary to crack the hard shell of immunity from lawsuits stemming from third-party content posted on interactive websites such as Twitter, Facebook and YouTube.
The two cases have similar facts, but they raised slightly different issues when they came to the Supreme Court. Let's take them one by one.
Algorithmic recommendations are under scrutiny
In the Google case, scheduled to be heard on Tuesday, Nohemi Gonzalez's estate and relatives are suing the social media giant. Nohemi was a US citizen and student in November 2015 when she was killed by terrorists attacking a Parisian bistro. The terrorist militia Islamic State (IS) claimed responsibility.
In their lawsuit, Nohemi's relatives alleged that Google, through YouTube, violated the Anti-Terrorism Act of 1990. This law authorizes American nationals to sue for injuries "due to an act of international terrorism." It imposes liability on any person who knowingly provides substantial assistance to someone committing an act of international terrorism.
The lawsuit alleged that Google allowed IS to publish videos on YouTube inciting violence and recruiting members. It also claimed that YouTube recommended IS videos to users via an algorithm that identifies users who may be interested in those videos.
Google successfully moved to dismiss the lawsuit by invoking Section 230 of the Communications Decency Act of 1996. Section 230, which today draws heavy criticism from some members of Congress, Justice Clarence Thomas and others, immunizes interactive websites such as Facebook, YouTube and Twitter from lawsuits arising from third-party content posted on those sites.
The US Court of Appeals for the Ninth Circuit affirmed the trial court's dismissal of the relatives' claims. In the Supreme Court, they narrowed their case, asking the justices whether Google-owned YouTube enjoys Section 230 immunity from a claim based on its algorithmic recommendations of third-party content to its users.
Gonzalez’s appeal will mark the first time the Supreme Court has examined Section 230, which was enacted nearly 30 years ago to encourage the growth of the Internet.
A question of aid
Gonzalez's relatives are also part of the second case, Twitter v. Taamneh, which the high court will hear on Wednesday. While Gonzalez was killed in Paris, Nawras Alassaf, Sierra Clayborn, Tim Nguyen and Nicholas Thalasinos were killed in separate IS terrorist attacks in Istanbul and San Bernardino, California.
In 2015, at least 130 people were killed in Paris in a coordinated attack by terrorists from the Islamic State group. The Supreme Court will consider whether social media companies are responsible for content on their platforms that helps terrorist organizations communicate, fundraise and recruit. (Photo: Jeff J Mitchell/Getty Images)
The families also sued Google, Twitter and Facebook under the Anti-Terrorism Act. They claimed that these platforms, by hosting and recommending IS content, particularly content used for recruitment, fundraising and communications, knowingly provided "substantial assistance" under the law and thereby "aided and abetted" an act of international terrorism.
The families' claims were dismissed without the trial court reaching the Section 230 question. The Ninth Circuit again upheld the dismissals, but with one exception: it held that Nawras Alassaf's family had stated a plausible aiding-and-abetting claim, which the trial court should reconsider. Twitter, along with the other two platforms, then petitioned the Supreme Court to review that decision.
Decisions could reshape social media
The issues of immunity, and how the justices resolve them, whether under Section 230 or the anti-terrorism law, could have far-reaching implications for social media platforms and for the content they host and the content they remove. Unsurprisingly, more than 70 "friend-of-the-court" briefs, mostly from the tech community supporting the platforms, have been filed with the Supreme Court. Recognizing the broad interests at stake, additional briefs have been filed by states, religious groups, gun control organizations, business groups, former national security officials and members of Congress, among others.
The Biden administration has filed a brief in the Google case, arguing that Section 230 bars the relatives' claims insofar as they fault YouTube for failing to block or remove third-party content, but does not shield YouTube from liability for its own targeted recommendations of ISIS content to its users. In the Twitter case, the government asked the justices to rule in favor of the social media platforms, noting that the plaintiffs "allege that ISIS and its affiliates used the defendants' widely available social media platforms, in common with millions, if not billions, of other people around the world, and that the defendants failed to actively monitor and stop such use."
These allegations, the government argued, do not “plausibly establish” that Twitter “knowingly significantly assisted” an international act of terrorism.
There are many other aspects to the two cases that are likely to preoccupy and even confuse the justices as they first delve into this particular arena. And it probably won't be the last time they do so.
Perhaps even more controversial and significant are two cases awaiting the court's decision on whether to hear them next term. NetChoice and the Computer and Communications Industry Association have challenged Florida and Texas statutes that were enacted in response to conservative complaints about censorship.
In NetChoice v. Paxton, the two trade associations argue that a Texas law violates the First Amendment by preventing social media platforms with at least 50 million active users from blocking, removing or demoting content based on users' viewpoints. They claim the law would also prevent them from removing harmful content. A federal appeals court ruled in favor of the state.
Another federal appeals court ruled in favor of NetChoice’s challenge to a similar law in Florida. The state has taken its appeal to the Supreme Court.
The judge writing the Florida opinion said, "The core question of this appeal is whether the Facebooks and Twitters of the world, undeniably 'private actors' with First Amendment rights, engage in constitutionally protected expressive activity when they moderate and curate the content they disseminate on their platforms."
The justices have asked the US solicitor general for her views on whether to review the cases. The split between the two appeals courts increases the chances that the justices will agree to take them.