This week’s Supreme Court arguments could reshape the future of the internet

The Supreme Court this week is considering a short but powerful law that, if reinterpreted, could reshape the modern internet.

Section 230 of the Communications Decency Act shields internet companies from liability for the user-generated content they host, a provision that has become an unlikely source of controversy in recent years.

On Tuesday, the Supreme Court heard oral arguments in Gonzalez v. Google. The case, brought by the family of Nohemi Gonzalez, a victim of the 2015 Islamic State terrorist attacks in Paris, argues that Google should be held liable for terrorist content that YouTube promoted in the lead-up to the attack.

On Wednesday, the court will hear a parallel case blaming Twitter for another deadly terrorist attack – in this case the death of Nawras Alassaf, who was killed when an Islamic State gunman opened fire at an Istanbul nightclub in 2017.

Plaintiffs in both cases argue that the technology platforms in question should be held legally liable for Islamic State content they hosted or promoted in the run-up to attacks that collectively killed more than 150 people.

The justices pressed the petitioners’ argument that YouTube surfacing content through its recommendation algorithm is a different kind of activity from merely hosting that content – one that is not protected by Section 230.

“We are focused on the recommendation feature, that they positively recommend or suggest ISIS content, and it’s not mere inaction,” said attorney Eric Schnapper, who represented the Gonzalez family at Tuesday’s hearing.

The idea that Section 230 could have exceptions is not new, but it is controversial. In 2018, a bill called FOSTA created an exception to Section 230’s protections that was ostensibly aimed at reducing sex trafficking but has since been criticized for making sex work more dangerous.


The Supreme Court isn’t the only branch of government evaluating Section 230, though congressional efforts to repeal the law or condition its protections have largely stalled in recent years.


On Tuesday, some justices expressed doubts that the country’s highest court was the right body to reevaluate the internet law at all.

“We’re a court, we really don’t know about these things,” Justice Elena Kagan said. “These aren’t the nine greatest experts on the internet.”

As Schnapper continued, the justices expressed some confusion about his reasoning, and both sides attempted to clarify it. His main argument turned on the distinction between failing to remove dangerous content – a statistical inevitability given the volume of content online platforms host – and actively promoting that content and expanding its reach:

“In our view, if the only alleged wrong is non-blocking or removal, that would be protected by 230(c)(1). But – but this is – the protection of 230(c)(1) goes no further. And the theory that — to protect the site from that was that the wrong is essentially being done by the person making the post, at most the site is allowing the damage to continue. And what we’re talking about when we talk about them – the site’s own decisions are affirmative actions by the site, not simply allowing third-party material to remain on the platform.”

Ultimately, the justices tried to define the boundaries of what should and should not reasonably be protected by Section 230 by probing hypothetical extremes: that platforms using algorithms could intentionally promote illegal content, or that they should make no algorithmic recommendations at all.


“Let’s say we’re looking for a line because it’s clear from our questions that we are,” Justice Sotomayor said.

Muddying matters further, Schnapper repeatedly referred to the platform’s algorithmic recommendations as “thumbnails” – a term more commonly understood to mean the preview images shown for a YouTube video.

Some justices took Schnapper’s argument to another logical extreme, warning that a carve-out removing Section 230 protections from algorithmic recommendations would immediately subject search engines, which rank search results, to the same treatment.

“So even up to the direct search engine that they could be liable for their prioritization system?” Kagan asked.

The justices repeatedly expressed concern about the potentially wide-ranging second-order implications of tinkering with Section 230.

“They are now asking us to make a very accurate forward-looking judgment that — don’t worry — it really isn’t going to be that bad,” Justice Brett Kavanaugh said. “I don’t know if that’s even the case, and I don’t know how we can meaningfully assess that.”

Those reservations were nearly universal among the justices, who didn’t seem particularly interested in shaking up the status quo – a perspective likely to resurface during Wednesday’s arguments, which will again be livestreamed.

“We’re talking about the prospect of significant liability in litigation, and up to this point people have been focused on that [Anti-terrorism Act] because that’s the only point at issue here,” said Chief Justice John Roberts.

“But I suspect there would be many, many times more defamation lawsuits, discrimination lawsuits… It seems to me that the terrorism support thing would just be a tiny part of all the other stuff. And why shouldn’t we think about it?”
