When Kanye West made anti-Semitic comments on Instagram and Twitter earlier this month, Meta and Twitter responded by suspending his accounts, arguing that he had violated their community guidelines.
Crucially, the decision to freeze his accounts, temporarily silencing him on those platforms, was made independently by the companies. No government actor was involved. The First Amendment generally prevents the US government from restricting the speech of private individuals and corporations, or from controlling how social media companies manage their platforms.
But other governments may soon fill that void, regulating how American tech giants police speech on their platforms. Earlier this month, the European Union passed a law regulating social media platforms: the Digital Services Act. The law will come into force in 2024, in time for the next US presidential election, and promises big changes in how online speech is moderated, not only in Europe but also here at home. Among other requirements, the law places significant content moderation obligations on large social media companies, many of them US-based, including curbing misinformation, hate speech and extremism.
It’s not clear how social media companies will comply with the law, but the fines they face for non-compliance will be massive. Companies can be fined up to six percent of their annual revenue: that’s $11 billion for Google and $7 billion for Meta. In essence, the EU has created a significant new legal incentive for companies to regulate expression on their platforms.
Written to protect EU citizens, the law will almost certainly prompt social media companies to change their moderation policies worldwide. With the DSA, then, the EU will effectively do what the First Amendment forbids our own government from doing: regulate the editorial decisions of the social media platforms on which Americans communicate with one another.
This isn’t the first time an EU law has changed Americans’ experience online. When was the last time you were asked about cookies on visiting a website, or given the option to restrict a technology company’s access to your personal data? Most of these changes resulted from the EU’s General Data Protection Regulation, which came into force in 2018.
Large social media companies will increasingly point to the DSA, rather than their own internal policies, as the reason they must remove specific content. This means that an American politician’s conspiracy-filled Facebook post could create legal liability for Meta, and the company could take it down to avoid hefty fines in Europe. Likewise, a YouTube video posted by Christian nationalists or COVID misinformation shared on TikTok could be removed over DSA concerns.
If the American legislature had rolled up its sleeves and done the difficult job of drawing a fine line between protecting online spaces and upholding First Amendment rights, we wouldn’t have to speculate about the implications that a law written by and for the citizens of another continent will have for the marketplace of ideas in our own country.
Under the DSA, the range of ideas flowing through online spaces will narrow, for better or for worse. The threat of massive fines gives companies a significant incentive to preemptively remove any content or speakers that might draw EU sanctions. Experience shows that laws of this type lead to over-correction: they encourage content removal and provide almost no incentive to foster an expansive space for the exchange of ideas.
The scenario becomes more complex if the US Supreme Court upholds the Texas and Florida laws that require social media companies to keep content and speakers on their platforms even when they violate community guidelines. In that case, some online platforms would likely have to choose between the regulations of these two US states and those of the European Union, which would accelerate the balkanization of the global Internet. A separate Chinese Internet and a separate Russian Internet have already developed in recent years, and the danger of the EU’s DSA, particularly if it triggers an American backlash, is that a free North American Internet and a European Internet might likewise drift apart.
The justices have not yet agreed to hear the Florida and Texas content moderation cases, but they are expected to do so. Unlike the DSA, the two states’ laws cannot escape the reach of US constitutional law, and under nearly a century of precedent, such state laws are likely unconstitutional. After all, the First Amendment generally prevents the US government from restricting the speech of private individuals and corporations, or from controlling how social media companies manage their platforms.
Back on the other side of the Atlantic, tech companies can challenge the DSA in European courts before or after it comes into effect, something EU officials have been expecting. While the law will require large companies to rebuild their moderation systems in a short period of time, it could also help them by effectively outsourcing their most controversial content moderation decisions to regulators.
Companies like Meta, Google and Twitter have come under widespread criticism, both from those who feel they do not do enough to remove dangerous, inaccurate and extremist content and from those who feel they go too far in doing so. Businesses naturally resist regulation, but by instituting uniform standards across all major platforms, the EU could shield these private companies from the wrath of those who object to their moderation decisions.
Future Tense is a partnership of Slate, New America and Arizona State University that explores emerging technologies, public policy and society.