Max Fisher is an international reporter and columnist at the New York Times, where he reports on global trends and major world events. He has contributed to a series on social media that was a 2019 Pulitzer Prize finalist.
Below, Max shares 5 key takeaways from his new book, The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World. Listen to the audio version – read by Max himself – in the Next Big Idea app.
1. Social platforms lie to you.
When you open social media, you think you see your community’s views, feelings, and political opinions. You think you are seeing the world reflected through the platform, but what you see is a lie that the platform tells you in order to manipulate you.
You’re actually seeing the decisions made by the platform’s artificial intelligence systems, which have combed through vast amounts of online content (more than you could ever take in), selected a tiny fraction of those posts to show you, and then ordered and sequenced them in the way these systems have learned is maximally effective at keeping you engaged for as long as possible.
Social platforms have designed these systems – for which they employ thousands of engineers, including some of the biggest names in artificial intelligence – to deliberately turn your greatest cognitive and emotional weaknesses against you. These systems draw (and this used to be openly discussed in Silicon Valley before the companies learned to hide it) on the largest pools of private user data ever amassed to learn exactly what will best manipulate you.
They also draw on the darkest corners of human psychology – from the science of addiction behind casinos, whose influence you can see in the slot machine-like colors and sounds of social platforms, to the science of how we identify truth and separate right from wrong. Extremely sophisticated AIs are at work every second of every day, deconstructing our deepest impulses and weaknesses so they can be turned against us.
2. Social platforms train you.
Social platforms keep us scrolling and typing for as long as possible so they can show us more of the ads that make them billions of dollars every year. According to data from the Census Bureau, in 2014, for the first time, the average American spent more time connecting on social media than in person, and the gap has widened every year since. Social media is now the dominant way we consume news and information, learn about our world and relate to each other.
Social media platforms have learned to teach you certain behaviors and feelings so that you post in a way that makes both you and the users you interact with more likely to spend more time on the platform. By training you, the platforms change who you are.
“Social media is now the dominant way we consume news and information, learn about our world and relate to each other.”
For example, in one experiment, researchers first measured participants’ baseline propensity for outrage, then had them post on Twitter – a fake version of Twitter, so the researchers could control the experience. Half of the subjects were asked to tweet expressing their outrage, and the researchers then showed them that their tweets had received lots of likes and shares. Many other studies have shown that social platforms artificially amplify outraged posts: the platforms know that outrage performs well, so they push outraged posts to many users, which produces unusually high engagement. In this experiment, the researchers mimicked that effect with fake engagement, and they repeated the process several times. Soon all of the subjects, even those who had been identified as averse to outrage, began to internalize it.
They thought they were getting all this positive feedback from their community, all this validation and attention for voicing their outrage, when in reality it was the platform tricking them. Sure enough, these users went on to post more outraged tweets, each more outraged than the last. But what blew me away is that the subjects became more prone to outrage even when they were offline, away from social media. This false sense of social reward had been instilled so strongly that it altered their underlying nature as human beings.
This training process happens to all of us every day. And it’s more than just outrage that’s being drummed into us. Understanding this was a great first step for me in documenting the impact of social media on our world.
3. The consequences reach into all facets of life.
The algorithms that determine what we watch, what goes viral, and how content is presented essentially tap our deepest, and sometimes darkest, instincts the way jumper cables tap a car battery. They feed our inner beast to keep us scrolling and posting, which further amplifies the platforms’ impact on our communities. One example in particular comes to mind.
Seven unconnected villages in different parts of Indonesia had erupted simultaneously in spontaneous mob violence, in each case directed at an innocent man from outside the village who happened to be traveling through. It turned out that a single viral rumor had surfaced on Facebook. The rumor originally came from a small account with no real audience, but the platform – with the kind of precision that only a machine trained on billions of online interactions could achieve – had identified it as incredibly compelling: it pushed exactly the buttons of conspiracy and identity panic that trigger collective outrage and therefore drive engagement.
“The algorithms that determine what we watch, what goes viral, and how content is presented essentially tap our deepest, and sometimes darkest, instincts the way jumper cables tap a car battery.”
The system had propagated it so aggressively and so quickly to so many users, hijacking those users’ instinct for collective action, that these seven villages erupted into violence almost at once. That was in early 2018, and a few months later, for another story, I happened to walk into Facebook’s Silicon Valley headquarters and meet some of the people responsible for responding to exactly these kinds of crises. I told them about the incident in Indonesia and they shrugged and said okay. They asked no follow-up questions, not the names of the villages, not who my original source was.
The specific rumor that had been causing chaos there may sound familiar: it claimed that shadowy elites were in league with minorities to abduct local children and harvest their organs. The reason it sounds familiar is that a few months later, a version of the same rumor, dubbed QAnon, was fiercely promoted to millions of Americans by Facebook and YouTube. Within months, an almost identical rumor was being circulated by the platforms in half a dozen other countries, including Mexico and Germany, as if the systems were converging on this bizarrely macabre rumor as some sort of skeleton key for boosting engagement.
This demonstrates the power these systems wield, a power that is ubiquitous but can feel shocking and seismic when revealed through a crisis. But the story also underscores the fact that nobody is behind the wheel. This is a machine with no human driver, no overseer, more or less free to do what it will with us, our societies, and our politics.
4. Turning off is not a good answer.
What power does an individual have when literally the largest companies by market capitalization in human history have captured our attention, the way we interact with each other, and how we understand reality itself? I would like to tell you that the answer is simply to turn it off. In general, that’s good advice, considering research shows that even a few weeks without social media significantly improves wellbeing.
In one experiment, researchers had subjects turn off Facebook for four weeks – just that one platform. Compared to a control group that stayed on social media, those who turned it off became significantly happier and less anxious, because they were separated from systems designed to captivate users through negative emotions such as outrage and identity conflict. In fact, the increase in life satisfaction corresponded to 25 to 40 percent of the effect of therapy – just from deleting an app. The same study found that people who turned off social media immediately became dramatically less affected by partisan polarization. The drop in polarization among these people was about half of the total increase in polarization in America from 1996 to 2018, meaning it was like rolling back two decades of political toxicity and division.
“The same study found that people who turned off social media were immediately dramatically less affected by partisan polarization.”
However, turning it off is not a good answer. As nice as it sounds, most of us don’t have the luxury of simply turning it off. And even if you can miraculously isolate yourself completely from these digital worlds, you are still affected by the consequences of these platforms. They still poison the world around you. Telling people the solution is to delete some apps is a bit like saying, hey, there’s a giant factory polluting the water you drink, so just buy bottled water.
5. Understanding is the first line of defense.
There are two things you can do aside from trying to spend less time on social media. First, understand that social media hides its influence in what it shows you. You are being seeded and trained to adopt the behaviors and emotions the platforms prefer, but those preferences are smuggled in through what looks like your peers, friends, and community.
Second, understand how your own mind works. The power of social media comes from co-opting your mental machinery, and the truth about how your mind works isn’t necessarily what you would expect. How we learn, how we determine right from wrong, how we discern what is persuasive and what is not – all of these questions have been studied extensively over the past decade. Social platforms have become incredibly sophisticated at understanding and exploiting fundamental elements of our nature, but if you can understand your own nature, and the flaws that come with being human, you will be able to recognize these technological influences and arm yourself against them.
To hear the audio read by author Max Fisher, download the Next Big Idea app today: