People are less inclined to accurately judge the truth of a headline when asked to share it on social media or otherwise interact with it, according to a new study in Science Advances. These results suggest that content sharing—a key participatory feature of most social media platforms—can inherently evoke a mindset that clouds judgment and hinders the ability to discern truth.
“When you see cat memes, your cousin’s baby photos, and serious news about global events in the same newsfeed, it can be especially difficult to think carefully about the news,” said Ziv Epstein, a Ph.D. student in the Human Dynamics Group at the Massachusetts Institute of Technology (MIT) Media Lab and lead author of the study. “This is in contrast to other information environments that focus on a single mindset, such as a newspaper.”
In their online experiment involving 3,157 Americans, Epstein and colleagues showed participants a series of headlines related to COVID-19 or political news that may or may not have been accurate. They asked different sequences of questions about those headlines — sometimes only about their accuracy, or only about whether they would share, like, or comment on them; other times about their accuracy and then whether they would share, or vice versa.
They found that participants asked about sharing before accuracy were 35% worse at detecting truth than those asked only about accuracy. Even when asked about accuracy before sharing, respondents were still 18% worse at spotting true headlines than respondents asked only about accuracy.
Accuracy competes for attention
The researchers suggest that their findings reflect the mindset that social media evokes. In the context of social media, users are encouraged to endlessly scroll through their feeds and engage, while being distracted and emotionally stimulated by a vast, fleeting array of content from their networks. “The platforms create an attentional environment that draws people’s attention to other factors,” said David Rand, a professor at MIT’s Sloan School of Management and a co-author of the study, during a press conference at the 2023 AAAS Annual Meeting. “If you take the same people and put them in a different context…then they would be a lot more sophisticated. But the social media attention ecosystem is one that doesn’t prioritize [accuracy].”
Previous studies have shown that such environments can create vulnerability to belief in “fake news,” and the researchers’ findings highlight a key aspect of this vulnerability. When deciding what content to share on social media, people may be more prone to spreading untruths they wouldn’t normally believe, simply because they’re too distracted by other motivations to accurately judge whether the content is true.
“There are social motivations for sharing that suppress accuracy, such as being liked by friends and followers, or signaling group membership,” Epstein said. When asked how social media platforms might dampen these motivations, Epstein suggested modes of interacting with content that are less focused on sharing with followers. “Platforms could emphasize building connections between content rather than sharing content directly with an audience,” he said. “For example, platforms like Are.na and Pinterest accomplish this by allowing users to connect or pin content to channels.”
Can nudges reduce the spread of misinformation?
Epstein also suggested that accuracy nudges — simple prompts that draw a user’s attention to accuracy — could be a simple but effective solution for social media companies.
Epstein and his colleagues also observed that simply asking about accuracy improved perceptions of truth — making people less likely to share false headlines than those asked only about sharing. This hopeful result aligns with previous research suggesting that accuracy nudges could be effective in reducing misinformation more broadly. However, the responsibility for implementing such features rests with the social media companies, the researchers said. “To some extent, as individuals, we can try to take steps to become more vigilant, but it’s really a systemic issue that needs to be addressed at the platform level,” Rand said.
“The bottom line is not so much that they need to get rid of the social stuff, but they need to add some things that make people think about accuracy,” suggested Gordon Pennycook, associate professor of behavioral sciences at the University of Regina and co-author of the study. Together with Epstein and Rand, Pennycook examined the effectiveness of accuracy nudges in different social media contexts.
When asked whether accuracy nudges could still be effective when social media is used to communicate about emotionally charged, evolving developments such as conflicts, natural disasters, or protests, Pennycook acknowledged that there are many situations on social media where accuracy nudges would be superfluous.
“There are some contexts – maybe many – where accuracy might not be a primary concern,” he said. “Of course, in cases where the truth is difficult to ascertain or information is lacking, such as in the early stages of a natural disaster, it will be difficult to prioritize accuracy.”
The question remains how effective accuracy nudges can be in different contexts, for example over longer periods of time or within or between particular party or geographic groups. Future work will expand current studies to understand how social media mindsets might affect different users around the world, the researchers said.
Accuracy and user satisfaction
Epstein and colleagues emphasize that social media companies and policymakers should pay attention to these findings if they want to reduce misinformation — and, in the companies’ case, improve user experiences.
“Most people don’t like to deal with misinformation,” Rand said. “To an extent, users would prefer platforms where there is less misinformation.”
Rand suggested that accuracy prompts could be implemented on social media platforms to target specific instances of misinformation. Other efforts, such as crowd-sourced fact-checks, have gained popularity as a potential means of combating misinformation and have recently been deployed on Twitter with varying degrees of success.
Epstein went on to suggest that the paradigm of maximum engagement that permeates many social media platforms — and is intertwined with the motivation to share — should be reevaluated. “Thinking more about user satisfaction or long-term metrics… maybe it’s less about more myopic engagement and more about how those things are integrated into people’s lives.”