New research examining pro-Kremlin edits to the English-language Wikipedia page for the Russo-Ukrainian War has shed light on how Wikipedia can be manipulated for information warfare.
Analysis from the Institute for Strategic Dialogue (ISD) and the Centre for the Analysis of Social Media (CASM) in the UK could help develop systems that detect platform manipulation on one of the world’s most popular websites.
The study, released Monday, details suspicious edits to the Russo-Ukrainian War page in the digital encyclopedia and describes a method for identifying potentially coordinated manipulation efforts targeting Wikipedia more broadly.
Carl Miller, one of the paper’s authors, emphasized that the research is not a smoking gun for state manipulation. “We have not directly attributed the suspicious editing activity to the Russian state. We never would,” he says. “The idea was to try to characterize behavior that was already known to be suspicious, to see if we could – over time – use it as a signal of undetected suspicious activity.”
The analysis focused on 86 accounts that edited the Russo-Ukrainian War page and were subsequently banned by Wikipedia for violating its rules, including operating as “sock puppet” accounts to obscure who their real operator was.
Although the research “was not aimed at identifying unknown suspicious activity, but rather at understanding and describing activity that was already known to be suspicious,” the manipulation it describes went largely unflagged by Wikipedia, suggesting that the platform could be exploited in much the same way as social media platforms.
The editors the analysis focused on were behind 681 changes to the page. Miller said the researchers identified 16 links to state-sponsored media introduced by these edits, acknowledging that counting links “is probably not the best way to identify suspicious changes.” So the researchers went on to manually analyze each of the edits.
Manual analysis found that the edits featured “narratives consistent with Kremlin-sponsored information warfare,” casting doubt on the objectivity of pro-Western accounts while playing up the objectivity of pro-Kremlin reports.
The edited material also supported Kremlin framings of ongoing events, including introducing historical narratives about the overthrow of former Ukrainian President Viktor Yanukovych – an incident preceded by his refusal to sign an Association Agreement with the European Union and followed shortly by Russia’s military annexation of Crimea.
Other edits “added Kremlin quotes and press releases explicitly into the page to increase the importance of pro-Russian arguments and stances,” the authors wrote.
The researchers said their goal is to examine how Wikipedia might be vulnerable to the same types of manipulation that target other social media platforms, including Facebook, Twitter, YouTube and Reddit.
“Wikipedia is notoriously resilient to vandalism. All changes are open, vandalism can be quickly reversed, pages can be blocked and protected, and the site is monitored by a combination of bots and editors,” the study said.
Describing itself as “a brief contribution to the discussions surrounding the threats of information warfare,” the paper notes that one question remains: “How vulnerable is Wikipedia to information warfare that uses more subtle methods and is carried out over longer periods of time?”
While the US government has accused Russian organizations of engaging in “information warfare” by conducting social media campaigns to foment division and distort public understanding of news events, the declassified intelligence community assessment of interference in the 2016 presidential election noted no such activity on Wikipedia.
For the general public, Wikipedia rivals the world’s most respected news organizations as a trusted source of information, despite being written by volunteers who must cite source material – often produced by those same news organizations.
It is regularly targeted by reputation management firms and other forms of malicious editing intended to promote particular views, and in September 2021 the Wikimedia Foundation cracked down on 19 users, banning 7 of them. In an announcement, the organization linked the action to concerns about “infiltration” from mainland China.