YouTube Pulls Down Videos Detailing Russian Propaganda Efforts

This story is part of War in Ukraine, CNET’s coverage of events there and their wider impact on the world.

YouTube has apparently removed dozens of videos from the Russian Media Monitor, a channel run by Daily Beast columnist Julia Davis that highlights and translates Russian television’s perspectives on the country’s invasion of Ukraine.

Davis complained on Twitter on Monday that YouTube had removed 60 videos, and she shared screenshots of three YouTube notices explaining that the videos violated the site’s Community Guidelines. CNET has confirmed that these three videos are no longer available on YouTube.

The YouTube channel has 12,000 subscribers, but Davis also has 390,000 followers on Twitter, where her bio reads, “I watch Russian state TV, so you don’t have to.” Her videos typically feature figures on Russian television discussing the war in Ukraine.

One video, posted on Twitter but apparently removed from YouTube, features Vladimir Solovyov, whom the US State Department calls “possibly the most energetic Kremlin propagandist today.” In the video, he complains about a lack of stocks of Russian military equipment and says that people were excited about the recent attacks on civilian infrastructure across Ukraine because they showed that the Russians still have some supplies.

The situation highlights the difficulties of moderating content in a world where social media can be used both to elucidate difficult issues and to spread disinformation. With 2 billion monthly users and deep ties to the world’s most powerful Internet search engine, YouTube is one of the most influential sources of online information on earth.


YouTube can offer outsiders a glimpse into Russian thinking, which can help assess the country’s attitudes toward the war in the same way journalism does. But video clips by Russian propagandists could also run counter to YouTube’s efforts to ban content that “denies, minimizes, or trivializes well-documented violent events.”

YouTube and Davis did not immediately respond to requests for comment.

Some fans of Davis’ work pushed back against the YouTube move. “What part of the ‘Community Guidelines’ could @JuliaDavisNews possibly be violating by making the statements of Russian propagandists transparent? @YouTube should restore their videos,” tweeted Hans Kristensen, director of the Nuclear Information Project at the Federation of American Scientists.

Since the Russian invasion of Ukraine in February, YouTube has not only tightened its policy on misinformation about the war, but also stepped up its enforcement. Misinformation is also a problem on TikTok, the new social media darling, and has been a big problem on Meta’s Facebook and Instagram.

Because YouTube’s policy enforcement operates on such a massive scale, the company has to rely on machine learning to track down much of the bad material to be pulled down — and sometimes that system improperly removes good content as well. Nuance is particularly difficult in cases involving multiple languages and videos debunking controversial claims.

YouTube — like Facebook, Twitter, Reddit, and many other internet companies that offer users a platform to post their own content — has struggled to balance freedom of expression with effective monitoring of the worst material posted there. Over the years, YouTube has grappled with misinformation; conspiracy theories; discrimination; hatred and harassment; child abuse and exploitation; and videos of mass murder, all on an unprecedented global scale. YouTube critics argue that the company’s content moderation and safety efforts still too often fall short.


CNET’s Imad Khan contributed to this report.