CNN
—
TikTok can show teens potentially harmful content related to suicide and eating disorders within minutes of creating an account, a new study suggests, adding to growing scrutiny of the app’s impact on its youngest users.
In a report published Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see suicide-related content, and about five more minutes to find a community promoting eating disorder content.
The researchers said they set up eight new accounts in the United States, the United Kingdom, Canada and Australia at TikTok’s minimum user age of 13. These accounts briefly paused on and liked content about body image and mental health. Within 30 minutes, the CCDH said, the app was recommending videos about body image and mental health roughly every 39 seconds.
The report comes as state and federal lawmakers explore ways to crack down on TikTok over privacy and security concerns, and to assess whether the app is appropriate for teenagers. It also comes more than a year after executives from social media platforms, including TikTok, faced tough questions from lawmakers during a series of congressional hearings about how their platforms can steer younger users, particularly teenage girls, toward harmful content, damaging their mental health and body image.
At those hearings, which followed Facebook whistleblower Frances Haugen’s revelations about Instagram’s impact on teenagers, the companies pledged to change. But the latest findings from the CCDH suggest more work may still be needed.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, upsetting content that can have a significant cumulative impact on their understanding of the world around them and on their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.
A TikTok spokesperson pushed back on the study, saying it was an inaccurate representation of the viewing experience on the platform for a variety of reasons, including the small sample size, the limited 30-minute window for testing, and the way the accounts scrolled past a series of unrelated topics to look for other content.
“This activity and the resulting experience do not reflect genuine behavior or viewing experiences of real people,” the TikTok spokesperson told CNN. “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need. We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important issues.”
The spokesperson said the CCDH’s study does not distinguish between positive and negative videos on a given topic, adding that people often share empowering stories about recovering from eating disorders.
TikTok said it continues to roll out new safeguards for its users, including ways to filter out mature or “potentially problematic” videos. In July, it added a “maturity level” to videos identified as potentially containing mature or complex themes, as well as tools to help people decide how much time they want to spend on TikTok videos, set regular screen time breaks, and view a dashboard showing how many times they have opened the app. TikTok also offers a handful of parental controls.
This isn’t the first time social media algorithms have been put to this kind of test. In October 2021, US Senator Richard Blumenthal’s staff registered an Instagram account as a 13-year-old girl and followed some dieting and eating disorder accounts (the latter of which are supposed to be banned by Instagram). Instagram’s algorithm soon began almost exclusively recommending that the young teen’s account follow more and more extreme dieting accounts, the senator told CNN at the time.
(After CNN sent a sample of five accounts from this list to Instagram for comment, the company removed them, saying they all broke Instagram’s policies against promoting eating disorders.)
TikTok said it doesn’t allow content that depicts, promotes, normalizes, or glorifies activities that could lead to suicide or self-harm. Of the videos removed for violating its suicide and self-harm policies from April through June of this year, 93.4% were removed with zero views, 91.5% were removed within 24 hours of being posted, and 97.1% were removed before any user reported them, according to the company.
The spokesperson told CNN that when someone searches for banned words or phrases such as #selfharm, they will see no results and will instead be directed to local support resources.
Still, the CCDH says more needs to be done to restrict specific content on TikTok and strengthen protections for young users.
“This report underscores the urgent need for reform of online spaces,” the CCDH’s Ahmed said. “Without oversight, TikTok’s opaque platform will continue to thrive by serving its users — kids as young as 13, remember — increasingly intense and distressing content without checks, resources, or support.”