“While you can’t go back in time and undo the broadcast, we can help you move forward.”
That’s the message from a service just launched by the National Center for Missing and Exploited Children that aims to help teenagers and those who were once teenagers remove explicit images of themselves from the internet.
The new service, called Take It Down, is aimed at “individuals who have images or videos of themselves nude, partially nude, or in sexually explicit situations taken when they were under the age of 18 that they believe have been or are being shared online”.
According to its website, Take It Down assigns the explicit images a unique digital fingerprint called a hash value. Online platforms can use the hash values to recognize these images or videos on their services and remove this content. According to Take It Down, this all happens without the image or video ever leaving the user’s device or anyone looking at it. Only the hash value is provided to the National Center for Missing and Exploited Children.
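The on-device fingerprinting described above can be sketched in a few lines. NCMEC has not published the exact hashing algorithm Take It Down uses, so the SHA-256 function below is only a stand-in; the point of the sketch is that a short fingerprint can be computed locally and submitted on its own, without the image itself ever leaving the device.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fixed-length fingerprint of an image locally.

    SHA-256 is a stand-in here; the service's actual hashing
    algorithm has not been publicly specified.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# The raw image bytes never leave the device; only this short
# hex digest would be submitted for platforms to match against.
image = b"\x89PNG...placeholder image data..."  # hypothetical image bytes
print(fingerprint(image))
```

Platforms holding the same hash value can then check uploads against it without ever having received or viewed the image itself.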
Meta, the parent company of Facebook and Instagram, is helping fund Take It Down, but only a handful of sites are currently involved in the effort, according to The Associated Press.
As of Monday, the AP reports that participating platforms include Meta’s Facebook and Instagram, Yubo, OnlyFans and Pornhub, which is owned by MindGeek. According to the service, if the image is on another website or sent on an encrypted platform like WhatsApp, it will not be removed.
Identifying an image to remove also presents some challenges. For example, if you crop it, add an emoji, or turn it into a meme, it becomes a new image and therefore needs a new hash. Images that are visually similar, like the same photo with and without Instagram filters, have similar hashes, differing by only one character, according to the AP.
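Why a cropped or edited copy needs its own hash can be seen with any exact hash function: changing even a single byte produces a completely different digest. (The similar-hashes behavior for filtered copies suggests the service also relies on a perceptual hash, whose details are not public; the SHA-256 sketch below illustrates only the exact-match case.)

```python
import hashlib

original = b"raw image data"          # stand-in for an image file's bytes
edited = original + b"\x00"           # the "same" image after a tiny edit

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(edited).hexdigest()

# The two digests share no recognizable structure, which is why
# an altered copy cannot be matched and needs a new hash entry.
print(h1)
print(h2)
print(h1 == h2)  # False
```

Perceptual hashes trade this all-or-nothing behavior for tolerance of small visual changes, which is how filtered copies can end up with nearly identical fingerprints.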
“Take It Down is made specifically for people who have an image that they have reason to believe already exists or may exist somewhere on the Internet,” said Gavin Portnoy, a spokesman for the National Center for Missing & Exploited Children. “You’re a teenager, you’re dating someone and you share the picture. Or someone blackmailed you and said, ‘If you don’t give me a picture or another picture of you, I’ll do X, Y, Z.’”
Portnoy said teenagers might feel more comfortable going to a website than to law enforcement, which would not be anonymous.
“For a teenager who doesn’t want that level of involvement, who just wants to know the content is taken down, that’s a big deal,” Portnoy said. The center reports that it is seeing an increase in reports of child exploitation online. The nonprofit’s CyberTipline received 29.3 million reports in 2021, up 35% from 2020.
“This issue has been incredibly important to Meta for a very, very long time because the damage done is quite severe in the context of teenagers or adults,” Antigone Davis, Meta’s global safety director, told CNN. “It can damage their reputation and family relationships and put them in a very vulnerable position. It’s important that we find tools like this to help them take back control of a very difficult and devastating situation.”
While teens can ask Take It Down for help locating and removing images themselves, parents or trusted adults can also use the platform on a teen’s behalf, according to CNN. The effort is fully funded by Meta and builds on a similar platform for adults, StopNCII, which Meta launched in 2021 with more than 70 NGOs to combat the nonconsensual sharing of intimate images, sometimes called revenge porn.