Meta is helping to clamp down on the spread of revenge porn targeting children and teenagers on Instagram and Facebook by funding a new tool that helps users remove sexually explicit images online.
The National Center for Missing & Exploited Children announced the release of Take It Down on Monday – a site that removes nude, partially nude, or sexually explicit images and videos of children under the age of 18 that have been posted online or are believed to have been posted online.
The tool can also be used by those over 18 to remove explicit pictures taken when they were a minor, and it is available globally.
NCMEC said in its release that Meta provided initial funding for building the infrastructure of the program.
Users can select images and videos on their devices that they don't want posted online, or that have already been posted. Take It Down generates a unique digital fingerprint, called a hash value, for each piece of content, which is then used to identify copies, according to its website.
The hash is added to a secure list and shared with participating companies who will then scan their public and unencrypted platforms for it. If an image or video is identified that matches the hash value, it will be taken down.
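The matching process described above can be illustrated with a minimal sketch. This is not Take It Down's actual implementation (which is not public, and likely uses perceptual hashing so similar images match); the example below uses Python's standard `hashlib` with SHA-256 purely to show the idea of fingerprinting content and checking uploads against a shared hash list.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the content's 'hash value'."""
    return hashlib.sha256(data).hexdigest()

# A shared list of flagged fingerprints, built from user submissions.
# Only the hashes are shared -- never the images themselves.
hash_list = set()

original = b"raw bytes of an image the user wants removed"
hash_list.add(fingerprint(original))

def is_flagged(upload: bytes) -> bool:
    """A participating platform checks an upload against the list."""
    return fingerprint(upload) in hash_list

print(is_flagged(original))   # exact copy matches: True
altered = original + b"\x00"  # any alteration changes the hash
print(is_flagged(altered))    # altered copy does not match: False
```

Note that with an exact cryptographic hash like this, even a one-byte change defeats the match, which is why an altered image needs a new fingerprint.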
Other platforms that have signed on to the program include Pornhub, OnlyFans, MindGeek, and Yubo.
“We created this system because many children are facing these desperate situations,” Michelle DeLaune, president and CEO of NCMEC, said in the release. “Our hope is that children become aware of this service, and they feel a sense of relief that tools exist to help take the images down.”
But there are caveats to this tool. According to The Associated Press, if someone were to alter the image – by cropping it or turning it into a meme, for example – it becomes a new image and therefore needs a new hash. Moreover, images will still appear on sites that haven’t signed up for this service.
Meta’s global head of safety, Antigone Davis, said in a separate release shared with Insider: “Meta has worked with NCMEC, experts, and victims to develop this platform and help young people get the resources they need when facing these horrific situations. We look forward to other tech companies joining this effort so we can collectively combat this issue across the internet.”
Insider reached out to NCMEC and Meta for further comment but did not immediately hear back.
Meta recorded more child sexual abuse material on its platforms than any other tech company in 2019 and was responsible for 99% of all reports to the NCMEC at the time. It also detected over 20 million images of child sex abuse on Instagram and Facebook in 2020.
Around 15,000 people globally work as moderators for Facebook and Instagram, most of them contracted through third-party firms, according to a 2020 report from the New York University Stern Center for Business and Human Rights. Jennifer Grygiel, a social media scholar at Syracuse University quoted in the report, described these numbers as “woefully inadequate.”
“To get safer social media, you need a lot more people doing moderation,” she said.
There have also been complaints that Facebook has treated its moderators poorly in recent years.
One moderator, hired through an outsourcing company in Kenya, told Insider that he had to sift through traumatic content including beheadings, sexual exploitation of children, graphic violence, and more, and was only paid $1.50 an hour.