Take It Down, a new online tool, has been launched to assist people in removing explicit images and videos of themselves from the internet. The National Center for Missing and Exploited Children operates the tool, which is partially funded by Meta Platforms, the owner of Facebook and Instagram.
The tool lets anyone anonymously create what is essentially a digital fingerprint of an image, without uploading the image itself. This fingerprint (a unique string of numbers known as a “hash”) is then added to a database, and the tech companies that have agreed to participate in the project use it to find and remove matching images from their services.
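Take It Down does not publish the exact hashing code it runs, but the core idea, deriving a short, fixed-length fingerprint from a file without ever transmitting the file, can be sketched with an ordinary cryptographic hash in Python. The file name and function below are illustrative assumptions, not part of the actual service.

```python
import hashlib

def fingerprint_file(path: str) -> str:
    """Compute a fixed-length fingerprint (here, a SHA-256 hash) of a file.

    Only the resulting hex string would ever be submitted; the file itself
    never leaves the device.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos do not have to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: hash a local photo before reporting it.
print(fingerprint_file("photo.jpg"))
```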
The Take It Down tool was created to combat sexual exploitation and to help minors get inappropriate photos and videos of themselves removed from the internet. It allows users from anywhere in the world to report nude, partially nude, or sexually explicit photos and videos that appear online and depict a child under the age of 18.
According to the official blog post, Take It Down works by assigning a hash value, a unique digital fingerprint, to specific images and videos. Once an online platform signs up with Take It Down, it receives these hash values so it can detect and remove matching photos and videos from its websites and apps. The entire process happens without the image or video ever leaving the person's device or being viewed by anyone.
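On the platform side, the matching step amounts to checking newly uploaded content against the list of reported hashes. The sketch below is a rough illustration under that assumption; the hash values, file name, and function names are placeholders, not the infrastructure any participating company actually uses.

```python
import hashlib

# Hypothetical hash values obtained from the Take It Down database.
reported_hashes = {
    "9d2f1c0e5ab74c1883f6a0d2b4e7c911aa23f0d6c8b15e42d7a90cfe31b86a55",
}

def hash_upload(data: bytes) -> str:
    """Hash an uploaded file's bytes the same way the reporting side did."""
    return hashlib.sha256(data).hexdigest()

def should_flag(data: bytes) -> bool:
    """Return True if an upload matches a hash reported through Take It Down."""
    return hash_upload(data) in reported_hashes

# Example: check a newly uploaded file on ingest.
with open("new_upload.jpg", "rb") as f:
    if should_flag(f.read()):
        print("Match found: queue the upload for removal.")
```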
Furthermore, if the original image is altered, for example by cropping it, adding an emoji, or turning it into a meme, it becomes a new image and therefore requires a new hash. Visually similar images, such as the same photo with and without an Instagram filter, will have similar hashes that differ by only one character.
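Behavior like that, where visually similar images end up with nearly identical fingerprints, is what perceptual hashing schemes are designed to provide; a cryptographic hash, by contrast, changes completely after even a tiny edit. As a rough illustration, assuming two perceptual-style hash strings of equal length, closeness can be measured by counting the positions at which they differ. The example hashes below are made up for demonstration.

```python
def char_distance(hash_a: str, hash_b: str) -> int:
    """Count the positions at which two equal-length hash strings differ."""
    if len(hash_a) != len(hash_b):
        raise ValueError("hashes must be the same length to compare")
    return sum(a != b for a, b in zip(hash_a, hash_b))

# Hypothetical perceptual-style hashes of the same photo, with and
# without a filter applied: they differ in only one position.
original = "a1f09c27d3e845bb"
filtered = "a1f09c27d3e845bf"

print(char_distance(original, filtered))  # -> 1, i.e. nearly the same image
```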
Sources for this piece include reporting from AP News.