Meta, together with the British hotline Revenge Porn Helpline, has launched the StopNCII.org platform. Its launch has been supported by more than 50 non-governmental organizations around the world, reports Media Sapiens.
NCII stands for “non-consensual intimate images”, i.e. intimate photos or videos shared without the subject’s consent. Meta believes that publishing such images without consent can ruin a person’s life.
The company invites users whose intimate videos or photos have been published, or could be uploaded, to Facebook or Instagram to select those files so the system can find them quickly. Each file is converted into a digital fingerprint, which the social networks then use when searching for matches.
Meta assures users that the image remains on their device; it is not uploaded or transferred anywhere. Hashing (converting input data into a “hash”, which protects the data’s integrity against third-party changes) is a one-way operation: it is impossible to restore the original image from its hash.
After submitting a photo, the user creates a unique PIN with which they can track the status of the case, notes NIX Solutions.
StopNCII.org clarifies that the system matches only original photos. If a file has been modified, filtered, or cropped, the algorithms may not recognize it, so a separate case must be created for the altered image.
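Why an edited file may not match can be shown with the same hashing idea (again assuming a cryptographic hash for illustration; the article does not specify StopNCII's matching algorithm). Any change to the file's bytes, however small, produces a completely different digest:

```python
import hashlib

original = b"stand-in bytes of the original photo"
# Re-saving after a crop or filter changes the file's bytes:
edited = original + b"\x00"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(edited).hexdigest()

# A single changed byte yields an entirely different digest, so an
# exact hash comparison fails and the edited file needs its own case.
assert h1 != h2
```

This is why the platform asks users to submit the altered version separately rather than relying on the original's fingerprint.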