Revenge porn is becoming an increasing problem, but Facebook is doing all it can to stamp it out.
The company is testing a new feature that will enable users to upload their own intimate images so that they can be quickly flagged.
Currently, the feature is being tested in Australia, although the social media giant plans to expand the scheme if it proves successful.
The new system is based on a similar program used to detect child pornography, which is currently used by Google, Twitter and Instagram, amongst others.
Although uploading images that you don’t want anyone else to see may seem strange, it makes a lot of sense. The program asks users to send the image to themselves via Messenger.
The user then reports the image, and Facebook uses a cryptographic signature to identify it, meaning that no one else can upload it. The signature will also stop the image being shared privately through Messenger.
The move follows on from Facebook’s update earlier in the year which allowed users to report images that they thought were revenge porn.
The new system is very much in its infancy, and Facebook assures users that none of the images are stored. The cryptographic signature, known as a hash, allows Facebook to identify inappropriate images without needing to store them in a database.
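The general idea of matching images by hash rather than storing them can be sketched in a few lines of Python. This is only an illustrative sketch, not Facebook's actual system: it uses a plain SHA-256 digest over the file bytes, which only catches byte-identical copies, whereas real photo-matching systems use perceptual hashes that also tolerate resizing and re-encoding. The `HashBlocklist` class and its method names are hypothetical.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Compute a digest of the image bytes. Only this short
    hex string is retained; the image itself is never stored."""
    return hashlib.sha256(image_bytes).hexdigest()


class HashBlocklist:
    """Hypothetical store of digests for reported images."""

    def __init__(self) -> None:
        self._digests: set[str] = set()

    def report(self, image_bytes: bytes) -> None:
        # Keep only the fingerprint of the reported image.
        self._digests.add(fingerprint(image_bytes))

    def is_blocked(self, image_bytes: bytes) -> bool:
        # A later upload is checked against stored digests,
        # without any copy of the original image on file.
        return fingerprint(image_bytes) in self._digests


blocklist = HashBlocklist()
reported = b"...bytes of a victim-reported image..."
blocklist.report(reported)
print(blocklist.is_blocked(reported))      # True: identical bytes match
print(blocklist.is_blocked(b"other image"))  # False: different image
```

The key property the article describes falls out of the design: the blocklist holds only fixed-length digests, so matching can happen at upload time without a database of the images themselves.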
The feature appears to have generated a lot of interest, with the Australian government’s e-Safety Commissioner agreeing to help guide people through the process.
Other social media sites, as well as tech companies such as Google and Microsoft, have taken significant steps to tackle the issue of revenge porn.
Source: ABC News