Facebook is asking its users to send the company nude photos of themselves, as part of a new effort to combat non-consensual sharing of intimate images.

The programme, being piloted in Australia with the office of the country’s eSafety Commissioner, is an attempt to give an element of control back to individuals who may face revenge-porn abuse – where an ex-partner shares sexual imagery without the person’s permission.
If a user is worried that intimate photos may be shared online against their will, they can use Facebook Messenger to have the images turned into a “hash”. This is a unique digital fingerprint that can be used to identify the image and block any subsequent attempt to upload it on Facebook’s platforms, including Messenger, Instagram and Facebook Groups.
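Facebook has not said exactly which hashing algorithm the pilot uses; comparable systems, such as Microsoft’s PhotoDNA, rely on “perceptual” hashes that survive resizing and recompression, rather than exact cryptographic digests that change with a single altered pixel. The sketch below illustrates one simple perceptual fingerprint, the so-called average hash – the function names and the matching threshold are ours, not Facebook’s.

```python
# Illustrative only: a simple "average hash" perceptual fingerprint.
# This is NOT Facebook's published algorithm; it shows the general idea
# of matching near-duplicate images rather than exact byte copies.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Return a 64-bit fingerprint: shrink, greyscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests the same underlying image."""
    return bin(a ^ b).count("1")
```

Because the fingerprint is derived from the image’s overall structure, a recompressed or slightly resized copy hashes to a nearby value, so near-matches can be caught that an exact checksum would miss.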
Australia’s eSafety commissioner, Julie Inman Grant, told ABC it would allow victims of “image-based abuse” to take action, noting that one in five Australian women aged 18-45 have had intimate photos shared without their consent.
“We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly,” Inman Grant said.
Until now, Facebook users have been told to report images to “specially trained representatives” on the company’s community team. If an image violates Facebook’s terms and conditions it is taken down, and photo-matching techniques are used to prevent it from being uploaded again.
This hashing technique aims to streamline that considerably, stopping malicious parties from uploading the pictures in the first place. In a blogpost, Facebook describes it as “an emergency option for people to provide a photo proactively to Facebook”. Users first complete an online form on the eSafety website, then send the pictures they are worried about to Facebook via Messenger, while the government agency notifies the social network about the incoming submission. A Facebook community operations analyst will then find the image and hash it.
According to the company, after the image has been converted into a hash code, the social network will store the picture for a short amount of time before automatically deleting it.
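This is the key privacy property: once the fingerprint is on file, the photo itself is no longer needed. Below is a hypothetical sketch of what the upload-time check could look like, reusing the average_hash and hamming_distance helpers from the sketch above; the blocklist structure, threshold and function names are our assumptions, not Facebook’s implementation.

```python
# Hypothetical upload-time check, reusing average_hash and hamming_distance
# from the earlier sketch. The set, threshold, and function names are our
# own assumptions; Facebook has not published its matching pipeline.
BLOCKED_HASHES: set[int] = set()  # persisted fingerprints only, never images
MATCH_THRESHOLD = 5               # max differing bits still treated as a match

def register_report(photo_path: str) -> None:
    """Fingerprint a reported photo; the photo itself can then be deleted."""
    BLOCKED_HASHES.add(average_hash(photo_path))

def allow_upload(photo_path: str) -> bool:
    """Permit an upload only if its fingerprint is far from every blocked one."""
    h = average_hash(photo_path)
    return all(hamming_distance(h, b) > MATCH_THRESHOLD for b in BLOCKED_HASHES)
```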
“These tools, developed in partnership with global safety experts, are one example of how we’re using new technology to keep people safe and prevent harm,” said Facebook’s head of global safety, Antigone Davis, in a statement.
An NSPCC spokesperson told Alphr: “Children as well as adults can become victims of revenge porn as images they have shared can be used against them, causing distress and humiliation. It is great that Facebook are looking at using photo-matching techniques to prevent an image from being uploaded again and this could be an effective way to fight non-consensual sharing of images which affects children and adults.
“The process for users requesting removal of content needs to be simple and straightforward – it must be as easy to get content removed as it is to upload and share it. This initiative is a welcome step and has the potential to be rolled out across platforms.”
Danielle Keats Citron, Morton & Sophia Macht professor of law at the University of Maryland Carey School of Law – and author of Hate Crimes in Cyberspace – commended Facebook’s work with victim support groups: “This is a complex challenge and they have taken a very thoughtful, secure, privacy sensitive approach at a small scale with victim advocates on the frontline. They are working to identify the best way to help people in a desperate situation regain control and prevent abuse that has severe consequences, including the loss of employment, loss of friends, not to mention the intended embarrassment and humiliation.”
While the pilot is limited to Australia, Facebook has said it is exploring similar partnerships in other countries.