How Facebook Is Ramping Up Its Efforts to Detect and Curtail the Spread of Revenge Porn

Machine learning, artificial intelligence and a new victim support hub

Not Without My Consent is a new victim support hub within Facebook’s Safety Center
Facebook

Facebook is using new detection technology powered by machine learning and artificial intelligence to more quickly sniff out revenge porn—intimate images shared without the subjects' consent—and it has also created a new support hub for victims of this type of activity.

In the past, the social network has relied on people reporting images of this type, as well as photo-matching technology to prevent reported images from being reshared.

Global head of safety Antigone Davis said in a Newsroom post that with its new technology, Facebook can more proactively detect near-nude images or videos shared on its platform or Instagram, often before those images are reported.

She stressed that the rapid discovery of this content is important, saying, “Often, victims are afraid of retribution, so they are reluctant to report the content themselves, or they are unaware that the content has been shared.”

Once content has been identified, specially trained members of the social network’s community operations team will review it and determine whether Facebook’s community standards have been violated.

Davis said that in most cases, accounts that share intimate content without permission will be disabled, noting that an appeals process is available if someone believes their account was disabled in error.

She also referred to a pilot program Facebook began in November 2017 with victim advocate organizations, whereby people have access to “an emergency option to securely and proactively submit a photo to Facebook,” after which digital fingerprints of those images are created to prevent them from being shared further on the social network’s platform.
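Facebook has not published the details of its fingerprinting system, but the general technique—storing a compact hash of a reported image rather than the image itself, then comparing new uploads against that blocklist—can be sketched with a simple difference hash ("dHash"). Everything below (the tiny pixel grids, the threshold, the function names) is illustrative, not Facebook's actual implementation:

```python
# Illustrative sketch of perceptual-hash matching, the general idea
# behind "digital fingerprints" of reported images. Not Facebook's
# actual system; all names and values here are hypothetical.

def dhash(pixels):
    """Compute a row-wise difference hash of a grayscale pixel grid.

    Each bit records whether a pixel is brighter than its right-hand
    neighbor, so the hash captures structure, not exact pixel values.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(candidate_hash, known_hashes, threshold=1):
    """Flag the candidate if it is within `threshold` bits of any
    stored fingerprint (tolerating minor re-encoding changes)."""
    return any(hamming(candidate_hash, h) <= threshold
               for h in known_hashes)

# Only the fingerprint of a reported image is stored, not the image.
reported = [[10, 20, 30], [90, 80, 70]]
blocklist = {dhash(reported)}

# A slightly re-encoded copy preserves the brightness structure
# and still matches the stored fingerprint.
copy = [[11, 21, 31], [91, 81, 71]]
print(matches_known(dhash(copy), blocklist))   # True

# An image with a different structure does not match.
other = [[9, 5, 1], [1, 5, 9]]
print(matches_known(dhash(other), blocklist))  # False
```

The key privacy property this sketch shows is that the platform can block resharing by retaining only the hash: the original image is not recoverable from the fingerprint.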

Davis said this program will be expanded over the coming months.

Facebook also rolled out Not Without My Consent, which Davis described as a victim support hub within the social network’s Safety Center.

Davis said Not Without My Consent was developed with experts, and it gives victims access to organizations and resources, as well as steps they can take to remove offending content and ensure that it is not reshared.

She added that Facebook plans to simplify its reporting process for this type of content over the coming months, as well as to build a victim support toolkit to provide more local and culturally relevant support.

The toolkit is being created in partnership with the Revenge Porn Helpline in the U.K., the Cyber Civil Rights Initiative in the U.S., the Digital Rights Foundation in Pakistan, SaferNet in Brazil and Prof. Lee Ji-yeon in South Korea.

In a separate Newsroom post, Facebook head of product policy research Radha Iyengar and global safety policy programs manager Karuna Nain detailed research conducted by the social network on how it handles these types of situations, and based on that research, Facebook will focus on three key principles:

  • Build clear, accessible tools to support victims in reporting a violation.
  • Develop prevention methods such as tools to report and proactively block someone from sharing non-consensual images.
  • Give victims the power they need over their online space to feel safe.

They wrote, “Over the past year, we’ve conducted our own research and partnered with many international safety organizations to review and improve our response to the sharing of what we call non-consensual intimate images anywhere on Facebook, Messenger or Instagram. We tried to understand the experience of victims, how victims reported their experience, what barriers arose when they made a report and what support or tools they needed to feel safe on our platform.”

Iyengar and Nain continued, “We interviewed victim support advocates and victims themselves from around the world, including Kenya, Denmark and the U.K. Last summer we brought together over 20 academics and nonprofit leaders from 10 countries to improve our tools and understanding of how to support victims. This included educational information about NCII, information on where victims can go for help and psychosocial support for those who had been victimized. For everyone, we instructed them on what precautions people can take on Facebook and other platforms to reduce their chances of being victimized.”