Facebook releases new tools to fight "revenge porn"

Facebook announced on Wednesday that it has rolled out a new set of tools to prevent "revenge porn" (sexually explicit or intimate images shared without the consent of the person pictured, often by a partner or ex) from being distributed on the social media platform. The tools will be used across Facebook, Messenger and the Facebook-owned photo-sharing app Instagram.

Impact

Revenge porn is a real problem in the digital age (there are around 2,000 revenge porn websites around the world) and one that overwhelmingly affects women. According to data from the Cyber Civil Rights Initiative, 90% of victims are women. The research also spotlights how revenge porn affects victims: 93% reported "significant emotional distress," 82% reported "significant impairment in social, occupational or other important areas of functioning," 51% said they had experienced suicidal thoughts as a result of being a victim and 42% sought out psychological services.

A new way to report revenge porn

Facebook's head of Global Safety, Antigone Davis, outlined how the platform will tackle revenge porn in a new blog post. When a user spots a sensitive image that appears to have been shared without consent, they can report it. Once reported, "specially trained representatives" from Facebook's Community Operations team will review the image. If it violates the company's Community Standards, the image will be removed and the account that shared it will be disabled.
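
The flow Davis describes is a simple pipeline: a report goes to a human reviewer, and a confirmed violation results in both removal of the image and disabling of the account that shared it. The Python sketch below models only that reported decision path; the names and data structures are hypothetical illustrations, not Facebook's internal systems.

    # Schematic model of the reporting flow described above (hypothetical names).
    from dataclasses import dataclass, field

    @dataclass
    class ModerationLog:
        removed_images: set = field(default_factory=set)
        disabled_accounts: set = field(default_factory=set)

    def handle_report(log: ModerationLog, image_id: str, uploader: str,
                      violates_standards: bool) -> str:
        """Apply a reviewer's decision on a reported image.

        `violates_standards` stands in for the judgment of the "specially
        trained representatives" on the Community Operations team.
        """
        if violates_standards:
            log.removed_images.add(image_id)       # take the image down
            log.disabled_accounts.add(uploader)    # disable the sharing account
            return "removed_and_account_disabled"
        return "no_action"

    # Example: a confirmed violation removes the image and disables the account.
    log = ModerationLog()
    print(handle_report(log, "img123", "user456", violates_standards=True))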

Removing one image from a particular account doesn't guarantee the photograph won't be uploaded again. That's where Facebook's "photo-matching technologies" come in: if and when another user tries to share an image that has already been removed, they will be alerted that the photograph violates Facebook's policies and will not be able to share it. Additionally, by partnering with safety organizations, Facebook plans to offer support and resources to victims of revenge porn.
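
Facebook has not published the details of its photo-matching system, but the general technique is well understood: compute a perceptual hash (a compact fingerprint that changes little under re-encoding, resizing or minor edits) of every removed image, then compare new uploads against that store. The sketch below illustrates the idea using the open-source Pillow and imagehash libraries; the library choice, function names and distance threshold are illustrative assumptions, not Facebook's implementation.

    # Minimal sketch of perceptual-hash matching, the general technique behind
    # photo-matching systems. Not Facebook's implementation.
    from PIL import Image
    import imagehash

    # Fingerprints of images that reviewers have already removed (hypothetical store).
    removed_hashes = []

    def register_removed_image(path: str) -> None:
        """Record the perceptual hash of an image that was taken down."""
        removed_hashes.append(imagehash.phash(Image.open(path)))

    def is_reupload(path: str, max_distance: int = 5) -> bool:
        """Check whether a new upload is close to a previously removed image.

        A small Hamming distance between perceptual hashes suggests the same
        underlying photo, even after compression or small edits.
        """
        candidate = imagehash.phash(Image.open(path))
        return any(candidate - known <= max_distance for known in removed_hashes)

    # Usage: register_removed_image("removed.jpg"); is_reupload("new_upload.jpg")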

Earlier in 2017, Facebook CEO Mark Zuckerberg wrote about the company's efforts to build a safer community and how technology can provide a valuable set of tools:

There are billions of posts, comments and messages across our services each day, and since it's impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events — like suicides, some livestreamed — that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more.

Artificial intelligence can help provide a better approach. We are researching systems that can look at photos and videos to flag content our team should review. This is still very early in development, but we have started to have it look at some content, and it already generates about one-third of all reports to the team that reviews content for our community.

Human-powered moderation

While Facebook is using artificial intelligence to detect and report offensive content, the company is leaving revenge porn to human discretion. "At this moment, we're not using AI to go through this particular content," Davis told TechCrunch. "There is significant context that's required for reviewing non-consensual sharing."

Facebook isn't the only social media platform to tackle revenge porn. Two years ago, Twitter outlined explicit rules prohibiting the sharing of naked photographs or videos without the permission of the individual in the content. Around the same time, Reddit banned explicit content posted without the subject's permission.

"No matter who you are, if a photograph, video or digital image of you in a state of nudity, sexual excitement or engaged in any act of sexual conduct, is posted or linked to on Reddit without your permission, it is prohibited on Reddit," Reddit wrote in an announcement. "We also recognize that violent personalized images are a form of harassment that we do not tolerate and we will remove them when notified."

Unanswered questions

Facebook's new tools are a step in the right direction, though questions remain. While a special team of humans will review reported photographs, Facebook has not revealed whether the size of that team has grown or how it will handle the volume of reported content. What's more, while the photo-matching technology is potentially an efficient way to prevent further sharing of explicit images, its effectiveness remains to be seen. After all, Facebook has wrongfully taken down or banned images several times in the past, from a photo of a bronze Neptune statue in Italy and mannequin photos to an iconic Vietnam War photo.