Revenge porn is pervasive, and Facebook wants to do its part to stop it from spreading on its platforms.
The term refers to nonconsensual pornography that's distributed online to shame, exploit or extort its victims.
And on Wednesday, the company said it would apply photo-matching to ensure that intimate, nonconsensual images reported once can't be uploaded again through Facebook's properties, including Messenger and Instagram.
Facebook (FB) said that once an image is reported, it is reviewed by the company's community operations team, and photo-matching is then applied.
From there, "if someone tries to share the image after it's been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it," Facebook head of global safety Antigone Davis said in a company blog post.
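Facebook hasn't published the details of its matching technology, but the general approach it describes is straightforward: store a fingerprint of each confirmed image and check new uploads against it. Here's a minimal sketch of that idea in Python; the function names are hypothetical, and the exact SHA-256 digest used here is a stand-in for the robust perceptual hashes (such as Microsoft's PhotoDNA) that production systems use so resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical in-memory blocklist of fingerprints for reported images.
blocked_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint: an exact SHA-256 digest. A production
    # matcher would compute a perceptual hash (e.g. PhotoDNA) so that
    # resized, cropped or re-encoded copies still match.
    return hashlib.sha256(image_bytes).hexdigest()

def block_image(image_bytes: bytes) -> None:
    # Called after a human reviewer confirms a reported image violates
    # policy: only the fingerprint is retained, not the image itself.
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    # Checked on every new upload attempt; False means the share is
    # stopped and the uploader is told the image violates policy.
    return fingerprint(image_bytes) not in blocked_hashes
```

One practical upside of a design like this is that only the fingerprints need to be shared across Facebook, Messenger and Instagram to enforce the block everywhere, without redistributing the offending photo itself.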
A study from the Data & Society Research Institute found that one in 25 people has been a victim of either threats, or actual posts, of revenge porn. The phenomenon is emotionally devastating and has even been linked to publicized suicides brought on by the shame and bullying that often follow.
"It's wrong, it's hurtful, and if you report [revenge porn] to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms," said CEO Mark Zuckerberg in a Facebook post Wednesday afternoon. "We're focused on building a community that keeps people safe."
Facebook partnered with the Cyber Civil Rights Initiative to develop its approach, and also launched "Not Without My Consent," a guide to help people through the process.
"We're very pleased about Facebook's announcement," Dr. Mary Anne Franks, Cyber Civil Rights Initiative's legislative and tech policy director, told CNNTech. "These new tools demonstrate Facebook's leadership and innovation in responding to abuses of technology."
According to Franks, her relationship with the company dates back to 2014, when she was asked to give a presentation about nonconsensual pornography as part of the company's safety series. Facebook sponsored a cross-industry summit on the issue featuring presentations by CCRI in February 2015, Franks said.
"We have been working with Facebook on this issue ever since. In addition to helping them develop reporting and support procedures, we have been urging Facebook (and other companies) to move beyond purely reactive approaches to the problem and to adopt more preemptive measures, such as photo-matching," she said.
There's currently no federal law against revenge porn. Thirty-five states and Washington, D.C., have enacted laws against it, but online harassment laws (which cover revenge porn) are notoriously weak, and penalties rarely match the damage revenge porn creates. For some victims, the only way to get their pictures off the internet has been to copyright their own naked bodies and sue on intellectual property grounds.
The issue is one that's hit Facebook hard, in the form of a lawsuit. In September, Facebook lost its bid to stop a lawsuit by a 14-year-old girl whose naked photo appeared on its site. The girl is suing Facebook and the man who repeatedly posted her photo. At the time, Facebook did not comment on why the image -- once flagged -- wasn't caught by PhotoDNA, a tool used by a number of tech companies, including Twitter (TWTR), to detect and stop the spread of child porn.
The vast majority of revenge porn affects private citizens, but the issue has made headlines as celebrities have fallen victim as well. In August, hackers posted nude photos of comedian Leslie Jones on her web page, prompting federal authorities to investigate. Earlier this month, news surfaced that an ex-boyfriend of actress Mischa Barton was shopping around sexually explicit photos of her.
Some lawmakers have pushed for reform, including Representative Jackie Speier, who proposed the Intimate Privacy Protection Act in July to criminalize revenge porn.