How and when Facebook removes offensive content is once again being called into question.
This week, Hayden Brien, an Australian man, reported to Facebook a post shared in a private men's group on the platform, which he believed constituted non-consensual pornography.
Brien said the post, shared among the community of roughly 14,500 men, included screenshots of a man engaging in sexual acts with a woman. The caption on the post, published on Sunday and reviewed by CNN Tech, read: "What is the biggest whale that you have harpooned? I went through a tubby phase and landed this 130kg beast."
Brien, who was a member of the private group, flagged the post to Facebook on Tuesday, believing it violated the company's community standards. He said he received an email from Facebook indicating the post didn't breach its guidelines. At that point, he posted about it on his personal page. "For too long these issues get swept to the side," he told CNN Tech.
"It's a matter of promoting rape culture," Brien, 20, added, noting that men were adding comments like "Any more big girl pics lads?" and "Legend!!" Brien has in turn become a target, noting that he and several of his friends were almost immediately ousted from the group after he posted about the picture. He said he's received profane messages and death threats on Facebook for speaking out.
"We are investigating what happened here to ensure that our policies are applied consistently with respect to this type of content," a Facebook spokeswoman told CNN Tech. The company said it believes the original image is no longer on its platform.
In addition to banning nudity on its platform, Facebook says it removes intimate images shared without consent as it becomes aware of them. But last week, The Guardian reported on leaked Facebook documents showing how its content moderation policies are actually implemented. Those documents define revenge porn as "attempts to use intimate imagery to shame, humiliate or gain revenge against an individual."
According to Dr. Mary Anne Franks, the Cyber Civil Rights Initiative's legislative and tech policy director, that definition is "pretty illuminating and extremely disappointing."
"It's another way of saying that it is not abuse if it's motivated by something else -- greed, voyeurism, or, in the case of 'secret groups,' some kind of grotesque male bonding," added Franks, who has been working with Facebook to address revenge porn. "It essentially defines the abuse in such a way that 'secret' groups can never be guilty of it. By hiding their activities from their victims, these groups demonstrate that their intent is not to 'shame or embarrass' their victims."
Brien isn't the first to go public about non-consensual images being shared in a private Facebook group. Closed Facebook groups were at the center of both the Penn State fraternity case, in which men allegedly posted compromising pictures of women to a private Facebook page, and the Marines' nude photo scandal.
Facebook also isn't the only platform grappling with hidden groups sharing content that could violate its standards.
According to attorney Carrie Goldberg, who focuses on online and offline sexual violence, her firm discovered a private Twitter account with 125,000 followers this week that was dedicated to posting non-consensual pornography of undergraduates. Once Twitter was alerted, Goldberg said, the company swiftly took action and banned the account. Twitter did not respond to a request for comment.
For Facebook, content moderation is partly a volume problem: the company receives millions of reports each week and recently announced it is hiring 3,000 more people to sift through flagged posts. But the system will remain imperfect because it relies, in part, on humans. Facebook says it is applying image matching technology to prevent flagged images that moderators have deemed offensive from being shared again on its platforms.
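Facebook hasn't said publicly how that image matching works, but systems of this kind are often built on perceptual hashing: each removed image is reduced to a compact fingerprint that stays roughly stable under re-encoding or small edits, and new uploads are checked against a blocklist of those fingerprints. The Python sketch below is only a minimal illustration of that general idea using a simple average hash; the `Blocklist` class, the distance threshold, and the toy 8x8 grayscale "images" are hypothetical and not drawn from Facebook's actual system.

```python
# Minimal sketch of perceptual-hash blocklisting (illustrative only; not
# Facebook's actual implementation). Images are modeled as 8x8 grids of
# grayscale values so the example has no external dependencies.

def average_hash(pixels):
    """Return a 64-bit fingerprint: each bit records whether a pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

class Blocklist:
    """Fingerprints of images moderators have already removed."""

    def __init__(self, max_distance=5):
        self.hashes = set()
        self.max_distance = max_distance  # tolerance for re-encoded copies

    def add(self, pixels):
        self.hashes.add(average_hash(pixels))

    def is_blocked(self, pixels):
        h = average_hash(pixels)
        return any(hamming_distance(h, known) <= self.max_distance
                   for known in self.hashes)

# Usage: a removed image is fingerprinted once; a near-identical re-upload
# (slightly brighter pixel values) is still caught.
removed = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
reupload = [[min(255, v + 3) for v in row] for row in removed]

blocklist = Blocklist()
blocklist.add(removed)
print(blocklist.is_blocked(reupload))  # True
```

In production systems the fingerprints are far more robust than this simple average hash (Microsoft's PhotoDNA is the best-known example), but the blocklist-plus-distance-threshold structure is the same basic shape.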