Facebook receives more than one million reports of violations from users every day.
That's according to Monika Bickert, Facebook's head of policy management.
Bickert spoke to the fine (and imperfect) line between free speech and hate speech at SXSW's first Online Harassment Summit on Saturday. She told CNNMoney she didn't know offhand what percentage of those reports are serious or result in content being taken off the site.
The panel centered on how far tech companies could -- and should -- go in removing potentially harmful content on their platforms.
"You can criticize institutions, religions, and you can engage in robust political conversation," said Bickert, of where Facebook draws the line. "But what you can't do is cross the line into attacking a person or a group of people based on a particular characteristic."
Bickert said crafting the policy is "tricky," especially given that 80% of Facebook's 1.6 billion users are outside of the U.S. and likely have different views on what content might be offensive or threatening.
But the most challenging part is enforcement, she said.
Bickert told CNNMoney that Facebook prioritizes the review of posts inciting physical harm -- but all reports of violations are reviewed by trained Facebook employees.
She said she often gets asked why the company doesn't have its "world-class engineers" tackle hate speech "proactively and perfectly."
"When it comes to hate speech, it's so contextual ... We think it's really important for people to be making that decision," she said, adding that, one day, automation could play a bigger role.
She noted that the number of reported violations has been "steadily increasing" as Facebook has allowed users to flag them from all devices.
Other panelists included Juniper Downs of Google (GOOG), Lee Rowland of the ACLU, Deborah Lauter of the Anti-Defamation League, and the National Constitution Center's Jeffrey Rosen.
Rosen spoke to the "tremendous pressures" tech companies face to "adopt a more European" approach to free speech, under which anything that's offensive to a person's "dignity" can be grounds for removal.
But that approach opens up the possibility that not just individuals but also governments could request that content be removed.
"It's messy," Rosen added. "As a society, we have to decide what do we value more -- privacy and dignity, or free expression?"
Rowland, meanwhile, said she'd had a blog post removed from Facebook because it contained a photo of a nude statue. (Bickert later said this wasn't a violation of Facebook's policies -- it was a mistake.)
Rowland said she knew who to call to find out why it was removed -- but most people don't.
"For the average user, there's an incredible black space," she said, pleading for tech companies to be more transparent about their policies.
"People don't clearly understand why their speech may have been taken down," added Rowland. "It's ultimately not going to be a good business plan if people don't know where that [free speech] stops."