In the weeks before Saturday's Unite the Right rally, organizers used Facebook to coordinate the event. Hundreds of people had indicated they would attend, and many more said they were interested.
When the situation in Charlottesville, Virginia, began to make national news, Facebook took the event's page down.
Facebook (FB) has struggled with how to handle hate groups on its platform. Like many other tech companies facing additional scrutiny this week, it has intensified its policing of hate groups.
Since Saturday, Facebook has taken down at least eight pages that had previously been active on the site, including pages for Vanguard America, White Nationalist United and Genuine Donald Trump. It said it took down the event page for the rally after becoming aware of the threat of real-world harm.
"Across the entire tech industry, there's been a lot more attention and care paid to it than they gave it months ago," said Keegan Hankes, an analyst at the Southern Poverty Law Center, which tracks hate groups.
One by one, tech companies have turned their backs on white supremacist organizations this week. GoDaddy and Google stopped hosting the Daily Stormer, a popular Neo-Nazi website. PayPal said it would not process payments for hate groups, and GoFundMe has banned fundraisers for white supremacists, including James Fields, the man accused of driving his car into a group of protesters and killing one person.
In addition to removing groups and events, Facebook is paying closer attention to shared links. It removed a link to a Daily Stormer article attacking the victim of Saturday's violence in Charlottesville, which had already been shared more than 65,000 times. Now Facebook is scanning all posts with that link and allowing only those that include a comment condemning the article.
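In engineering terms, this amounts to a conditional filter keyed on a flagged URL. The sketch below is a minimal Python illustration of that logic; the blocklist entry, the keyword heuristic and every name in it are assumptions made for illustration, since Facebook has not published how its actual system works.

```python
# Hypothetical sketch of link screening: a post sharing a flagged link
# is allowed only if the user's own comment appears to condemn it.
# The blocklist and term list below are placeholders, not real data.

FLAGGED_LINKS = {"example-flagged-article-url"}  # hypothetical blocklist entry

CONDEMNATION_TERMS = {"condemn", "disgusting", "hateful", "shameful", "vile"}

def allow_post(comment: str, shared_link: str) -> bool:
    """Allow a post sharing a flagged link only when the accompanying
    comment appears to condemn the linked article."""
    if shared_link not in FLAGGED_LINKS:
        return True  # unflagged links pass through untouched
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return bool(words & CONDEMNATION_TERMS)

# Shared with condemnation -> allowed; shared approvingly -> blocked.
print(allow_post("This article is disgusting and hateful.",
                 "example-flagged-article-url"))  # True
print(allow_post("Great read!",
                 "example-flagged-article-url"))  # False
```

A real system would presumably use trained classifiers rather than a keyword list, but the control flow, scan every post carrying the link and gate it on the tone of the commentary, is the same.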
"We've always taken down any post that promotes or celebrates hate crimes or acts of terrorism -- including what happened in Charlottesville," said Facebook CEO Mark Zuckerberg on Wednesday in a post condemning white supremacists. "With the potential for more rallies, we're watching the situation closely and will take down threats of physical harm."
Civil rights groups worry that the increased willingness of tech companies to crack down on hate groups could have a chilling effect on free speech.
"Do we the people really want private entities calling the shots as to who can or can't participate in the discussion on the internet?" said David Snyder, executive director of the First Amendment Coalition. "There's a risk that the pendulum could swing too far and those who are now cheering... might find that their speech ... ultimately is banned."
For groups like the Southern Poverty Law Center, the actions are long overdue. Twice last year, the SPLC gave Facebook a list of more than 200 hate groups it had found on the site. The list included pages devoted to Holocaust denial as well as to white nationalist, anti-Muslim, black separatist and Neo-Nazi groups. Facebook has removed 57 of the groups on the list.
"They're not doing nearly enough to combat organized hate groups on the platform," said Hankes.
Facebook says it has its own internal guidelines for what constitutes a hate group. Simply being a white supremacist or identifying as "alt-right" doesn't necessarily qualify. A person or group must threaten violence, declare it has a violent mission or actually take part in acts of violence.
"It's important that Facebook is a place where people with different views can share their ideas. Debate is part of a healthy society," said Zuckerberg in his post.
To find people or groups that violate its community standards, Facebook automatically scans posts, including conversations in private and hidden groups. Its technology, which is still in the early stages, flags slurs and violent language. Flagged posts are then reviewed by people on the community operations team, who determine whether they cross the line. Users can also flag posts, but that is less likely to happen in closed groups of like-minded members.
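The pipeline described here is a two-stage design: an automated scan flags candidates, and humans make the final call. Below is a minimal sketch of that flow, assuming a simple keyword trigger; the term list, data types and function names are all hypothetical stand-ins, since Facebook's real system reportedly relies on trained classifiers whose details are not public.

```python
from dataclasses import dataclass
from queue import Queue

# Placeholder violent-language terms; a production system would use
# machine-learned classifiers rather than a hand-written word list.
FLAG_TERMS = {"exterminate", "slaughter"}

@dataclass
class Post:
    author: str
    text: str

review_queue: Queue = Queue()  # hands flagged posts to human reviewers

def scan_post(post: Post) -> None:
    """Automated first pass: flag posts containing violent language and
    queue them for the community operations team. The machine only
    flags; a human reviewer decides whether the post crosses the line."""
    words = {w.strip(".,!?").lower() for w in post.text.split()}
    if words & FLAG_TERMS:
        review_queue.put(post)

scan_post(Post("user123", "We should exterminate them."))
print(review_queue.qsize())  # 1 -- awaiting human review
```

Keeping the final decision with human reviewers is what lets the automated stage run with immature, high-recall technology: a false flag costs a reviewer's time rather than a wrongly deleted post.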
Facebook deletes around 66,000 posts a week that it deems to contain hate speech, the company said in a recent blog post. But groups will continue to exist on the platform and use it to communicate and organize. The platform is an especially important tool for hate groups because it lets them recruit new members and spread propaganda, said Hankes.
"They want to be where the normal people are. They don't want to be cordoned off," said Hankes. "They don't want to talk to a vacuum."
Hate groups and activists will be watching closely to see whether Facebook and other tech companies continue to crack down on hate speech. Hankes predicts white supremacist activity will only increase in the near future.
"The rhetoric is more extreme than I've seen it in a long time."