Europe is telling tech companies to take down terrorist content within an hour of it being flagged -- or face sweeping new legislation.
The European Union, which published a broad set of recommendations on Thursday, said that tech companies would have three months to report back on what they were doing to meet the target.
"While several platforms have been removing more illegal content than ever before ... we still need to react faster," Andrus Ansip, a European Commission official who works on digital issues, said in a statement.
The Commission said that tech companies should employ people to oversee the process of reviewing and removing terrorist content.
If there is evidence that a serious criminal offense has been committed, the companies should promptly inform law enforcement.
The guidelines could form the basis of new legislation that would heap new demands on social media companies operating in Europe, where regulators have taken a tougher approach to the industry.
Facebook, Twitter, YouTube and Microsoft (MSFT) all agreed in 2016 to review and remove the majority of hate speech -- including racist, violent or illegal posts -- within 24 hours.
Tech industry 'dismayed'
EDiMA, an industry association that includes Facebook (FB), YouTube parent Google (GOOGL) and Twitter (TWTR), said it was "dismayed" by the Commission's announcement.
"Our sector accepts the urgency but needs to balance the responsibility to protect users while upholding fundamental rights -- a one-hour turn-around time in such cases could harm the effectiveness of service providers' take-down systems rather than help," it said in a statement.
Facebook said that it shares the European Commission's goal.
"We have already made good progress removing various forms of illegal content," the company said in a statement. "We continue to work hard to remove hate speech and terrorist content while making sure that Facebook remains a platform for all ideas."
Twitter and Google did not immediately respond to requests for comment.
European lawmakers are concerned that social media platforms can be used to spread extremist content and influence elections on the continent. Some have called for tech companies to be made legally responsible for content on their platforms.
But critics say that a heavy-handed approach could restrict the free speech of Europeans.
Joe McNamee, executive director of European Digital Rights, described the Commission's proposal as "voluntary censorship."
"Today's recommendation institutionalizes a role for Facebook and Google in regulating the free speech of Europeans," he said in a statement. "The Commission needs to be smart and to finally start developing policy based on reliable data and not public relations spin."