Facebook, Twitter, Microsoft and YouTube must move faster to remove hate speech from their platforms, European officials said Monday.
The four U.S. tech companies made a big public promise in May to review a majority of content flagged by users within 24 hours. Any racist, violent or illegal posts would be deleted, they said.
The European Commission has found that only 40% of hate speech published on Facebook (FB), Twitter (TWTR), Microsoft (MSFT) and Google (GOOGL) platforms was reviewed within 24 hours, according to a Commission official.
A further 43% of content was reviewed within 48 hours, the official said.
The European Commission is due to publish the details of its findings in a report on the companies' performance on Wednesday.
According to the official, almost a quarter of the flagged content was related to antisemitism and a fifth to anti-Muslim hatred.
The report will be discussed by European justice officials on Thursday.
None of the four companies would comment on the European Commission report.
The Commission found stark differences in how Facebook, Twitter and others tackle hate speech across Europe.
While the companies managed to remove more than 50% of reported hate speech in Germany and France within 24 hours, the picture was much worse elsewhere: only 4% of hate speech posts were removed in Italy, and 11% in Austria.
Social media companies have come under increasing scrutiny recently over how they handle such content.
Prosecutors in Germany are investigating Mark Zuckerberg over hate speech complaints, and the U.K. government launched a plan to tackle hate speech in July after a reported rise in hate crime following the Brexit referendum.
Big tech companies met U.S. government officials earlier this year to discuss how to stop ISIS from recruiting terrorists on social media.