Facebook, Twitter and Google should be held responsible for illegal content on their platforms, a top U.K. ethics panel said Wednesday.
The Committee on Standards in Public Life said that tech firms have failed to stop the proliferation of hostile, offensive and threatening messages. It urged lawmakers to make the tech companies more legally accountable for the messages and videos on their platforms.
"Facebook, Twitter and Google are not simply platforms for the content that others post; they play a role in shaping what users see," the panel said in a report. "With developments in technology, the time has come for the companies to take more responsibility for illegal material that appears on their platforms."
Social media companies have long argued that they should not be subjected to the same rules as publishers.
But the ethics committee, which advises Prime Minister Theresa May, said that rules classifying the firms as content "hosts" are out of date, and new legislation is needed to "shift the balance of liability for illegal content to the social media companies."
TechUK, which represents technology firms in the U.K., said new laws are not the right approach.
"Blanket legislative solutions may appear attractive but are unlikely to be effective," the group said in a statement. "Moreover, there is a real risk that serious unintended consequences could result from tampering with the fundamental framework that underpins the whole of the digital economy."
The ethics committee, which includes lawmakers and experts from outside the government, said that social media is being used in ways that threaten society. Specifically, they said online harassment is undermining democracy by discouraging candidates from seeking office.
"The social media companies are not providing a safe experience for their users," they wrote. "This is having a severely negative impact on a wide range of people in public life, who can be subject to persistent, vitriolic and threatening abuse online."
In response, Facebook (FB) said it had offered advice and training to politicians on how to report abusive comments. It said it would continue "to improve how we tackle this kind of abuse" and hire more workers to address the issue.
Nick Pickles, head of U.K. public policy for Twitter (TWTR), said his firm works to "proactively find abusive content" and is expanding its "safety efforts across the platform."
"We remain committed to playing our part in the electoral process and working with political parties to support candidates, as well as working with the police and parliamentary authorities to facilitate their vital work," he said.
Google (GOOGL), which owns YouTube, declined to comment on the committee's report. But YouTube CEO Susan Wojcicki committed this month to having more than 10,000 people "working to address content that might violate our policies" by 2018.
The tech industry has come under increased scrutiny in Europe in recent years.
In Germany, social media companies can already be fined as much as €50 million ($59 million) if they fail to quickly remove illegal posts.