New powers will be given to the watchdog Ofcom to force social media firms to act over harmful content.
Until now, firms like Facebook, TikTok, YouTube, Snapchat and Twitter have largely been self-regulating.
The companies have defended their own rules about taking down unacceptable content, but critics say independent rules are needed to keep people safe.
It is unclear what penalties Ofcom will be able to impose to target violence, cyber-bullying and child abuse.

There have been widespread calls for social media firms to take more responsibility for their content, especially after the death of Molly Russell, who took her own life after viewing graphic content on Instagram.
The government will officially announce the new powers for Ofcom – which currently regulates the media, but not internet safety – as part of its plans for a new legal duty of care.
The regulator has just announced the appointment of a new chief executive, Dame Melanie Dawes, who will take up the role in March.
“There are many platforms who ideally would not have wanted regulation, but I think that’s changing,” said Digital Secretary Baroness Nicky Morgan. “I think they understand now that actually regulation is coming.”
The new rules will apply to firms hosting user-generated content, including comments, forums and video-sharing – that is likely to include Facebook, Snapchat, Twitter, YouTube and TikTok.
The intention is that government sets the direction of the policy but gives Ofcom the freedom to draw up and adapt the details. By doing this, the watchdog should have the ability to tackle new online threats as they emerge without the need for further legislation.