Image: Tanja Cappell/Flickr

The UK will launch an independent watchdog that will write a code of conduct for social media firms, if recommendations by the Home Office and the Department for Digital, Culture, Media and Sport are carried through.

The watchdog would have the power to fine or even block the companies if they fail to adequately tackle what the government calls "online harms", such as terrorist propaganda and sexually explicit content.

The white paper suggests introducing a levy to fund the regulator and says senior management could be held liable for breaches. Critics, however, argue that the proposal threatens freedom of speech.

It comes amid rising pressure for greater regulation of social media after 14-year-old Molly Russell took her own life in 2017, having been exposed to content about suicide on Instagram. And last month, thousands of copies of the video surfaced online after the man behind the New Zealand terrorist attack livestreamed the shootings.

Digital, Culture, Media and Sport Secretary Jeremy Wright said: "The era of self-regulation for online companies is over.

"Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough."

Home Secretary Sajid Javid said: "Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online."

Rebecca Stimson, head of UK policy at Facebook, said: "New regulations are needed so that we have a standardised approach across platforms and private companies aren't making so many important decisions alone.

"New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech."

Twitter's head of UK public policy, Katy Minshall, said: "We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet."