In light of the recent spate of violent content streamed in real time to social media sites and the spread of disinformation, the UK Government has introduced new regulatory proposals to hold internet companies liable for harmful content shared on their platforms.
As part of the new online safety measures, the government is exploring the creation of a new regulatory body to ensure that all internet platforms fulfill a "duty of care" to protect their users from online harm. The regulator would also be empowered to hold social media executives personally liable for harmful content, impose heavy fines, and block access to non-compliant online services.
The joint Online Harms White Paper, proposed by the Department for Digital, Culture, Media and Sport and the Home Office, covers a wide range of issues including violent content, suicide, disinformation, cyberbullying, terrorism, and child sexual abuse. The proposed laws would require various types of online services, including social media firms, file hosting sites, discussion forums, messaging platforms, and search engines, to curb the spread of such content. Tech firms would also be required to publish annual transparency reports on their efforts to minimize harmful content online.
The policy paper was introduced two months after the UK's Digital, Culture, Media and Sport Committee called for an obligatory "Code of Ethics" and a regulatory body empowered to penalize social media and tech firms. The regulator may be funded by a levy on the tech industry, among other options the UK Government is considering. The Government's final proposals will be drawn up after the 12-week consultation period concludes.