Proposed new laws to regulate social media firms will take “decisive action to protect people online”, the Culture Secretary has said.
Oliver Dowden said “trust in tech is falling” and new rules under the Government’s Online Harms legislation will force social media platforms to “clean up their act”.
Speaking in the House of Commons as the Government published its full response to the Online Harms White Paper, the Culture Secretary said the new laws will force the biggest tech firms, such as Facebook and Google, to abide by a duty of care to their users, overseen by Ofcom as the new regulator for the sector.
“Platforms will no longer be able to mark their own homework,” he told MPs.
“To hold them to their responsibilities, I can also announce to the House today that major platforms will be required to publish annual transparency reports to track their progress – this could include the number of reports of harmful content received and the action taken as a result.”
The proposals include punishments for non-compliant firms such as fines of up to £18 million or 10% of their global turnover – whichever is higher – as well as giving Ofcom the ability to block access to platforms in the UK if they fail to stick to the new rules.
However, there has been criticism of the Government’s decision to hold back further punishment – such as criminal liability for senior managers at firms who fail to comply – and to introduce it only through secondary legislation.
Mr Dowden told MPs that while the Government hopes not to need to use these powers, they do “remain an option and we will use them if we need to”.
Under the proposals, Ofcom will issue codes of practice for tech giants around the systems and processes they will need to adopt in order to comply with the duty of care.
The largest platforms, including Facebook, Instagram, Twitter and TikTok, will be held to a higher standard of care than other, smaller firms.
In addition to being required to take steps to address illegal content and activity, and to provide extra protections for children who access their services, firms in this group will be asked to assess what content or activity on their platform is legal but could pose a risk of harm to adults, and to make clear in their terms and conditions what “legal but harmful” content they consider acceptable.
The Government has published interim codes around how to identify, monitor and remove terrorism content and child sexual exploitation and abuse, which it says set out the actions it expects firms to begin taking now, ahead of the legislation being introduced and Ofcom publishing full codes.
Responding to the proposals, Facebook said it welcomed the plans and looked forward to discussing them further with the Government.
“Facebook has long called for new rules to set high standards across the internet,” the social network’s head of UK public policy Rebecca Stimson said.
“We already have strict policies against harmful content on our platforms, but regulations are needed so that private companies aren’t making so many important decisions alone.
“Over the last few years we’ve invested billions in safety, tripled the size of our safety team to 35,000 and built artificial intelligence technology to proactively find and remove harmful content. While we know we have more to do, our industry-leading transparency reports show we are removing more harmful content before anyone reports it to us.
“Protecting people from harm without undermining freedom of expression or the incredible benefits the internet has brought is a complex challenge. We look forward to continuing the discussion with government, Parliament and the rest of the industry as this process continues.”
The legislation is expected before Parliament next year.