Tech bosses could face criminal cases over online harm, warns UK minister

6 February 2022

Addressing the wider tech industry, the culture secretary, Nadine Dorries, said: “Remove your harmful algorithms today and you will not be subjected – named individuals – to criminal liability and prosecution.”

Dorries was speaking to a parliamentary joint committee scrutinising the draft bill, which aims to impose a duty of care on tech firms – with a particular emphasis on social media platforms such as Facebook and Twitter and video platforms such as YouTube and TikTok – to protect users from harmful content.

The bill contains provisions for a deferred power to impose criminal sanctions on executives if they do not respond to information requests from the communications watchdog Ofcom accurately and in a timely manner.

Dorries said that power could be brought in within three to six months of the bill becoming law, rather than the two years initially envisaged, and indicated that the criminal sanctions would be extended to cover failing to tackle algorithms that serve up harmful content.

“[Platforms] know what they are doing wrong. They have a chance to put that absolutely right, now. Why would we give them two years … to change what they can change today,” she said.

Dorries was scathing about Facebook’s rebranding and its investment plans for the metaverse, a virtual world where people will be able to live their social and professional lives through avatars, or animated digital representations of themselves.

Referring to Meta’s recent announcement that it would hire 10,000 people in the EU to help build a metaverse, she said: “They are putting ten, twenty thousand engineers on to the metaverse. And rebranding does not work. When harm is caused we are coming after it. Put those ten or twenty thousand now on to abiding by your terms and conditions and to removing your harmful algorithms, because if you don’t this bill will be watertight. I am looking at three to six months for criminal liability.”

Meta, which also owns Instagram and WhatsApp, is under severe political and regulatory pressure on both sides of the Atlantic after revelations by the whistleblower Frances Haugen. The former Facebook employee has detailed how the company was aware that its platforms harmed users and spread misinformation.

