Social media bosses must protect children from ‘addictive features’ or face jail

House of Lords votes for amendment to Online Safety Bill to place legal duty on companies not to harm younger users

Social media firms will be required to protect children from “addictive” online features such as autoplay, reward loops and nudges under a new law.

The House of Lords voted by 240 votes to 168 to place a legal duty on companies such as Meta, the owner of Facebook and Instagram, and Google, which owns YouTube, to design their services in a way that did not harm children.

The rebel amendment to the Online Safety Bill, put forward by campaigner Baroness Kidron, would mean technology giants would have to adapt or turn off “addictive” features if they were putting children at risk.

She cited examples of a Pokémon design feature that ended every game in a McDonald’s car park, algorithms that deliberately pushed 13-year-old boys towards content by misogynist Andrew Tate, and geolocators that allowed a child to be tracked by a potential predator.

‘Safety by design’

“We are not making anything illegal. We are not picking any specific feature out of the box. What we are doing is asking the companies to look at their features and see whether, individually or in combination, they create an unacceptable risk or potential harm. It is safety by design,” said Baroness Kidron.

The new rules would be policed by Ofcom, the online regulator, which has the power to fine companies up to 10 per cent of their global turnover if they breach their duties under the act to protect children. 

Technology bosses will be held criminally liable for persistent failures and face up to two years in jail.

The changes are opposed by ministers, who may seek to overturn them when they return to the Commons. 

A government spokesman said ministers were disappointed by the House of Lords vote and believed it would weaken and delay the legislation.

‘Travesty for children’

However, Baroness Kidron believed the requirement could be accommodated within the Bill with a simple clause requiring technology firms to assess the risk of their systems, and not just their content that could be harmful.

“It is imperative the features, functionalities or behaviours harmful to children, including those enabled or created by the design or operation of the service, are in scope of the Bill,” she said. 

“This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

“For example, there are the many hundreds of small reward loops that make up a doom scroll or make a game addictive; commercial decisions such as Pokémon famously did for a time, which was to end every game in a McDonald’s car park.

“Or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups.

“For example, they deliberately push 13-year-old boys towards Andrew Tate – not for any content reason, but simply on the basis that 13-year-old boys are like each other and one of them has already been on that site.

“The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers.

“To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.”

Government sources said ministers would consider their next steps in the coming weeks.
