
Bosses of online companies like Meta, YouTube and Instagram must protect children or face jail

Social media firms will be required to protect children from “addictive” online features such as autoplay, reward loops and nudges under a new law.

The House of Lords voted by 240 votes to 168 to place a legal duty on companies such as Meta (owner of Facebook and Instagram) and Google (which owns YouTube) to design their services in a way that did not harm children.

The rebel amendment to the online safety bill, put forward by campaigner Baroness Kidron, would mean tech giants would have to adapt or turn off “addictive” features if they were putting children at risk.

She cited examples of a Pokémon design feature that ended every game in a McDonald’s car park, algorithms that deliberately pushed 13-year-old boys towards content by misogynist Andrew Tate, and geolocators that allowed a child to be tracked by a potential predator.

“We are not making anything illegal. We are not picking any specific feature out of the box. What we are doing is asking the companies to look at their features and see whether individually or in combination, they create an unacceptable risk or potential harm. It is safety by design,” said Baroness Kidron.

The new rules would be policed by Ofcom, the online regulator, which has powers to fine companies up to 10 per cent of their global turnover if they breach their duties under the act to protect children. Tech bosses will be held criminally liable for persistent failures and face up to two years in jail.

Opposed by ministers

The changes are opposed by ministers, who may seek to overturn them when the bill returns to the Commons. A government spokesman said the government was disappointed by the House of Lords vote and believed the amendment would weaken and delay the legislation.

However, Baroness Kidron believed the requirement could be accommodated within the bill by a simple clause requiring tech firms to assess the risk of their systems, and not just of content that could be harmful.

“It is imperative the features, functionalities or behaviours harmful to children, including those enabled or created by the design or operation of the service, are in scope of the bill,” she said. “This would make it utterly clear that a regulated company has a duty to design its service in a manner that does not harm children.

“For example, there are the many hundreds of small reward loops that make up a doom scroll or make a game addictive; commercial decisions such as Pokémon famously did for a time, which was to end every game in a McDonald’s car park.

Pushing children into silos

“Or, more sinister still, the content-neutral friend recommendations that introduce a child to other children like them, while pushing children into siloed groups.

“For example, they deliberately push 13-year-old boys towards Andrew Tate – not for any content reason, but simply on the basis that 13-year-old boys are like each other and one of them has already been on that site.

“The impact of a content-neutral friend recommendation has rocked our schools as female teachers and girls struggle with the attitudes and actions of young boys, and has torn through families, who no longer recognise their sons and brothers.

“To push hundreds of thousands of children towards Andrew Tate for no reason other than to benefit commercially from the network effect is a travesty for children and it undermines parents.”

Government sources said the government would “consider” its next steps in the coming weeks.
