Regulator calls on social media firms to use robust age checks to keep harmful content away from children
Social media firms have been told to “tame aggressive algorithms” that recommend harmful content to children, as part of Ofcom’s new safety codes of practice.
The children’s safety codes, introduced under the Online Safety Act, allow Ofcom to set new, tighter rules governing how internet companies interact with children. The codes call on services to make their platforms child-safe by default, or to implement robust age checks that identify children and give them safer versions of the experience.