Regulator calls on social media firms to use robust age checks to keep harmful content away from children
Social media firms have been told to “tame aggressive algorithms” that recommend harmful content to children, as part of Ofcom’s new safety codes of practice.
The children’s safety codes, introduced under the Online Safety Act, allow Ofcom to set strict new rules governing how internet companies interact with children. The codes call on services either to make their platforms safe for children by default, or to implement robust age checks that identify child users and serve them safer versions of the experience.