Regulator calls on social media firms to use robust age checks to keep harmful content away from children
Social media firms have been told to “tame aggressive algorithms” that recommend harmful content to children, as part of Ofcom’s new safety codes of practice.
The children’s safety codes, introduced under the Online Safety Act, allow Ofcom to set strict new rules governing how internet companies interact with children. The codes call on services to make their platforms safe for children by default, or to implement robust age checks that identify children and give them a safer version of the experience.