TikTok’s highly active recommendation system is designed to keep users clicking on videos, even if they contain racist or homophobic content
TikTok’s algorithm works in mysterious ways, but a Guardian Australia experiment on a blank account shows how quickly a breaking news event can funnel users down a conservative Christian, anti-immigration rabbit hole.
Last week we reported on how Facebook and Instagram's algorithms are luring young men into the manosphere. This week, we explore what happens when TikTok's algorithm is unleashed on a blank account in the absence of any interactions such as liking or commenting.