Exclusive: NSPCC finds Apple implicated in more cases of predators sharing child abuse imagery in England and Wales alone than the company reported globally in a year
Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, raising concerns about how the company will handle the growing volume of such material associated with artificial intelligence.
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a single year, child predators used Apple’s iCloud, iMessage and FaceTime to store and exchange CSAM in more cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.