Criminals are cloning voices and making calls to trick victims into sending them money. How can they be stopped?
Jennifer DiStefano, a mother of four, got a call one day from an unknown number. Two of her children were off snowboarding, so she picked up, worried that one of them might have been injured. It was her daughter Bree, screaming, crying and pleading for help. A man came on the line and told DiStefano that he had kidnapped her daughter and that if she didn’t pay up, he would kill her.
DiStefano was terrified, but her fear and horror were the only real things about that phone call. Bree had not been kidnapped; she was safe with her brother. Instead, scammers had used AI to replicate Bree's voice so accurately that her own mother could not tell the difference – and they were using it to try to extort money from DiStefano.