Taylor Swift is just one of countless victims of deepfake videos. Firms feeding off this abuse should pay for the harm they cause
Imagine finding that someone has taken a picture of you from the internet and superimposed it on a sexually explicit image available online. Or that a video has appeared that seems to show you having sex with someone you have never met.
Imagine worrying that your children, partner, parents or colleagues might see this and believe it is really you. And that your frantic attempts to take it off social media keep failing, and the fake “you” keeps reappearing and multiplying. Imagine realising that these images could remain online for ever, and discovering that no laws exist to prosecute the people who created them.