Cybersecurity

Inside the Increasing Trend of Male Sextortion Scams: ‘I Wanted to Bury Myself Alive’

By Lisa

Owen was taken aback when he saw a video of himself in a compromising position. At first, he thought it was genuinely him, but on closer inspection he realised that the body below the waist wasn’t his. This was his first encounter with deepfake pornography, and it left him speechless. He felt like he wanted to disappear.

An anonymous sender had sent the video, threatening to share it with Owen’s loved ones unless he paid a hefty sum. Owen stood his ground and refused to pay, and the video was shared with around 20 to 30 of his friends and family. Convincing them that the video was a deepfake was a challenge; he had to provide screenshots of his conversation with the blackmailer as proof.

Owen reported the incident to the police the following day. Their advice was to ignore the scammers, even if they continued to harass him. This left Owen living in fear for months, worried that people might recognise him from the video. Even two years after the incident, he admits to still feeling scared, but acknowledges that there is little he can do about it.

Owen had always been cautious about his privacy, but he believes that one lapse in judgement cost him dearly. He suspects that answering a call from an unknown number allowed the scammer to access his contacts and camera gallery. “Many people suggested that the scammer might have installed spyware during the call,” he says.

The sense of helplessness that Owen felt is common among deepfake victims, who often lack legal protection. That could soon change, as the UK is considering making the creation of deepfake pornography illegal. If the law is passed, those who create and circulate deepfake porn could face hefty fines and potential jail time.

This legislation is much needed. Since their emergence in 2017, deepfakes have become alarmingly common, with such images viewed millions of times each month worldwide. In the first three quarters of 2023 alone, 143,733 new deepfake porn videos were uploaded online, surpassing the total from all previous years combined. As Owen’s experience shows, these fake images are often strikingly realistic, and victims are usually unaware of them and unable to consent to being sexualised in this way.

The issue of deepfake pornography came into the spotlight earlier this year when numerous explicit deepfake images of Taylor Swift circulated on social media. The incident highlighted how easily women and girls can become targets of this technology. A recent study reported that 96% of deepfake victims are sexualised, and almost all of them are women.