AI Voice Call Scam: Imagine that one day you get a call from a loved one, your wife, child, parent, or even a dear friend, sounding distressed. They tell you about an emergency they are in and ask for money to get out of the dangerous situation. You would help them, right? Most would, and that is the natural thing to do. But what if we told you it could actually be a scam? Yes, a scam (not a new one) is going around in which fraudsters use artificial intelligence to clone a loved one’s voice and then use it to trick people into handing over their hard-earned money.

According to several reports, this scam has recently been spotted spreading in the US, where scammers have been targeting the elderly: victims receive calls from people pretending to be their grandchildren. Notably, this is a type of AI deepfake that has become alarmingly convincing.
AI Voice Clone Scam: Here’s The Modus Operandi
So, you might be curious about how these scammers gain access to your voice. The simplest route is social media. A McAfee report from 2023 states that scammers need only about three seconds of audio to create a convincing voice clone. It also found that more than half of adults (53 percent) share their voice online at least once a week.
These scammers use voice cloning technology to imitate a person’s voice, then send fake voicemails or even call a victim while pretending to be their loved one. They usually act as if they are in distress and urgently need help.
What’s the worrying bit? Both free and paid tools are available to carry out these scams, and most require little expertise to use, which is why the scam seems to have taken off. And it’s not limited to the US; the same scam has been seen in India.
Scammers pretending to be police officers reach out to people, claiming that a loved one has been arrested in connection with a crime. They then put the victim on the line with the supposed loved one, who is often heard crying and in distress. Reportedly, scammers use AI voice cloning to make themselves sound like the person in question.
Out of fear of their loved one being arrested, people often end up sending money to these scammers in an attempt to free them.
So, What Do You Need To Do To Stay Safe?
Whenever you receive such a call, hang up and call your loved one, the person the scammers are pretending to be, on their known number to confirm whether they are actually in distress. This simple check could save you a lot of money.
Secondly, because these voice clones are generated by AI, they are often not perfect. So, when you do receive such a call, try to keep the caller talking and listen carefully for inconsistencies. AI-cloned voices can sound robotic, with oddly similar-sounding sentences. If you spot this, you will know it’s not the real deal.
Finally, if you do end up losing money this way, report the incident to the authorities as soon as possible.