As technology advances, scams are becoming increasingly sophisticated. One of the latest threats is AI voice cloning, which a malicious party can use to make it seem like they have a loved one held hostage. Mix that with caller ID spoofing and you get a very scary, convincing scam that can impact users on iPhone, Android, and really any phone. Read on for more details and how to protect against AI voice clone and caller ID spoofing scams.
Caller ID spoofing
Over the years, caller ID spoofing (or number spoofing) has become a growing problem. This is when an attacker places a call and fakes the incoming phone number, making it appear that someone you know or trust is calling.
For example, it can look like your sister is calling, complete with her contact photo appearing on your phone. Scammers using this technique may claim they have your loved one hostage, demand that you don’t call the police, and insist you send money immediately. These attacks may include background sounds like muffled cries to make the call feel very real and threatening.
Another tactic is claiming your loved one was in an accident and you need to send money right away.
How to protect against caller ID spoofing scams
- If you’ve already answered, a fast way to tell whether a caller has spoofed a loved one’s number is to hang up and call that person’s number from your phone – it’s very unlikely the attackers can intercept your outgoing call to the real number
- Ask a challenge question – something only your loved one would be able to answer (don’t ask a question to which the answer could be found on social media, online, etc.)
- The FCC recommends never answering calls from unknown numbers – let them go to voicemail – but of course, the power of spoofed calls is that they appear to come from someone you know
- Don’t share your phone number on social media, online, etc. if possible, and encourage loved ones not to share their numbers publicly
Caller ID spoofing is a common enough problem that the FCC has a support document with more tips, including never giving out personal information if you do answer a call like this.
AI voice clone scams
Even though caller ID spoofing can be very convincing and scary, the next scam is more terrifying. Attackers are using AI voice clone attacks in the wild, and they are incredibly realistic.
Just this month, an Arizona mother received a call from an unknown number and heard what sounded like her daughter crying for help. The scammer then got on the line and threatened to hurt her daughter if the mother didn’t hand over ransom money.
Fortunately, she had friends around who were able to confirm her daughter was safe within four minutes, which made her realize it wasn’t actually her daughter on the phone. But the accuracy of the AI voice clone really shook her. “It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”
How to protect against AI voice clone scams
- Ask a challenge question or even two – something only your loved one would be able to answer (e.g. the name of a childhood stuffed animal)
- Remember, don’t ask a question to which the answer could be found on social media, online, etc.
- If possible, have someone else call or text the person the scammer claims needs help
- Letting unknown numbers go to voicemail may help, but if the attackers are able to leave a voicemail with your loved one’s voice, it could sound real
- Set your social media profiles to private – many attackers look for voice samples from public social media profiles to generate the convincing AI voice clone
- It’s believed that as little as three seconds of someone’s voice is needed to create a realistic clone
- Don’t share your phone number on social media if possible
Unfortunately, AI voice cloning is becoming common enough that the FTC has shared a warning about it.
AI voice cloning plus caller ID spoofing
Sadly, it’s probably just a matter of time before advanced attackers use both of these tactics simultaneously to create an even more convincing threat. That would let them call you from what appears to be a loved one’s phone number while making it sound exactly like that person’s voice on the other end.
Another terrifying evolution of this could include AI deepfake video of a loved one. The same steps shared above will help protect you and your loved ones.
Thanks for reading our guide on how to protect against AI voice clone and caller ID spoofing scams. Stay safe out there!