The call came from your son's number. The voice sounded exactly like him — the accent, the pace, the specific way he says "Mom." He was crying, saying he'd been in a car accident, he'd hurt someone, there was a lawyer involved, he needed bail money wired immediately and please don't tell Dad.
None of it was real. AI voice synthesis tools can clone a person's voice from as little as three seconds of audio. That three seconds is available for free — from a TikTok, a birthday video on Facebook, a clip from a public YouTube video. The technology is commercially available, cheap, and widely deployed by criminal operations targeting families across North America.
The FTC has documented a sharp rise in these impersonation calls, with losses in the tens of millions of dollars annually. This guide covers exactly what to do — whether you're in the middle of a suspicious call or have already sent money.
How AI Voice Cloning Phone Scams Work
Modern AI voice synthesis — sometimes called voice cloning or voice skin — works by analyzing a speaker's audio and building a mathematical model of their vocal characteristics: pitch, resonance, accent, speaking rhythm, and breath patterns. Given a target voice sample, the AI can generate new speech in that voice that says anything the operator types.
The tools required are no longer the province of sophisticated state actors. Commercial products marketed to the entertainment industry (dubbing, voice acting) have been repurposed by criminals. The criminal operation requires:
- 3–30 seconds of the target's voice (obtained from public social media)
- A commercial or open-source voice synthesis tool
- A real-time voice changer or pre-generated audio clips
- A spoofed caller ID showing the impersonated person's real phone number
The sophistication of the resulting voice clone varies — some are detectably artificial, others are genuinely indistinguishable from the real person in a short, high-stress call.
Common Types of AI Voice Cloning Scams
Grandparent Scams (Family Emergency Scams)
The most prevalent voice cloning scam: a caller impersonating a grandchild (or child) contacts a grandparent or parent claiming to be in legal trouble — car accident, arrest, or some other emergency requiring immediate bail or legal fees. The urgency is maintained by keeping the victim on the phone and insisting they not contact other family members "for privacy reasons." A second caller posing as a lawyer or bail bondsman usually appears to collect payment.
Virtual Kidnapping Scams
A more aggressive variant: the caller claims to have kidnapped a family member and plays a crying voice (now often AI-cloned from real family members' social media) to "prove" the abduction. Ransom demands are made, and the caller insists on staying on the phone to prevent you from verifying. Genuine kidnappings for ransom are extremely rare, and the insistence on keeping you on the line is itself a hallmark of the scam. If you receive this call: ask a specific verifiable question only your family member would know, and have a second person call the alleged victim on their real number simultaneously.
Bank Impersonation Calls
Scammers use AI voice synthesis to impersonate bank fraud departments, calling account holders to "alert" them to suspicious activity. They then guide victims through steps that actually transfer money to the scammer's accounts — framed as "securing" the victim's funds. These calls are highly polished and often include spoofed caller IDs showing the bank's real number.
No legitimate bank will ever ask you to move money to "secure" it. If you receive this call, hang up and call the number on the back of your card directly.
IRS, Social Security, and Government Impersonation
AI voice synthesis is increasingly used in government impersonation scams that claim you owe back taxes, your Social Security number has been "suspended," or there's a warrant for your arrest. The IRS does not initiate contact by phone, and no government agency requests gift card payment.
How Scammers Get Your Voice Samples
The voice samples that power these scams come almost entirely from public social media content:
- Facebook/Instagram videos and Stories — the most common source, especially for older adults whose family members post videos publicly
- TikTok and YouTube content — public videos of the "impersonated" person
- Voicemail greetings — anyone who calls your number and reaches voicemail can hear and record your greeting
- LinkedIn audio introductions — increasingly used for business-targeting scams
- Podcasts, webinars, and public recordings — for executive impersonation
This is why reducing your public data footprint matters for AI scam prevention: less publicly accessible audio means less source material for voice cloning attacks.
How to Recognize an AI Voice Call in Progress
Modern voice clones can be very convincing, but there are tells:
- Slight audio artifacts: A subtle "electronic" quality, slight echo, or digital compression artifacts that the real person's calls don't have
- Resistance to off-script questions: Ask about a specific shared memory ("What did we name the dog we had when you were eight?") — operators struggle to improvise convincing answers
- Scripted urgency: Real emergencies have pauses, confusion, and emotional inconsistency. AI-assisted calls often feel surprisingly smooth given the supposed distress
- The "don't tell anyone" instruction: Almost always present — this is how scammers prevent verification
- Caller ID doesn't guarantee identity: Phone number spoofing is trivially easy. A call from your son's number is not proof it's your son
🔑 The One Rule That Always Works
If someone calls claiming to be a family member in an emergency, hang up and call them back on their known number immediately. This single action defeats every version of the voice cloning scam, regardless of how convincing the voice sounds. The resistance you might feel ("but what if it really is them?") is exactly what scammers count on.
Set Up a Family Code Word Right Now
The most effective preventive measure against AI phone scams — recommended by the FTC and FBI — is a pre-established family code word that any caller claiming to be a family member must provide before any action is taken.
How to Set It Up
- Choose a random, memorable word or short phrase. Not your address, your pet's name, or anything an adversary could find online. Good choices, like "thunderstruck" or "Aunt Margaret's pie recipe," are specific to your family's shared history and don't appear in public records or social media.
- Share it privately. Tell all family members in person or via a secure message. Do not post it anywhere online.
- Establish the protocol clearly: "If you ever call claiming to be in an emergency and need immediate help, I will ask for the code word. If you can't provide it, I will hang up and call you back on your regular number."
- Practice it. Do a test run so it feels natural to ask for, not awkward or suspicious.
- Update it annually or if you suspect it may have been compromised.
What to Do If You Already Sent Money
If you realize after the fact that you were scammed, the timeline for recovery action is measured in minutes and hours, not days. Every payment method has a different recovery path:
Wire Recall Is Time-Critical
- Call your bank's fraud department immediately — not customer service, specifically fraud
- Request a wire recall. The effective window is approximately 30–90 minutes from transfer initiation for domestic wires; the window for international wires is slightly longer but still measured in hours
- Ask the bank to contact the receiving bank directly and request an account hold
- Provide the receiving bank name, account number, and transfer reference number if you have them
- File with FBI IC3 simultaneously — the FBI's Recovery Asset Team can contact the receiving bank
- Success rates decline sharply after 2 hours and approach zero after 24 hours
Gift Card Recall Procedure
- Call the gift card issuer immediately with the card number and PIN:
  - Amazon: 1-888-280-4331
  - Google Play: support.google.com/googleplay
  - Apple iTunes: 1-800-275-2273
  - Walmart/Vanilla: 1-800-531-1911
- Ask if the card balance has been redeemed — if not, they may be able to freeze remaining funds
- Report to the FTC — gift card scams are specifically tracked
Disputing P2P Transfers
- Contact both Zelle (zellepay.com) and your bank simultaneously
- Zelle expanded its fraud policy in 2023, under regulatory pressure, to cover more impersonation scam cases
- File a dispute explicitly — use the word "fraud" and describe the impersonation
- Cash App disputes: cashapp.com/help → Contact Support → "I need help with a Cash Out"
- Venmo: contact support and file under "Unauthorized transaction"
- First denials are not final — escalate and file with the CFPB at consumerfinance.gov if your bank refuses reasonable dispute handling
Reporting to the FTC and Your State AG
Reporting these scams is critical even if you don't expect direct recovery — your report helps law enforcement identify patterns, locate criminal networks, and issue public warnings that protect others:
- FTC: reportfraud.ftc.gov — select "Impersonation of someone you know"
- FBI IC3: ic3.gov — especially critical if wire transfer or large dollar amount
- State Attorney General: Find yours at naag.org
- Local police: File a report — you'll need the case number for insurance and some bank dispute processes
🛡️ Monitor Your Identity After a Phone Scam
If personal information was shared during the call, identity protection services can catch misuse before it compounds your losses.
Protection Steps Going Forward
- Implement the family code word system — this is the single highest-impact protection step
- Make your social media videos private — this limits the voice samples available to scammers targeting your family
- Register your number on the Do Not Call Registry (donotcall.gov) — doesn't stop scammers but reduces total call volume
- Enable spam call blocking on your phone (iOS: Settings → Phone → Silence Unknown Callers; Android carriers have similar options)
- Never provide payment via gift cards — no legitimate entity (government, bail bondsman, bank) accepts gift card payment
- Remove your personal data from data broker sites — scammers use these databases to learn your family members' names, relationships, and phone numbers to personalize their attacks. Visit TakeBackYourData.com for step-by-step opt-out guides.
- Share this guide with elderly family members who are disproportionately targeted by grandparent scams
For more prevention strategies, see PreventAIScams.com.
Related Resources
- Remove your personal data from broker databases — after being scammed, removing your data from broker sites reduces future risk
- How to prevent AI scams before they happen — prevention is the best defense
- Latest AI scam alerts and warnings — stay current on new AI fraud tactics
Frequently Asked Questions
How do AI voice cloning phone scams work?
Scammers extract voice samples from public sources — social media videos, YouTube, voicemails — then use AI voice synthesis to generate a cloned voice. They call family members, employers, or banks while impersonating the victim using this cloned voice to authorize transfers, request help, or manufacture emergencies.
How can I tell if a phone call is using AI voice cloning?
Signs include slight audio artifacts, unnatural pacing, resistance to off-script personal questions, inability to recall specific shared memories, and manufactured urgency designed to prevent independent verification. The best defense is a pre-established family code word.
What should I do if I sent money in an AI phone scam?
Act immediately. For wire transfers: call your bank and request a recall — the effective window is 30 minutes to a few hours. For Zelle: contact both Zelle and your bank simultaneously. For gift cards: call the card issuer immediately with the card number and PIN. Report to the FTC and FBI IC3.
What is a virtual kidnapping scam?
Virtual kidnapping scams involve a call claiming a family member has been kidnapped and demanding ransom. The caller keeps you on the line to prevent verification. AI voice cloning now allows scammers to include a fake "victim" voice in the call. If you receive such a call, ask a verifiable question only your family member would know, and have someone else call the family member's real number simultaneously.
How do I set up a family code word to protect against AI voice scams?
Choose a random, memorable word or phrase that doesn't appear in your public social media or records. Share it with all family members privately. Agree that any caller claiming to be a family member in an emergency must say this word before any action is taken — or you hang up and call their known number back.