AI voice cloning technology allows fraudsters to replicate any person's voice from a few seconds of audio — making it possible to impersonate a grandchild in distress, a corporate executive authorizing a wire transfer, or a government official demanding immediate payment. The technology is consumer-grade, inexpensive, and increasingly indistinguishable from authentic speech. This page compiles current statistics on the scale, financial impact, and demographics of voice cloning fraud from government agencies, consumer advocacy organizations, and cybersecurity researchers. All figures are sourced. This reference is intended for journalists, legal professionals, elder care advocates, and fraud researchers.


For more on this, see our guides on the grandparent scam voice technique and how to spot voice cloning.

1 in 4
Adults globally have experienced or know someone who has experienced an AI voice cloning scam, according to McAfee's 2023 international consumer survey of 7,054 adults across 7 countries.
— McAfee, "The Artificial Imposter," 2023

Table of Contents

  1. Prevalence & Growth
  2. Grandparent Scams
  3. Business & Executive Voice Fraud
  4. Victim Losses
  5. Technology & Access
  6. Who Is Targeted
  7. Enforcement & Legislation
  8. Frequently Asked Questions

Prevalence & Growth

77%
Of voice clone scam victims who reported the incident to McAfee lost money
— McAfee, 2023
$11,000
Average loss among voice cloning scam victims who lost money (U.S.)
— McAfee, 2023
Voice cloning fraud complaint volume has grown by an estimated 1,000%+ since 2022, driven by the proliferation of consumer-grade AI voice cloning tools that require no technical expertise. — Sumsub Identity Fraud Report, 2023; Europol IOCTA, 2024
The AARP Fraud Watch Network reports that 1 in 3 Americans knows someone who has been targeted by a voice cloning scam — and that awareness of the technology does not substantially reduce susceptibility to real-time deception. — AARP Fraud Watch Network, "AI and Fraud," 2024
In McAfee's 2023 survey, 45% of respondents said they would not be confident they could tell the difference between an AI-cloned voice and the real voice of a family member. — McAfee, "The Artificial Imposter," 2023
The FTC reported that family emergency impersonation scams — the primary vehicle for voice cloning grandparent fraud — cost Americans $41 million in 2022, and losses have grown substantially since the adoption of voice AI tools. — FTC Consumer Sentinel Network Data Book, 2022

Grandparent Scams

Grandparent scams involve callers impersonating a grandchild (or other relative) claiming an emergency — an arrest, car accident, or hospitalization — and requesting immediate money. AI voice cloning makes calls sound authentically like the family member. — FTC, "Grandparent Scams," Consumer Information, 2023
In 2023, the FTC received over 100,000 reports of family and friend impersonation scams, with phone-based impersonation losses averaging $2,000 per incident; incidents enhanced by voice cloning show substantially higher median losses. — FTC Consumer Sentinel Network Data Book, 2023
Adults aged 65 and older are the most frequent targets of grandparent scams, accounting for approximately 47% of all family impersonation fraud reports filed with the FTC. — FTC Consumer Sentinel Network Data Book, 2023
A landmark 2023 Washington Post investigation documented families losing between $5,000 and $750,000 to AI voice cloning grandparent scams — with several cases involving victims liquidating retirement accounts and taking out home equity loans. — Washington Post, "An AI grandparent scammer stole thousands…", 2023
The Canadian Anti-Fraud Centre recorded a 73% increase in grandparent scam reports from 2022 to 2023, with AI-enhanced voice calls cited as the primary driver of increased persuasiveness and successful deception. — Canadian Anti-Fraud Centre, Annual Statistical Report, 2023
An AARP survey found that 36% of adults aged 55+ reported receiving a suspicious call from someone claiming to be a family member in an emergency — with 9% saying they believed the caller's voice was AI-generated. — AARP Fraud Watch Network Survey, 2024

Business & Executive Voice Fraud

AI voice cloning is increasingly used in Business Email Compromise (BEC) schemes to impersonate CEOs and CFOs in phone calls authorizing wire transfers. Europol documented at least 58 confirmed corporate voice fraud incidents at European financial institutions between 2022 and 2024. — Europol IOCTA, 2024
The first widely documented AI voice BEC case occurred in 2019, when fraudsters cloned an executive's voice to instruct the CEO of a UK subsidiary to wire €220,000 to a Hungarian bank account. — Wall Street Journal, 2019; confirmed by insurer Euler Hermes Group
A widely reported case in the UAE involved fraudsters using AI-cloned voices of company directors to authorize a $35 million international wire transfer — one of the largest single voice fraud losses on record. — Krebs on Security, "Voice Deepfakes Are Now Threatening Businesses…," 2024
Security firm Pindrop's 2024 Voice Intelligence & Security Report found that 1 in 90 contact center calls to financial institutions in the U.S. involved AI-synthesized or manipulated voice, a rate that has doubled annually since 2021. — Pindrop Voice Intelligence & Security Report, 2024
Corporate voice fraud incidents have an average financial loss of $800,000 per incident when they result in successful wire transfers, compared to $11,000 for consumer-targeted voice cloning scams. — Association of Certified Fraud Examiners (ACFE), Report to the Nations, 2024

Victim Losses

$11,000
Average consumer loss per AI voice cloning incident (U.S.)
— McAfee, 2023
$800K
Average corporate loss per successful voice BEC incident
— ACFE, 2024
Among consumers who lost money to voice cloning scams, 36% sent the funds via bank wire transfer, 22% via gift cards, and 19% via cryptocurrency — payment methods that are nearly impossible to reverse. — McAfee, "The Artificial Imposter," 2023
The FTC found that phone-based fraud (the delivery mechanism for voice cloning) produces the highest per-incident losses of any contact method — a median of $1,400 in 2023, driven largely by AI-enhanced impersonation calls. — FTC Consumer Sentinel Network Data Book, 2023

Technology & Access

AI voice cloning requires as little as 3 seconds of target audio to produce a passable voice replica. Many social media videos, voicemail greetings, and public interviews provide sufficient audio. — Microsoft Research, VALL-E paper, 2023; ElevenLabs documentation, 2024
Consumer-grade voice cloning services available online in 2024 charge as little as $5–$20 per month for unlimited voice cloning, with no meaningful identity verification required. — MIT Technology Review, "The Voice Cloning Arms Race," 2024
In a 2023 experiment, journalists at VICE's Motherboard used a freely available AI voice cloning tool to generate a synthetic version of a reporter's voice and deceive a bank's voice authentication system in a live demonstration. — VICE Motherboard, "I Cloned My Own Voice and Used It to Hack a Bank," 2023

Who Is Targeted

Adults aged 65 and older are significantly overrepresented in voice cloning scam victim populations, accounting for approximately 47% of family emergency impersonation reports despite representing only 17% of the U.S. population. — FTC Consumer Sentinel; U.S. Census Bureau, 2023
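Taken together, those two figures imply a roughly threefold overrepresentation of older adults among victims. A minimal sketch of that arithmetic (the percentages are the FTC and Census figures cited above; the variable names are purely illustrative):

```python
# Share of family-emergency impersonation reports filed by adults 65+ (FTC, 2023)
share_of_reports = 0.47
# Share of the U.S. population aged 65+ (U.S. Census Bureau, 2023)
share_of_population = 0.17

# Overrepresentation factor: how many times more often this age group
# appears among reported victims than its population share would predict.
overrepresentation = share_of_reports / share_of_population
print(f"{overrepresentation:.1f}x")  # roughly 2.8x
```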
Corporate targets of voice fraud skew toward finance department employees and executive assistants — individuals with authority to approve payments or access to financial systems and wire transfer credentials. — FBI Private Industry Notification, 2023; Pindrop, 2024

Enforcement & Legislation

The FTC issued a final rule in 2024 expanding its prohibition on government and business impersonation to explicitly cover AI-generated voice impersonation, enabling the agency to seek civil penalties and restitution in voice fraud cases. — FTC, "Government and Business Impersonation Rule," 16 CFR Part 461, 2024
As of 2024, 19 U.S. states have enacted laws specifically prohibiting the use of AI-generated voice cloning in fraud, extortion, or non-consensual use — with additional legislation pending in 14 more states. — National Conference of State Legislatures (NCSL), AI Legislation Tracker, 2024
The FBI's IC3 notes that voice fraud cases are among the most difficult to prosecute because perpetrators frequently operate across multiple international jurisdictions and use VoIP numbers that route through uncooperative countries. — FBI IC3 Internet Crime Report, 2023
Cite This Page:

AIScamRecovery.com. "AI Voice Cloning Scam Statistics 2026: Grandparent Scams & Business Fraud." April 2026. https://aiscamrecovery.com/stats/ai-voice-cloning-scam-statistics-2026

Frequently Asked Questions

How common are AI voice cloning scams?

McAfee's 2023 survey of 7,054 adults across 7 countries found that 1 in 4 adults have experienced or know someone who experienced an AI voice cloning scam. In the U.S., 77% of victims who reported the incident lost money, with an average loss of $11,000. Complaint volume has grown over 1,000% since 2022.

How little audio does it take to clone someone's voice?

Modern AI voice cloning can generate a convincing replica from as little as 3 seconds of audio. Consumer-grade tools widely available online can clone a voice from a short social media video or voicemail greeting — making anyone with any public audio presence a potential target. No special technical knowledge is required.

What is a grandparent scam?

A grandparent scam involves a fraudster calling an elderly victim and impersonating a grandchild or other relative claiming to be in an emergency — arrested, in a hospital, or in danger — and urgently needing money. AI voice cloning makes these calls far more convincing by actually sounding like the victim's family member. They are often followed by a second call from someone posing as a lawyer or police officer.

How do I protect myself from voice cloning scams?

Establish a family "safe word" — a code word only your family knows that must be spoken in any emergency call before you take action. Hang up and call back on a known number. Be suspicious of urgency and secrecy ("don't tell anyone"). Never send cash, wire transfers, gift cards, or cryptocurrency based on a phone call alone, regardless of how authentic the caller sounds.

How do I report a voice cloning scam?

Report to the FTC at reportfraud.ftc.gov. File an internet crime complaint at ic3.gov. For large-dollar business fraud, contact your local FBI field office directly. AARP's Fraud Watch Network Helpline (1-877-908-3360) provides free support for elder fraud victims including voice cloning scams, and can assist with next steps regardless of whether you lost money.