"She heard her daughter's voice screaming 'Mom! I messed up!' A man's voice took over: 'Your daughter is with me. $50,000 or she gets hurt.'"
Jennifer DeStefano was at her other daughter's dance studio when her phone rang. She heard what she was absolutely certain was her 15-year-old daughter's voice — with her inflection, her way of crying — screaming for help. A man demanded ransom. It took four minutes, several phone calls, and a confirmation from her husband for DeStefano to realize her daughter was safe in a ski lodge, unaware any of this was happening. The voice was AI-generated from audio harvested from her daughter's social media. No technical skill was required. It took seconds. DeStefano testified before the U.S. Senate Judiciary Committee in June 2023.
Artificial intelligence has made it possible to clone someone's voice from three seconds of audio. In 2024, Americans lost $16.6 billion to internet fraud — a 33% increase from the year before.
The number behind this guide
$16.6 billion lost to internet fraud in 2024 — a 33% increase from the year before. The call that felt like your daughter's voice cost nothing to make.
Try It
"The Call" — Scam Scenario Simulator
Walk through a virtual kidnapping call or a romance investment fraud scenario — making real-time decisions. Google Research found that interactive scenario-based "inoculation" significantly improves real-world scam recognition. Includes facilitator mode for group use.
Try the simulation →
What the simulator covers
- Virtual kidnapping / voice clone call scenario
- Romance investment fraud (pig butchering) scenario
- Branching decisions with immediate outcome feedback
- Warning sign review at the end of each scenario
- Family safe word setup guide
- Facilitator mode with psychological explanations
Five categories of AI fraud. All of them are operational now.
These are not experimental threats — they are documented, scaled operations causing billions in annual losses.
Voice Clone Scams
AI clones a family member's voice from 3-10 seconds of social media audio. Scammer simulates a kidnapping, arrest, or accident. Demands gift cards, Zelle, or wire transfer.
Deepfake Investment Fraud
AI generates video of a trusted public figure (Musk, Buffett) endorsing a fake investment platform. Distributed via Facebook and TikTok ads. Early 'gains' are fake; withdrawal triggers invented fees.
Pig Butchering / Romance Scams
Weeks or months of AI-automated emotional contact from a fake persona. Leads to a fake investment platform. Withdrawal attempts generate fees. Funds disappear.
Deepfake Video Conference Fraud
Real-time AI deepfakes of company executives impersonate CFOs or CEOs in live video calls. Employees authorize fraudulent wire transfers.
AI-Personalized Phishing
AI generates emails using your name, employer, and colleagues at near-zero cost. Click-through rate: 54% — vs. 12% for generic phishing.
Three seconds of audio. That's all it takes.
Modern open-source voice cloning tools require only 3–10 seconds of audio. Generation time: approximately 75 milliseconds per response.
3 seconds — the minimum audio needed to clone a convincing voice using modern open-source tools.
Security researchers, 2024
Share of people who cannot confidently distinguish a real voice from a cloned one.
McAfee 'Artificial Impostor,' 2023
Increase in AI-enabled scams from 2023 to 2025.
Microsoft AI for Good Lab via AARP
$16.6 billion in total U.S. internet fraud losses in 2024 — a 33% increase year-over-year.
Where scammers get audio
- Social media videos (TikTok, Instagram Reels, school events posted online)
- Voicemail greetings (scammers call repeatedly to capture the greeting)
- Podcast or interview clips
- Online conference recordings
- School or church presentations posted online
The 5-phase methodology
Target selection
Data brokers legally sell name, age, address, family relationships, and income data. Scammers can build a family profile in under 10 minutes.
Audio harvest
3-5 seconds from a child's TikTok, a grandchild's Instagram Reel, or a voicemail greeting.
Clone generation
Audio is fed into a voice AI. A distress script is typed. Background crying and noise are layered in. Ready in seconds.
The call
The cloned voice plays. A human operator isolates the victim, prevents verification, and escalates panic.
Payment extraction
Gift cards, Zelle, wire transfer, or cash. Once sent, funds are nearly unrecoverable. Less than 5% of losses are ever returned.
Kai Zhuang
17 · Riverdale, Utah · December 2023
Scammers from Hong Kong spent weeks cultivating a 17-year-old Chinese foreign exchange student, then convinced him to isolate in a tent in a canyon in freezing winter temperatures. They used recorded and AI-manipulated calls to convince his family in China he had been kidnapped. His family transferred $80,000 before police found him on December 31.
Your intelligence does not protect you. Neither does your education.
The scam exploits something much older than either — and research confirms that credentials and intelligence predict almost nothing about susceptibility.
The amygdala hijack
When a call tells you your child is being held at gunpoint, your brain's fear response — mediated by the amygdala — can completely override the prefrontal cortex where rational analysis happens. This is not a character flaw. It is neurological.
"Research confirms: once emotionally overwhelmed, even highly intelligent people default to fast, emotional decision-making. People with high verbal ability are no more resistant to romance scams than anyone else."
Summarized from peer-reviewed fraud psychology research (PNAS Nexus, 2024; Frontiers in Psychiatry, 2021)
The six psychological tools
Increase in older adults losing $100,000+ to impersonation scams since 2020 ($55M → $445M).
FTC data, August 2025
Share of people who say they would send money if they received a call claiming a family member was in a car accident.
McAfee 'Artificial Impostor,' 2023
54% click-through rate for AI-personalized phishing, versus 12% for generic phishing.
Peer-reviewed research, 2024-25
Everyone. But not everyone equally.
AI scams are adapted for specific populations with specific vulnerabilities. Falling for one is not a sign of weakness.
Steve Beauchamp
82 · Retired
AI deepfake videos of Elon Musk endorsing a fake investment platform appeared in his Facebook feed. He invested $27,000 initially, watched it appear to grow, and kept investing — paying fee after fee when he tried to withdraw. He lost his entire $690,000 retirement fund.
Older Adults (60+)
Adults 60+ filed 147,127 IC3 complaints in 2024 — a 46% increase from 2023 — and lost $4.885 billion. More than 7,500 lost over $100,000 each. Average loss per complaint: $83,000.
Important framing: falling for a well-crafted AI scam is not a sign of weakness or cognitive decline. These scams are professionally designed to work on intelligent, careful people.
Naum Lantsman
75 · Los Angeles restaurateur
Contacted via Telegram in his native Russian by scammers who impersonated a financial advisor. The language choice built immediate trust. Led through a fake investment platform with fabricated gains. Lost $340,000 life savings.
Immigrants
Scammers impersonate immigration authorities triggering fears specific to status. AI now works in dozens of languages.
People who are grieving
Obituaries are harvested to identify recent widows and widowers. Scammers claim shared grief to establish connection quickly.
Job seekers
Fake AI-generated job postings, fake video interviews, then an 'equipment deposit' ($1,500–$5,000). FTC job scam losses: $513M in 2024.
The law is catching up. Enforcement is still lagging.
Federal statutes now cover AI-enabled fraud. Offshore prosecution remains extremely limited.
FCC Ruling (February 8, 2024)
AI-generated voices in robocalls are now illegal under the Telephone Consumer Protection Act (TCPA). Penalty: up to $23,000 per call, plus private right of action. Enforcement against offshore operations remains extremely limited.
Existing Wire Fraud Statutes (18 U.S.C. § 1343)
Already cover AI-enabled fraud. Used in the Canadian grandparent ring prosecution (25 indicted) and the Shan Hanes conviction — the longest federal sentence in Kansas history for white-collar crime.
25 Canadian nationals indicted in federal court for a $21M AI-voice grandparent scam ring targeting 46 states.
DOJ, March 2025
Federal sentence for Shan Hanes, Kansas bank CEO who lost $47M to pig butchering scammers and destroyed his bank.
DOJ, August 2024
Arrests in INTERPOL's 2024 Operation HAECHI V targeting global AI-enabled financial crime.
INTERPOL, 2024
Four tiers of real protection.
From a single 2-minute action to long-term habits — built from documented scam defenses.
Set up a family code word. Today.
- Choose a random word or phrase known only to your family. Not a pet's name. Not your street. Something genuinely random.
- In any call claiming a family emergency, demand the code word. An AI cannot know it. A scammer cannot guess it.
- An experienced scammer may respond 'I'm too scared to remember it' — if the code word isn't given, the call is a scam.
- Test the system periodically. Make sure everyone knows it and knows when to use it.
- "The code word idea is simple and nontrivial to subvert." — Hany Farid, UC Berkeley, Scientific American
Build the pause-and-verify reflex.
- The 10-second rule: before taking any action on a frightening call, force a 10-second pause. Scammers create urgency specifically to prevent this.
- Hang up and call back on a number you already have stored. Any legitimate emergency survives a 60-second delay.
- Never call back a number given in the same conversation. Never click links in the same message chain.
- Reach the family member through a completely different channel — text from another phone, call a friend, contact another family member.
Reduce the attack surface.
- Social media videos containing children's voices are source material for voice clones. Set accounts to private or friends-only where possible.
- Enable carrier-level spam protection (AT&T Call Protect, T-Mobile Scam Shield, Verizon Call Filter — all free).
- Data broker opt-out: services like DeleteMe or manual opt-outs from Spokeo, Whitepages, BeenVerified reduce scammer targeting data. Do this annually.
- No legitimate emergency service ever requests gift cards, Zelle, cryptocurrency, or cash delivered to a stranger.
Know the warning signs before you're in the moment.
- Gift cards for emergencies: 100% of requests to buy gift cards for bail, fees, or government fines are scams. Every time.
- 'Keep it secret': any instruction not to tell your spouse, children, or other family members is a manipulation tactic.
- Too-good returns: investment platforms offering guaranteed high returns from an online contact are pig butchering operations.
- Urgency: real emergencies allow time for verification. Scams do not.
- Video call 'proof': seeing someone who looks like a public figure or your colleague on a video call is not proof of identity — deepfake video technology is real-time.
You are not foolish. These scams are professionally designed to work on intelligent, careful people.
Stop all payments immediately
Even if the scammer calls back demanding more. Payment does not guarantee safety — it guarantees more demands.
Call your bank now
Ask to freeze the account and reverse any pending transactions. For wire transfers, hours matter. Call within 72 hours.
Gift card issuers
Call the gift card issuer immediately. There is a small window where unspent balances can sometimes be recovered.
Wire transfers
MoneyGram: 1-800-926-9400. Western Union: 1-800-448-1492. Contact them immediately.
Document everything
Screenshot conversations. Save the phone number. Write down the time, what was said, and what you paid.
Report — even if you feel ashamed
FTC: reportfraud.ftc.gov. FBI: ic3.gov. Shame is what keeps people from reporting. Reporting is how scammers are caught.
Recovery reality
Less than 5% of funds lost to sophisticated AI scams are ever recovered. Gift card payments are almost never recoverable. Cryptocurrency is nearly impossible to recover. Wire transfers have a window of hours. Credit card disputes offer the best chance. One rare success: Aleksey Madan, 69, had his full $140,000 returned after Massachusetts law enforcement action.
For Educators
Workshop facilitation guide and curriculum resources
For teachers, librarians, senior center staff, and community educators: facilitation guide for the scam simulator, discussion questions, case study handouts, and guidance on running workshops for older adults and families.
Go to the Educator Guide →
Who to call.
How we sourced this page
Every statistic on this page comes from: FBI IC3 Annual Reports, FTC Consumer Sentinel Network data, AARP Fraud Watch Network, McAfee, INTERPOL, peer-reviewed journals including PNAS Nexus and Frontiers in Psychiatry, and credible news reporting including the New York Times and Associated Press.
Want to bring this workshop to your community?
CPAI delivers family scam awareness training to libraries, senior centers, faith organizations, and schools.