The voice was my relative's. The call wasn't.
My phone rang and the voice on the line sounded like my adult son—upset, out of breath, saying he was sorry and that something bad had happened.
A second voice cut in, demanding an immediate crypto payment and warning me not to hang up or “he gets hurt.”
They pushed me toward wiring the money while I could still hear crying on the line.
My wife texted our son’s roommate from her own phone; the reply came back that he was in class and fine.
Police later said several families in our area had reported the same synthetic audio pattern that month.
AI-cloned voice scams use short clips from social media to imitate someone you love, then add urgency and a payment method that is hard to reverse.
The technology is new; the playbook—panic, secrecy, speed—is old.
While the call lasted, checking facts felt like gambling with my child’s safety, which is why the out-of-band text from my wife mattered.
Our son video-called from the cafeteria a few minutes later; hearing his live voice over the same phone I had almost sent money from ended the ransom story cold.
Unknown numbers still make me tense. We filed a police report, agreed a family code word, and set a rule: any emergency demand for money gets checked on a second channel before we act. Relatives know the code word now, so the next urgent call gets a verification step first.
- Pause urgent voice demands; contact the person through a separate number or app you already trust.
- Report AI voice extortion to local police and fraud centres (e.g. FBI IC3 in the US).
For more help, see our Report a scam page and Spot and avoid scams guide.