Who: The FTC collects reports of impersonation and AI-related scams.
When to use: Use when AI or deepfake was used to impersonate someone.
What to prepare:
- How you were contacted
- What they asked for
- Any recording or link
Go to FTC ReportFraud (~5 min)
Category: Emerging & other
Karen gets a call from "her daughter": the voice is identical, crying, saying she has been in an accident and needs money for the hospital. Karen wires $8,000. The voice was an AI clone built from a few seconds of her daughter's social media audio. Scammers now use deepfake voice or video to impersonate family members, bosses, or public figures and pressure victims into sending money, and the technology has improved so much that a short social media clip can be enough to clone a voice. Always verify through a separate channel, such as calling the person back on a number you know or contacting another family member, before sending money or sharing sensitive information.
Common red flags: pressure to act immediately, requests for payment by gift card or wire, offers that seem too good to be true, or unsolicited requests for your personal or financial details.
Scammers use AI-generated voice, video, or images to impersonate someone you know or a public figure to trick you into sending money or information. Verify through a separate channel before acting.
Who: The FBI's Internet Crime Complaint Center (IC3) tracks emerging tech fraud.
When to use: Use when you lost money or shared sensitive information with the scammer.
What to prepare:
Go to IC3 (~10 min)
Build your knowledge: Recommended reading — books & free websites on financial literacy and fraud awareness