Before we start, a reality check. The FBI's Internet Crime Complaint Center reports that Americans over 60 lose more money to online fraud than any other age group — and AI has made every one of the most common scams dramatically more effective in the last two years.
This is not a reason to panic. It's a reason to know what you're looking at.
Here are the five biggest AI-powered scams targeting seniors in 2026, and exactly what to do about each one.
Scam 1: The voice-cloned "grandchild in trouble" call
What it looks like: Your phone rings. It's a young voice that sounds exactly like your grandchild, crying. They've been in a car accident, or they've been arrested, or they're in a foreign country and need bail money. A "lawyer" or "officer" takes the phone and asks you to wire money, send gift cards, or read out your credit card number. Right now. And don't tell anyone — they're too embarrassed.
Why it works now: A scammer only needs 3 to 10 seconds of your grandchild's voice — lifted from a TikTok, Instagram reel, or birthday video — to clone it. The cloned voice can then say anything in real time.
How to spot it:
- They demand urgency — act now, don't hang up
- They demand secrecy — don't tell mom and dad
- They ask for unusual payment methods — wire transfers, gift cards, cryptocurrency
- They resist giving you a callback number
What to do:
Hang up. Then call your grandchild back on the number you already have saved. If you can't reach them, call their parents. Do not call any number the scammer gave you.
Set up a family safe word — a single word or phrase that everyone in the family agrees on. If a caller who sounds like your grandchild can't say it, it's not them.
Scam 2: Deepfake video calls from "family"
What it looks like: You get a video call on WhatsApp, Facebook Messenger, or FaceTime. On the screen is a face that looks exactly like your son, daughter, or close friend. They're upset. They need money transferred urgently because of an emergency. The video may be grainy, the connection "bad" — that's on purpose.
Why it works now: Video deepfakes that once required a Hollywood studio can now be run by a scammer from a cheap laptop.
How to spot it:
- Strange or stiff eye movement — real people blink and glance around naturally
- Lip movements that are slightly out of sync with the words
- Lighting that looks wrong around the jawline or hairline
- Long pauses when you ask unexpected questions ("What did we have for Thanksgiving last year?")
- They refuse to turn their head or stand up
What to do:
- Ask a question only the real person would know — and make it something recent and specific, not "mom's maiden name" (scammers can find that)
- Hang up and call them back on the number you have saved
- If it's really them, they'll understand completely
Scam 3: AI romance scams
What it looks like: Someone messages you on a dating site, Facebook, or Instagram. They're warm, attentive, an excellent writer. They have a plausible story — widowed, working overseas, a doctor, an engineer on an oil rig. Over weeks or months, they become a trusted confidant. Then something happens: a medical emergency, a customs problem, a business deal that needs just a small loan. They promise to pay you back.
Why it works now: AI can generate thousands of personal, emotionally intelligent messages with no effort. A scammer can run dozens of "relationships" at once. The photos may be AI-generated too — not stolen from a real person's profile, so reverse image search won't find them.
How to spot it:
- They move the conversation off the dating app quickly (to WhatsApp, Telegram, email)
- They always have a reason they can't video call, or the video is always brief and glitchy
- Their story involves being far away, unable to meet in person
- Eventually, there's a money request — always with a heartbreaking reason
What to do:
- Never send money to someone you have not met in person, no matter how close the connection feels
- Ask for a real-time video call with a specific request ("wave your left hand and say hello") — AI video is harder to fake live
- Tell a family member or friend what's going on — scammers work hard to isolate their targets
- Report the account to the platform and at reportfraud.ftc.gov
If someone you've never met in person asks you for money, the answer is always no — even if every fiber of your being wants to help. Real love never requires a wire transfer.
Scam 4: Fake AI customer support chatbots
What it looks like: You're searching online for the phone number or help page of Amazon, Apple, Microsoft, your bank, or a cable company. The first result is a chatbot that looks official. It asks for your account number, your password, maybe a code sent to your phone. A "support agent" then calls you to "fix" the problem.
Why it works now: Scammers buy ads that put their fake support pages at the top of search results. AI chatbots on those pages now sound exactly like real customer service.
How to spot it:
- The web address (URL) in your browser doesn't match the real company (e.g. amazon-support-help.com instead of amazon.com)
- The chatbot asks for passwords, PINs, or security codes — real companies never do this
- They want to install "remote support software" on your computer
- The whole thing feels rushed
What to do:
- Always go to the company's website directly by typing the address you know — never click a search ad
- Bookmark the real support pages of companies you use often
- Hang up on any caller who says they're from "support" and asks you to install software
- Remember: real customer service will never ask for your password
Scam 5: AI-written phishing emails
What it looks like: An email that appears to come from your bank, the IRS, Medicare, FedEx, Amazon, or your email provider. It's professional, friendly, and well-written. There's a link to click — to confirm your account, verify a delivery, update your information, or claim a refund.
Why it works now: For twenty years, the biggest clue to a phishing email was bad grammar. That clue is gone. AI-written phishing emails are fluent and personalized — sometimes using your name, your bank, and real details scraped from data breaches.
How to spot it:
- Judge emails by the action they ask you to take, not how well they're written
- Hover over links (don't click) to see the real web address — it almost always looks wrong
- Urgent language: "Your account will be closed in 24 hours"
- Requests for personal information that a real institution already has
- Attachments you weren't expecting
What to do:
- Never click links in unexpected emails from banks, delivery companies, or government agencies
- If you're worried the email might be real, go directly to the company's website in a new browser tab and log in there
- Forward suspicious emails to reportphishing@apwg.org, then delete them
- Enable two-factor authentication on important accounts (email, bank, Amazon) so a stolen password alone isn't enough
The universal rule that stops every one of these scams
Every scam on this list — voice clones, deepfakes, romance, support, phishing — uses the same three ingredients:
- Urgency (act now)
- Secrecy (don't tell anyone)
- Unusual payment (gift cards, wire transfer, crypto)
If a message has all three of those things, it is a scam. Full stop. There are no exceptions. No real grandchild, bank, lawyer, or romantic interest will ever need all three.
What to do this week to protect yourself
- Pick a family safe word. Agree on it with your children and grandchildren. Write it down somewhere only you can find.
- Bookmark real support pages for your bank, Amazon, Medicare, and anything else you use.
- Enable two-factor authentication on your email account. Your email is the key to every other account you have.
- Enable your carrier's free spam-blocking app. Call your phone company and ask — it takes 5 minutes.
- Tell one friend or family member what you read here. Scammers rely on you being alone when they call. Being connected is the best defense.
If it's already happened to you
Call your bank right away — many transfers can be reversed if you act within the first hour. Report the fraud at reportfraud.ftc.gov and to your local police.
And please, please do not be hard on yourself. These scams are engineered by professionals to fool intelligent, caring people who are trying to help family. They work on doctors, lawyers, and software engineers every day. Falling for one is not a reflection of your intelligence — it's a reflection of how sophisticated the attacks have become.
The fact that you're reading this is already the best protection there is.
Want a personalized guide to using AI safely in your own life? That's exactly what we make. Our AI guides are written for your situation — whether you're just curious, or actively trying to use these tools without getting burned. Get your personalized guide →