
WellWired Journal

A Guide to AI Voice Cloning Scams

28 February 2026 · 3 min read · By Rex Blackwell

Quick Summary: Scammers are using AI to make phone calls that sound exactly like a loved one. The best way to stay safe is to hang up and call them back on a number you trust.

Voice cloning is a new type of phone scam built on the same technology behind deepfake videos. A criminal can use a short audio clip of your relative's voice to create a computer-generated copy. They then use this 'voice clone' to call you, often pretending to be a family member in trouble who needs money urgently.

Because it sounds so real, it's easy to get caught off guard. But once you know how the trick works, it's much easier to spot.

How scammers find a voice

AI software needs only a few seconds of a person's voice to learn how to copy it. A scammer might get this from:

  • Videos posted on social media, like Facebook or Instagram.
  • A person's voicemail greeting.
  • Recording a cold call where they pretend to be from a marketing company.

The AI software analyses the voice and can then be used to say anything the scammer wants.

The modern 'grandchild in trouble' scam

You might have heard of scams where someone pretends to be a relative in an emergency. This is the same trick, but the AI voice makes it sound much more believable. Hearing what sounds like your own grandchild's voice saying they've been in an accident is designed to make you panic and act without thinking.

How to protect yourself

You don't need complicated tools to stay safe. A simple plan is the best defence.

  1. Don't act on impulse. If a call asks for money and creates a sense of panic, the first thing to do is take a breath. Scammers want you to act quickly.
  2. Hang up and call back. This is the most important step. End the call and phone the person back on a number you already have for them. If the emergency is real, you'll find out.
  3. Use a family safe word. Agree on a secret word with your family that a stranger wouldn't guess. If you get a suspicious call, ask for the word.
  4. Ask a personal question. You could also ask something only they would know, like "What's the name of your dog?" or "Where did we go on holiday last year?"

What to do if you get a scam call

If you think you've been targeted, tell your bank, especially if you've sent money. It's also helpful to report the scam to Action Fraud, the UK's national reporting centre for fraud and cybercrime. It helps them track what the criminals are doing.

The National Cyber Security Centre also publishes guidance on phone scams at ncsc.gov.uk. For broader advice on keeping safe online, see our guide to staying safe with AI and our article on AI scam phone calls.

FAQ

Is voice cloning technology illegal?

The technology itself has legal uses, like for voiceovers in adverts. Using it to impersonate someone and steal money is illegal.

Can any voice be cloned?

Yes. The AI tools are good at copying voices, including regional accents.

What if I think I've already been scammed?

Contact your bank immediately. They may be able to stop the payment. Don't feel bad about it. These scams are designed to fool people.

AI Voice Scam · Voice Cloning · Phone Scam UK · Vishing · Cyber Security


About the Author

Rex Blackwell, CTO & Technical Reviewer

Rex handles the technical side of WellWired.
