
AI voice scams are coming. Here's what you need to know

Hi Mum 2.0: With the latest phone scams, it's even harder to tell fact from fake.

Last updated: 02 February 2024
Fact-checked

Checked for accuracy by our qualified fact-checkers and verifiers. Find out more about fact-checking at CHOICE.

Need to know

  • Scammers can use AI to impersonate the voice of your loved ones over the phone in attempts to get you to transfer money
  • There have been no reports of these scams in Australia yet, but experts warn they could start to emerge this year
  • AI cloning is improving quickly, so listening to the content of a call rather than voice quality can help you identify this type of scam

Australians are being warned to watch out for a new breed of scam in 2024.

Well-organised criminals are using the latest technology to build new cons on top of established methods. 

These new scams harness the power of AI to impersonate the voices of our loved ones, and they've devastated victims overseas, getting past the defences of even seasoned scam-avoiders. 

How likely is it you'll encounter this type of scam here, and will you be able to spot it if you do?

What are AI voice scams?

Having emerged overseas in the last year, many AI voice scams appear to be an evolution of the text message-based "Hi Mum" scam which gained notoriety in Australia in 2022.

This scam saw criminals contact victims by text, pretending to be a family member (often a child) in urgent need of money after losing their phone.

But the power of AI has now enabled scammers to ensnare victims in a much more personal way.


Toby Murray, associate professor in the School of Computing and Information Systems at the University of Melbourne, says: "AI voice cloning technology allows you to mimic someone's voice pretty closely and [it's] getting good enough now that the results are becoming almost indistinguishable."

"You can create a convincing, false recording of someone's voice that could then fool a family member into thinking that it's their loved one."


Experts say the latest AI voice clones can sound like a loved one in trouble.

How scammers clone your voice

A convincing clone can be made with as little as one minute of original audio – something scammers can harvest from sources such as videos on social media.

Once they've made a clone, scammers will call the impersonated person's loved ones with a pre-recorded message spoken in their voice.

Dr Shaanan Cohney, a researcher at the University of Melbourne's Centre for AI and Digital Ethics, says these messages closely follow the "Hi Mum" progression.

"A common one is someone you know is in urgent need and needs you to make a transfer of some funds to a particular location," he explains.


"There'll be an excuse provided for why the funds are needed and also for why no further voice communication can happen. The goal is to minimise the opportunity for the person to identify that something's wrong with the voice communication."

The experts we spoke to also believe the criminals operating these schemes could target workplaces, calling staff with AI clones of senior executives' voices in an attempt to get employees to redirect invoice payments to a scammer's bank account.

In any case, if the person receiving the call is convinced it's genuine, criminals will often direct them to send money via a gift card, cryptocurrency or bank transfer.

AI voice-cloning technology

The growing prevalence of sophisticated AI tools has delivered criminals the means to copy voices quickly for relatively little cost.

"The technology has crossed a threshold where the improvements have accumulated to the point where [it's] now very usable with very little effort by ordinary people, rather than just by experts," explains Murray.


Many online services let users clone voices relatively quickly for little cost.

Cohney says that although sophisticated computer software programs are available, he suspects most groups running this scam are using easy-to-find, web-based options.

"[If you] type 'voice cloning' into Google, [you] get 'AI voice cloning', 'Clone your voice in minutes', 'Free AI voice cloning in 30 seconds'. There are hundreds of these services now."

Signing up with one of these businesses, CHOICE was able to secure the services of an AI voice cloning tool for US$1.10 (AU$1.67) for one month.

We then uploaded a few smartphone recordings of this author's voice, making sure the original files were similar in quality to the social media videos a scammer might extract a victim's voice from.

Our AI voice clone experiment

Running these recordings through the tool several times, we were able to make our clone read a message similar to one a scammer might play to a victim's loved one.

We found the first versions of the voice we created would occasionally drop into an English accent. But after uploading further samples and changing the prompts to the cloning tool (instructing the engine on the gender, age and nationality of the original voice, for example) we were able to get a clone that sounded quite similar to the author's voice.

If you listen to a sample of the voice below, you might notice subtle changes in accent in some parts, as well as a lack of some of the emotion one might expect someone in a stressful situation to carry in their voice.

[Audio sample: our AI voice clone]

Are AI voice scams happening in Australia?

The ACCC tells CHOICE that although it can confirm that scammers are using AI to create videos and to power chatbots to aid in scamming Australians, it's "unclear" if the technology is being used in AI voice scams targeting local consumers.


Banks are warning customers to look out for AI voice scams in 2024.

Businesses are already sounding the alarm, though, with the National Australia Bank (NAB) putting AI voice clones at the top of its list of scams for customers to watch out for in 2024.

The experts we spoke to for this article seconded NAB's warning, agreeing it's likely criminals will attempt to scam Australians with this method in the coming year.

"We really should expect that, because this voice cloning technology is getting so good," says Murray. 

"People have started to become much more aware of traditional [text message] scams and I think scammers know this, so we should expect … that scammers are going to adopt [AI voice] technology."

AI voice scams have proved devastating overseas. Families in the US, for example, have been confronted by elaborate schemes where AI clones of a child's voice were used in an attempt to make them believe a family member had been kidnapped.

Can you tell if you're speaking to an AI clone?

AI experts tell CHOICE that while even mass-market voice cloning tools are impressive in their ability to mimic someone's speech, glitches still occur.

"The telltale signs are going to be the imperfections in the generated voice, in the same way that most spam emails have spelling or grammar mistakes," says Murray.


This is reflected in our experience with these tools: as mentioned above, the clone we created occasionally slipped into an English accent, which undermined its effectiveness.

Cohney, however, warns that because cloning models are always improving, it could soon be difficult to spot a clone based on voice quality alone. 

He says people should pay attention to the content of a call and listen for:

  • a sense of urgency
  • an unwillingness to explain things further
  • the absence of normal social cues 
  • missing signs of ordinary communication, such as your loved one not greeting you the way they normally would. 

"Notice if things seem out of the ordinary for your communication with this particular person," he advises. "If something appears out of context, then it's a wise move to inquire further."

Inquiring further is an easy way to catch an AI scammer in the act, as most current cloning tools work best when delivering short, pre-prepared messages and can't engage in spontaneous conversation.
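
For readers who like to see the logic spelled out, here's a minimal sketch in Python of how the red flags above could be scored against a call transcript. To be clear, this is our own toy illustration: the phrase lists, scoring and threshold are invented for the example, and real scam detection is far more nuanced than simple keyword matching.

```python
# A toy illustration of the red flags described above, not a real
# scam detector. All phrase lists and the threshold are invented
# for this example.

URGENCY_CUES = ["urgent", "right now", "immediately", "straight away"]
PAYMENT_CUES = ["transfer", "gift card", "crypto", "send money", "bank details"]
AVOIDANCE_CUES = ["can't talk", "don't call back", "phone is broken", "new number"]


def red_flag_count(transcript: str) -> int:
    """Count how many red-flag categories appear in a call transcript."""
    text = transcript.lower()
    groups = (URGENCY_CUES, PAYMENT_CUES, AVOIDANCE_CUES)
    # A category counts once, no matter how many of its cues appear
    return sum(any(cue in text for cue in group) for group in groups)


if __name__ == "__main__":
    call = ("Hi Mum, my phone is broken so don't call back on my old number. "
            "I need you to send money right now, it's urgent.")
    hits = red_flag_count(call)
    print(f"Red-flag categories present: {hits} of 3")
    if hits >= 2:
        print("Several red flags: hang up and verify on a number you trust.")
```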

Finally, as with SMS-based "Hi Mum" scams, scammers will likely call from an unknown or private number. Therefore, treat unusual calls that appear to be from a loved one on an unknown number with suspicion.

Can you protect your voice from AI?

Many of us already have videos or recordings of ourselves online or on social media, so keeping our voices out of the clutches of AI could be difficult.

"Being vigilant is the best defence," says Murray. He says the previous mentioned methods for identifying an AI voice scam offer the easiest way to protect yourself.


Safeguarding your social media accounts, however, is another straightforward step you can take to reduce the chance of your vocals getting poached. 

Set your profile to private so your videos or other posts can't be seen when people you're not connected with visit your page.

Finally, if you're particularly concerned by these scams, our sister organisation in the US has some good advice. Consumer Reports recommends agreeing on a code word among loved ones to use when calling in a crisis, or asking a question only they would know the answer to.


Stock images: Getty, unless otherwise stated.