SecurityBrief Australia - Technology news for CISOs & cybersecurity decision-makers

AI voice cloning scams cost Australians AUD $25.8m

Wed, 4th Feb 2026

Swinburne University of Technology has warned that AI-driven voice cloning is heightening the risk posed by scam calls, with reported losses already reaching AUD $25.8 million in the first half of 2025.

Dominique Carlon, an AI expert at Swinburne, said scammers now use generative AI to produce convincing impersonations across text, images, audio and video. She said voice cloning stands out because it can replicate familiar voices and create a sense of urgency during a live call.

"Voice cloning has become one of the most confronting new tools in the scam toolkit," said Dominique Carlon, AI expert, Swinburne University of Technology.

Voice risk

Carlon said a convincing voice can prompt quick decisions before a person verifies identity. She said the method works because it uses relationships and trust as leverage.

"A call that sounds exactly like your child, boss or partner asking for urgent help can trigger an immediate emotional response. Because it unfolds in real time, people often do not get the chance to pause or verify before acting," said Carlon.

She said earlier synthetic voices often sounded robotic, while newer systems can sound natural, with fewer cues that a listener might notice. Scammers can collect the voice samples they need from social media posts, voicemail greetings, or other online clips, she added.

Carlon linked voice cloning to several types of fraud. She listed requests for money or personal information, romance scams, job scams, and ransom-style attacks that use an impersonated voice to increase pressure.

Loss figures

Scam calls remain a major source of losses, according to the figures cited by Swinburne. The AUD $25.8 million figure covers reported losses in Australia in the first half of 2025.

Carlon said people should focus on the content of the call, rather than the sound of a voice alone. She also said urgency should act as a warning signal.

"If you feel pressured to act quickly, particularly if it involves money or information, the safest thing you can do is slow down," said Carlon.

Psychological tactics

Carlon said the current wave of scams relies on psychological manipulation as well as technology. She described scams as operations that target instincts and decision-making under pressure.

"Scams exploit fundamental human responses such as urgency, fear, hope and trust. They create emotionally charged or time sensitive scenarios that push people to act before their rational brain has a chance to catch up," said Carlon.

She said AI-generated content increases the scale and personalisation of deception. She said the pace of change in scam techniques has outstripped public awareness.

Uneven impact

Carlon said the harms from scams do not affect all groups equally. She pointed to people under financial stress, those facing visa uncertainty, those dealing with language barriers, and those who mistrust institutions as groups that can face greater exposure and harm.

"These crimes do not just take money. They deepen inequality. The shame around being scammed keeps people silent, which is exactly what allows these operations to flourish," said Carlon.

Education gaps

Carlon called for more research into how AI-driven features shape scams from first contact through to payment or data theft. She also called for safety education that reflects different communities and skill levels.

She said public messaging often focuses on prevention while offering less practical guidance on what to do after a person shares information or transfers money. She also said people can struggle to apply standard advice in a high-pressure phone call.

"Stop. Check. Protect. is still the best defence, but it can be hard to apply during a phone call when emotions are high," said Carlon.

In guidance shared by Swinburne, the "Stop" step flags pressure to act quickly as a major warning sign and advises people to hang up if anything feels wrong. The "Check" step advises verification by calling back directly or looking up official contact details independently. The "Protect" step advises immediate action if personal or financial information has been shared, including contacting a bank and reporting the incident.

"And importantly, talk about it. Scams thrive in silence. We need clearer pathways for support, reporting and recovery, and we must dismantle the stigma. Anyone can fall for a sophisticated scam. Intelligence is not a protective factor. The design is simply that good," said Carlon.

Carlon said the use of AI across text, image, audio and video impersonation will continue to reshape scam tactics, with voice cloning remaining a high-risk channel because it can compress manipulation and decision-making into a single live interaction.