Commissioner's Blog: AI scams – The new face of deception

Deepfake videos and voice impersonations are now a chilling reality. Scammers are weaponising artificial intelligence (AI) to create convincing video and audio of well-known people, work colleagues, family members or even someone you have fallen in love with online.

Consumer Protection’s ScamNet predicts AI voice and video cons will be an emerging scam to watch out for this year.

Just recently, Andrew ‘Twiggy’ Forrest hit out at social media giant Meta for allowing the publication of a deepfake video of him spruiking a fake cryptocurrency scheme.

With the re-emergence of the ‘Hi Mum’ scam, ScamNet expects it to evolve in Australia to use AI voice impersonations. It is already happening in the USA.

Technology is making it easier and cheaper for scammers to impersonate voices or create deepfake videos. All it takes is for them to harvest a social media account: less than a minute of someone’s voice is enough for AI to generate entire blocks of speech, and a few photos are all that’s needed to create a realistic deepfake video. It’s a timely reminder of the value of locking down your social media accounts.

When it comes to romance online, Consumer Protection’s advice was to video call the person to verify they are who they say they are.

This advice has now changed. Arrange to meet in person and if they can’t, it’s a warning sign. Video calls may be manipulated by scammers with deepfake technology.

But what should you look out for?

When it comes to video, look out for unnatural movements of the eyes and hands, or the mouth not moving in time with the speech. The face may also lack detail, such as the teeth not looking as they should.

With audio, the voice may have unusual speech patterns, or the person may not greet you how they normally would. There could also be a lack of emotion, or they won’t answer any of your questions.

Be wary of unexpected phone calls, even from people you know, as caller ID numbers can be faked.

Don’t share personal identifying information such as your address, birth date or middle name.

Consumer Protection continues to urge everyone to ‘practise the pause.’ Tell friends or family you’ll call them back directly to verify their identity. Discuss creating a ‘safe word’ to say over the phone to confirm a real emergency.

Consumers targeted by AI voice impersonation and deepfake video scams should report them to WA ScamNet via the website, by calling 1300 30 40 54, or by email.

Consumer Protection
Media release
21 Feb 2024