
Concerns over election-influencing deepfakes soaring

Wed, 1st May 2024

Recent data reveals that Australians' fears over 'deepfakes' are on the rise, particularly regarding their potential influence on elections. Global cybersecurity organisation McAfee reports that worries over the impact of AI-generated deepfakes have surged by 66% in the past year, with 43% of respondents expressing specific concern about their potential impact on elections.

Given the major global and Australian state elections taking place this year, there are fears that these deepfake technologies will be used to proliferate misinformation. Not only can this influence voters, but it can also impact the media and public figures.

Deepfaked content can distort historical facts, undermine the credibility of the media, and target public figures. In light of these risks, McAfee has issued advice on how Australians can protect themselves from deepfake content and maintain information integrity.

A new study from McAfee reveals striking statistics. Almost 1 in 5 Australians stated that they recently encountered a political deepfake that they later discovered to be untrue. The number of people exposed to such misinformation is likely far greater, owing to the sophistication of AI technologies, which often renders real and fake content indistinguishable.

Misinformation and disinformation have become key concerns among Australians. Disturbingly, 36% of survey respondents specified influencing elections as one of the most worrying potential uses of deepfakes, with 38% troubled by the undermining of public trust in the media. A significant 42% expressed worry about public figures being impersonated (such as politicians or famous media personalities), and 33% about the distortion of historical facts.

According to Tyler McGee, Head of APAC at McAfee, the tools to create cloned audio and deepfake videos are readily available and can be mastered within hours. The resulting content can appear convincingly genuine in a matter of seconds, raising ever more pressing questions about content authenticity during influential election periods.

However, people can protect themselves from misinformation, disinformation and deepfake scams. Maintaining a healthy sense of scepticism is key: question the content's source, its credibility, and the plausibility of the information. Protective measures are also emerging, such as robust detection tools, online protection technologies and McAfee's deepfake audio detection technology, which will be made available soon.

Increasingly, people are finding it difficult to distinguish truth from falsehoods. Up to 68% of people are more concerned about deepfakes now than they were a year ago, and over half feel that AI has made detecting online scams a greater challenge. A staggering 79% of Australian social media users find it hard to detect AI-generated content such as fake news and scams. Worryingly, just a quarter of people are confident about differentiating between a genuine call from a friend or loved one and an AI-generated one.

As the global political landscape heats up with numerous elections around the corner, concerns about deepfake technology are only likely to increase. Many fear being fooled by AI-generated voice clones purporting to be loved ones or celebrities, including politicians, potentially impacting political discourse and election results.

McAfee's survey reveals that, over the last year, 40% of Australians have seen deepfake content, 21% have encountered a deepfake scam, and 7% have fallen prey to a deepfake scam. A few simple steps can help people stay safe and promote information integrity: verifying sources before sharing information, checking for distorted images, being cautious of robotic voices, scrutinising emotionally charged content, and investing in tools to identify online scams.
