SecurityBrief Australia - Technology news for CISOs & cybersecurity decision-makers

AI heightens data privacy risks & reshapes digital trust

Thu, 29th Jan 2026

Technology and data specialists have warned that artificial intelligence and weak data governance are sharpening privacy risks for organisations, as businesses mark World Data Privacy Day.

Executives from Secure Code Warrior, Customer Science and Ping Identity said rapid adoption of AI tools, rising consumer scepticism and inconsistent ownership of personal data were combining to reshape the demands on corporate security and compliance teams.

World Data Privacy Day focuses on how organisations collect, use and protect personal information. The comments highlight pressures on businesses that deploy AI in software development, manage large customer datasets or operate digital identity systems.

AI in development

Pieter Danhieux, Co-Founder and CEO of Secure Code Warrior, said AI-based coding tools were entering enterprise environments faster than security controls.

"AI tooling is being integrated into enterprise development workflows quicker than security programs and policies can mitigate the risk it poses. Now, more than ever, enterprise security leaders must build their security arsenal with planned, strategic observability of AI coding tools and agents, especially when it comes to the commits they're making to sensitive codebases," said Pieter Danhieux, Co-Founder and CEO, Secure Code Warrior.

Developers in many large organisations now use generative AI assistants to suggest and generate code. Security teams are assessing how these tools handle sensitive data, how they influence code quality and how they interact with protected systems.

Danhieux said the privacy spotlight on personal data applied directly to these tools in day-to-day engineering work.

"World Data Privacy Day often highlights the importance of upholding policies around the safe handling of Personally Identifiable Information (PII), and with many developers actively leveraging AI-powered coding assistants, it is a non-negotiable that security guardrails are deployed to ensure the safe handling, not to mention security configuration, of any code that could expose private data. These tools are best reserved for only the most skilled, security-savvy developers in enterprise environments," said Danhieux.

The comments reflect concerns that AI-generated code could introduce flaws or misconfigurations that expose PII, especially in systems that hold financial, health or behavioural data.

Data ownership

Consultancy Customer Science said many organisations still struggle to map and classify the personal data they control, which affects both compliance and risk decisions.

"Data Privacy Day is a great opportunity to highlight the critical importance of understanding the privacy risks specific to your organisation. At Customer Science, we believe that getting data privacy right starts with high visibility. Fundamentally, we need to know exactly what personal information is held, where it resides, and the specific risks involved in its retention in order to make decisions around privacy with confidence," said Darius Vitlin, Senior Consultant, Customer Science.

Businesses face stricter requirements in many jurisdictions to locate personal data, respond to access and deletion requests, and justify ongoing retention. Vitlin said the issue extends beyond legal exposure.

"This is not merely about regulatory compliance risks. Data privacy is about protecting from potential harm to customers, staff, and the long-term viability of the business itself. To make proactive decisions, an organisation must understand the purpose and provenance of its data assets," said Vitlin.

Vitlin said that responsibility for personal information should sit clearly with business owners rather than only with centralised compliance teams.

"Effective privacy management needs clear business ownership of personal information. Those who work with the data know the data and the risks it carries best. These owners need to be supported and empowered in their role. They should have the understanding of the data's lifecycle and the potential for harm which enables them to make informed decisions that balance operational utility with robust protection. When ownership is clear and data managers are empowered, privacy becomes a proactive, integrated part of the organisational culture," said Vitlin.

Digital trust

Digital identity specialist Ping Identity pointed to growing consumer unease about how organisations use personal data and about the effect of AI on fraud and impersonation.

"This week offers an opportunity to pause and assess the rapidly evolving landscape of digital trust, as privacy really boils down to choice and trust around how personal data is being used. Data privacy is no longer a passing concern for consumers - it has become a defining factor in how they judge brands, with three-quarters now more worried about the safety of their personal data than they were five years ago, and a mere 14% trusting major organisations to handle identity data responsibly," said Patrick Harding, Chief Product Architect, Ping Identity.

Organisations across sectors have invested in fraud detection, authentication and consent management tools as digital transactions increase and as regulators introduce stricter rules on data usage and profiling.

Harding said AI would challenge existing approaches to identity assurance.

"Whether it's social engineering, state sponsored impersonation or account takeover risks, AI will continue to test what we know to be true. As threats advance and AI agents increasingly act on behalf of humans, only the continuously verified should be trusted as authentic," said Harding.

He said businesses would face pressure to justify their collection and use of identity data.

"For businesses, the path forward is clear: trust must be earned through transparency, verification, and restraint in how personal data is collected and used. The businesses that adopt a "verify everything" approach that puts privacy at the centre and builds confidence across every identity, every interaction, and every decision, will have the competitive edge," said Harding.