SecurityBrief Australia - Technology news for CISOs & cybersecurity decision-makers

AI, cyber threats & the rise of strategic data privacy

Mon, 2nd Feb 2026

Industry leaders in data, cybersecurity and artificial intelligence are urging organisations to treat privacy as a strategic issue rather than a compliance exercise, as Data Privacy Day and Data Privacy Week focus attention on the control and protection of personal information.

Commentary from InfoSum, Tenable and Cohesity highlights growing regulatory pressure, the rise of AI-driven cyber threats, and a shift towards privacy-enhancing and sovereign AI technologies across the Asia-Pacific region.

Consumer power

Richard Knott, SVP APAC at data collaboration company InfoSum, said data privacy in 2026 has moved beyond a focus on basic protection.

"This year's Data Privacy Day theme, "Taking control of your data," is a reminder that privacy is no longer just about protection; it's about power. Taking control means deciding who can access your data, how it's used, and what value you receive in return," said Knott, SVP APAC, InfoSum.

Knott pointed to regulatory momentum worldwide and in Australia.

"Consumers are no longer passive participants in the digital ecosystem. They're actively choosing the brands they trust and the platforms they engage with, based on how their data is handled. Globally, more than 80 percent of people are now protected by some form of privacy legislation. In Australia, long-awaited Privacy Act reform is nearing its conclusion, reinforcing the shift toward greater transparency and accountability," said Knott.

He said the media and marketing sector is responding by rethinking how audience and customer data is used.

"For the media and marketing industry, this shift presents an opportunity. Embedding privacy into data strategies isn't just the right thing to do - it's unlocking smarter, more responsible innovation. Brands that adopt privacy-by-design principles are finding new ways to collaborate and drive results without compromising control," said Knott.

Privacy technologies

Knott said new technical approaches are emerging within this context.

"A new generation of technology is making this possible. The arrival of privacy-enhancing technologies (PETs) and secure data collaboration has changed the game, enabling data to be connected without being shared, moved or commingled. When we stop treating data as something to extract and own, and instead see it as something to connect and safeguard, we create a better system for everyone. That's how trust is built, how innovation becomes sustainable, and how privacy becomes a competitive advantage," said Knott.

He linked privacy directly with business performance and growth.

"In 2026, privacy is no longer a trade-off. It's the engine of performance, insight and long-term growth," said Knott.

AI-driven threats

Security specialists are drawing attention to the way attackers are using advanced automation and machine learning. Bernard Montel, Field CTO at cybersecurity firm Tenable, said organisations face significant risks as personal data is exposed and reused in scams and extortion attempts.

"This Data Privacy Day, protecting personal data is about more than compliance; it's about defending freedom and privacy. As scams and extortion exploit exposed information, data leaks are causing real-world harm.

"With cybercriminals weaponising AI, attacks are becoming faster, smarter and harder to detect. At the same time, companies are adopting agentic AI, introducing a new risk: digital identities acting independently within sensitive systems. Effective governance now demands visibility into machine behaviour, not just human access.

"To combat these emerging challenges, businesses must invest in identity governance. Compliance should also be the baseline, with prevention and resilience built in from day one," said Montel.

AI resilience

Recent incidents in the education sector have sharpened focus on how AI interacts with core data stores. Gregory Statton, Vice President, AI Solutions at data security and management firm Cohesity, said organisations now face a different kind of threat to sensitive information.

"Recent data exposure in the Victorian education sector is a reminder that the threat landscape has shifted. Cybercriminals are no longer just attacking systems - they are targeting the foundational data that underpins our communities. This is not simply a security issue; it's a signal that we must rethink how AI is used to protect our most sensitive data.

"The starting point for data privacy today should be simple: ask not what you can do with AI, but what AI can do for you. In 2026, AI must move beyond hype and generic tools and be treated as a practical problem-solver. Organisations that focus on real business value (with data integrity and privacy built in from the ground up) will be the ones that emerge as winners in the era of AI.

"Across APAC, interest in sovereign AI is accelerating as organisations recognise the importance of keeping data within corporate and geographic borders. A sovereign-first approach improves control, compliance, and strategic autonomy, but success depends on balance. Regulations must remain elastic enough to enable innovation without creating isolated data silos or inhibiting creativity.

"Effective data protection also requires a shift away from one-size-fits-all platforms. AI now enables highly targeted, department-specific solutions where access is limited to those who truly need it. This approach reduces risk while improving speed and precision.

"Finally, technology alone is not enough. Cybercriminals exploit people as much as systems. Building real resilience means empowering staff, students, and stakeholders to actively participate in data privacy. When human judgment is combined with AI-driven precision, organisations gain a level of protection that generic security tools simply cannot provide.

"At the heart of AI lies data. For AI systems to operate effectively, they must be trained on trusted, high-quality data free from tampering. Embedding privacy-by-design principles into the workflow processes and adopting privacy-enhancing technologies such as encryption and access controls, in parallel with continuous employee education - are all important steps in laying the foundation for AI to become the strongest asset in protecting privacy - not our greatest risk," said Statton.