SecurityBrief Australia - Technology news for CISOs & cybersecurity decision-makers
Simon

The real-time data solution to the AI energy problem

Fri, 24th Apr 2026

The rapid rise of artificial intelligence (AI), and particularly generative AI (GenAI), is driving new energy demands from the data centres that power this game-changing technology. Goldman Sachs notes that the AI arms race will require more high-density data centres and more electricity to power them.

Indeed, it's thought that global power demand from data centres will increase 50% by 2027 and by as much as 165% by the end of the decade. But AI doesn't have to come with such a massive energy bill if we manage our data more efficiently.

Power-hungry data processes

With increased demand for data centre infrastructure, the sector in Australia is expected to double in value to $40 billion by 2028, according to analysis by CBRE. This AI-accelerated growth of data centres around the nation carries not only financial ramifications but environmental ones as well.

As AI workloads - and the data centre infrastructure supporting them - expand, the energy required to keep pace with demand is substantial. This is partly because most AI systems still rely on data fed through batch processing, a decades-old method where data is collected, stored and processed at scheduled intervals. This approach is costly, slow and inefficient.

It also leads to unnecessary energy usage. Batch processing is like driving in stop-and-go traffic - every restart burns more fuel than simply maintaining a steady speed. AI running on batch processing wastes energy in the same way, forcing data centres to crunch numbers in massive, inefficient bursts. But what if AI didn't have to come with such a massive energy bill? What if the way we manage data could make AI more sustainable?

Reducing energy demands with streaming

The good news is that a more sustainable approach has surfaced and is already transforming AI infrastructure. Real-time data streaming - what we at Confluent call 'data in motion' - allows AI to continuously process data at all times, reducing compute demands through the following mechanism:

  • Less Disk I/O: Streaming allows data to be processed in real-time rather than relying on traditional batch processing, which often involves heavy disk operations. By processing data directly as it arrives, the reliance on slower disk reads and writes is minimised, leading to lower resource consumption.
  • Optimised Compute Cycles: Streaming processes data in small, continuous increments instead of large, infrequent batches. This leads to more optimised compute cycles, reducing peak power draw and allowing for more efficient scaling.
  • Decoupled Architecture: Data streaming shifts away from tightly coupled, point-to-point integrations to a decoupled architecture. This allows for efficient data flows where systems can independently process real-time data without heavy dependencies on other systems, reducing the computational load.
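The contrast between the two work profiles can be sketched in a toy example. This is an illustration only, not Confluent's or Kafka's API: every name in it is hypothetical. The "batch" side re-reads the full stored dataset on each scheduled run, while the "streaming" side touches each record exactly once as it arrives and keeps only a small running state.

```python
# Toy contrast of batch re-processing vs incremental stream processing.
# All function and variable names here are illustrative, not a real API.

def batch_average(all_events):
    """Batch style: every scheduled run re-reads the full stored dataset."""
    return sum(all_events) / len(all_events)

class StreamingAverage:
    """Streaming style: each event updates a small running state once."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, event):
        self.count += 1
        self.total += event
        return self.total / self.count

events = [10.0, 20.0, 30.0, 40.0]

# Batch: each of the 4 scheduled runs rescans everything stored so far.
batch_reads = 0
for run_end in range(1, len(events) + 1):
    window = events[:run_end]
    batch_reads += len(window)          # records re-read on this run
    batch_result = batch_average(window)

# Streaming: each event is read exactly once as it arrives.
stream = StreamingAverage()
stream_reads = 0
for event in events:
    stream_reads += 1
    stream_result = stream.update(event)

print(batch_reads, stream_reads)        # 10 vs 4: batch re-reads grow with history
print(batch_result == stream_result)    # True: same answer, far less I/O
```

The same answer comes out of both paths, but the batch side's read count grows with the size of the accumulated history, which is the "stop-and-go traffic" effect described above.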

Cleaning and de-duping data before it lands in the data warehouse reduces the amount of data stored. Compared to batch processing, where computing resources often work overtime to process data unnecessarily, stream-based processing requires far fewer resources. With less data, I/O and compute energy costs shrink.
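A minimal sketch of that in-flight clean-and-dedupe step, assuming a hypothetical record schema with an "id" field (nothing here is a real Confluent interface):

```python
# Hypothetical sketch: drop duplicates and malformed records in the stream,
# before anything lands in the warehouse, so less data is stored downstream.

def dedupe_and_clean(stream):
    """Yield each valid record once, keyed by its 'id' field."""
    seen = set()
    for record in stream:
        key = record.get("id")
        if key is None or key in seen:
            continue                    # drop malformed or duplicate records
        seen.add(key)
        yield {"id": key, "value": record["value"]}

raw = [
    {"id": 1, "value": 10},
    {"id": 1, "value": 10},   # duplicate: never reaches the warehouse
    {"id": 2, "value": 20},
    {"value": 99},            # malformed: filtered in flight
]

landed = list(dedupe_and_clean(raw))
print(len(raw), "->", len(landed))     # 4 -> 2 records actually stored
```

Because the filter runs as data moves, the warehouse only ever stores, indexes, and re-processes the two clean records, which is where the smaller I/O and compute footprint comes from.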

Reshaping the future of AI

There's enormous potential for real-time data streaming to transform the energy requirements of Australia's expanding AI needs. But it's critical that we build the supporting infrastructure around AI workloads - and the data centres that power them - with long-term sustainability in mind. That means adopting real-time data streaming platforms from the outset, or transitioning to them decisively. Doing so enables systems to be designed from the ground up, or redesigned, to operate more efficiently, helping significantly reduce the overall energy impact of AI.

Australia is in a prime position to bring this kind of sustainability to the infrastructure powering our AI workloads. Real-time data streaming offers a way forward, reducing AI's environmental impact while making it more powerful, responsive, and efficient. The transition from batch processing to data in motion isn't just an upgrade - it's the only sustainable path for AI's future.