Why future AI data centres could shirk strained grids to cosy up with wind: DNV report
AI boom has energy suppliers scrambling to meet demand, but locating modular facilities close to wind and solar clusters could provide answers to some of the problems
The galloping growth of power-hungry artificial intelligence is already increasing the strain on grids – particularly in North America – but new AI training efficiencies and a possible role for off-grid renewables mean the impact may be less dramatic than feared, finds a new DNV outlook.
As AI's cognitive capacity expands exponentially, DNV forecast that data-centre electricity consumption will more than double globally by 2030.
But AI also faces its own growth constraints, including hardware and supply-chain bottlenecks, cooling and grid-connection limitations, and deficiencies in data availability and quality, DNV pointed out.
Efficiency gains
AI demand for power will also become less volatile and more "linear" as a result of improved efficiencies and control mechanisms, DNV found, citing recent empirical work showing that machine-learning hardware has been gaining in energy efficiency at a rate of around 40% per year.
"We expect this trend to continue with improvements in chip design and smaller node size," DNV said. "We also expect further efficiencies in the form of better architecture, algorithms and in cooling engineering at data centres — the combined effect of which will be multiplicative."
"Our forecast assumes continued efficiency improvements and maturing assurance ecosystems," DNV continued.
"Growth will be tempered by compute, energy and integration bottlenecks that slow uptake of AI outside of those sectors to which it is well suited."
These "well-suited" sectors were described as "information-rich, workflow-standardised, and KPI-visible... a sweet spot where AI can see, read, predict, and decide with humans in the loop, producing fast, auditable return on investment (ROI)".
Examples of these were customer support, finance, legal, marketing and computer programming.
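The compounding that the report describes can be sketched with a little arithmetic. The 40%-per-year hardware figure comes from the empirical work DNV cites; the additional 10%-per-year gain from better architecture, algorithms and cooling is an illustrative assumption for this sketch, not a figure from the report:

```python
# Sketch of how efficiency gains compound multiplicatively.
# hw_gain = 0.40 reflects the ~40%/year hardware trend cited by DNV;
# other_gain = 0.10 is a purely illustrative assumption for the combined
# effect of architecture, algorithm and cooling improvements.

def energy_per_op(years, hw_gain=0.40, other_gain=0.10):
    """Relative energy per computation after `years`, normalised to 1.0 today."""
    return 1.0 / ((1 + hw_gain) * (1 + other_gain)) ** years

for y in (1, 3, 5):
    print(f"after {y} yr: {energy_per_op(y):.3f} of today's energy per op")
```

Under these assumed rates, energy per operation falls to roughly 65% of today's level after one year and to around 12% after five, which is why DNV expects demand growth to be far slower than the growth in AI workloads themselves.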
Renewables role?
Physical constraints continue to play a part in capping overall AI growth, DNV said.
"If either lag behind demand, deployment slows and costs remain high; if both expand in step, construction accelerates, and services spread more quickly.
"The sector therefore scales at the rate set by its supply chains – and the willingness of investors to finance physical infrastructure under conditions of scarcity – rather than by unconstrained demand."
Over the longer term, DNV postulated that grid congestion, carbon constraints and the falling cost of renewables will incentivise smaller, modular data centre facilities that can be deployed closer to where power is available, scaled up over time, and run flexibly to avoid peak load periods or follow variable generation.
"This shift opens the possibility that part of future data centre growth could integrate more closely with renewables rather than relying exclusively on large grid connections," DNV stated.
Stopping short of 'superintelligence'
Some analysts have forecast extraordinarily rapid take-off scenarios in which cognitive AI attains "superintelligence" in just a few years.
Others argue that progress in large language models (LLMs) has stalled and that current enthusiasm resembles a bubble.
The DNV report leaned toward the latter view, predicting that AI's expansion will stop short of turning the technology from a mere 'helper' into a 'replacement' for performing deep cognitive tasks.
"Our forecast expects that AI adoption will not be gravity-defying," it stated.
"Instead, we see a steady embedding of AI, with aggregate productivity effects that are material, but aligned more with the pattern of historical, technology-based improvements in labour productivity rather than the unlocking of unquantifiable value creation through a rapid transition to superintelligence."
Overall, the report predicted that data centres' electricity use will quintuple by 2040, when AI will account for about 3% of global electricity use and traditional data centres for 2%.
Energy demand from AI training and inference was forecast to surpass all other kinds of data centre use by 2035, and represent 11% (6,300 TWh) of final electricity demand by 2060.
"Is this huge? Yes, it is," said Sverre Alvik, energy transition director at DNV. "But it will still be outpaced by EV charging and cooling from air conditioners."
Only in North America will demand push further, the report suggested, forecasting that 16% of all US electricity will be consumed by data centres by 2035, including 12% for AI.
Safe AI?
Regulatory factors presented another limitation, with the report pointing to the safety risks of industrial AI:
"Because those risks need to be addressed through rigorous assurance, certification, and operational controls, the rollout of industrial AI is likely to proceed at a pace determined by regulatory approvals, demonstrable risk reduction, and stakeholder acceptance, rather than by vendor roadmaps."
The DNV forecasting was based on assumptions about computing capacity improvements, semiconductor throughput, model efficiency and energy-infrastructure build-out.
Modelling ranged from general-purpose data centres handling traditional workloads such as email storage, data backup, real-time communication, gaming, streaming and crypto, to those used to train AI models and run inference — where a trained model takes in new data to make decisions or predictions.
The study acknowledged the likelihood of a feedback loop — where falling energy costs per computation tend to encourage a chain effect of higher investment in AI training, resulting in more powerful AI, new and extended use cases and, ultimately, a new cycle of rising demand — but it still concluded that AI will not sustain current rates of growth.