
How Will AI’s Insatiable Demand for Energy Meet Today’s Limited Production?


 

AI Energy Demand and the Global Power Crunch


Artificial intelligence is growing at a pace the world has never experienced before. From smart assistants to advanced analytics, AI now sits behind almost every digital tool we rely on. But powering this new intelligence comes at a significant cost: AI energy demand is rising far faster than today’s electricity systems can expand to meet it.

 

Data centres, GPU clusters, and high-performance computing hubs now consume astonishing amounts of power. AI training runs for large language models (LLMs) require tens of thousands of processors running non-stop for weeks or months. At the same time, national energy systems are struggling to expand generation, modernise ageing grids, and maintain reliability.

 

This article explores how AI’s exploding demand for electricity will collide with the reality of today’s energy constraints — and what governments, industry leaders, and policymakers can do to close the gap.

 

AI Energy Demand Is Surging Beyond Expectations


The world’s AI infrastructure is expanding so quickly that it is outpacing almost every other form of industrial growth. Modern AI models rely on vast amounts of computing power, and that computing power requires continuous electricity.

 

Massive power loads from data centres


Next-generation AI data centres are unlike anything seen in traditional IT. Many facilities already draw more than 100 megawatts (MW) of power, and future clusters may require a full gigawatt (GW), comparable to the output of a large power station.

 

Global modelling suggests that:


  • Data centres could consume up to 20% of global electricity by 2035.

  • In the United States, data centres may account for 9% of national electricity demand by 2030.

  • GPU-accelerated AI servers have grown from less than 2 TWh of energy use in 2017 to more than 40 TWh in 2023.

 

This rise is driven mainly by large-scale AI training workloads. Training a modern LLM involves adjusting billions of parameters across thousands of GPUs or TPUs running in parallel. Each training cycle can run for weeks or months, consuming energy equivalent to powering tens of thousands of households.
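The scale of a single training run can be sketched with a back-of-envelope calculation. The figures below are illustrative assumptions, not numbers from this article: a run of roughly 25,000 GPUs at about 700 W each, a 1.5× overhead multiplier for cooling and facility load, a 60-day run, and a typical household using around 10 MWh of electricity per year.

```python
# Back-of-envelope estimate of LLM training energy.
# Every figure here is an illustrative assumption, not a measured value.

gpus = 25_000            # accelerators running in parallel
power_per_gpu_kw = 0.7   # ~700 W per GPU under load
overhead = 1.5           # PUE-style multiplier for cooling and facility
days = 60                # length of one training run

hours = days * 24
energy_mwh = gpus * power_per_gpu_kw * overhead * hours / 1000

# A typical household uses ~10 MWh/year, i.e. ~27 kWh/day.
household_kwh_per_day = 27
households = (energy_mwh * 1000) / (household_kwh_per_day * days)

print(f"Training energy: {energy_mwh:,.0f} MWh")
print(f"Powers ~{households:,.0f} households for the duration of the run")
```

Under these assumptions the run consumes on the order of 38,000 MWh, enough to power tens of thousands of households for the same two-month period, which is consistent with the comparison above.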

 


Energy for cooling: the hidden giant


Most people focus on the processors, but cooling consumes almost as much energy as computation itself. High-performance GPUs generate enormous heat, forcing data centres to deploy advanced cooling methods such as liquid cooling, immersion cooling, and recycled water cooling systems.

 

The Environmental Cost Is Rising


AI expansion affects far more than electricity supply. It also drives increased water use for liquid and evaporative cooling, higher greenhouse gas emissions from fossil fuel energy, rapid growth of e-waste, and greater demand for rare earth minerals used in chip production.

 

Why AI Training Consumes So Much Energy


Training modern AI models involves staggering levels of computation. To adjust billions or even trillions of parameters, thousands of GPUs must operate around the clock for extended periods. Interruptions or restarts can compound energy use further.

 

Only a few organisations — including Google, Amazon, Microsoft, Meta, and specialised AI labs — can afford to train frontier models at scale due to the extremely high hardware, electricity, cooling, and infrastructure costs.

 

Retraining models adds further demand. AI systems must be updated regularly to handle new data, improve safety, reduce bias, and maintain accuracy.

 

AI Energy Demand Is Reshaping National Strategies


Governments are adapting their national energy strategies to the realities of AI. Traditional power systems were not built for this scale, so countries are expanding, diversifying, and modernising their energy infrastructure.



 

Diversifying energy sources


Governments are investing in geothermal, nuclear, natural gas, hydropower, and large-scale renewables such as wind and solar paired with battery storage. This spreads risk and improves energy security for AI clusters.

 

Modernising the grid


AI data centres require 24/7 stable power. Most national grids need significant upgrades to meet this requirement, including new transmission capacity, smart grid systems, high-capacity connectors, and grid-scale battery storage.

 

Streamlining regulation


Governments are speeding up permitting for data centres, renewable energy projects, transmission lines, and even nuclear facilities to support rapid AI growth.

 

Strategic siting


To reduce strain on local grids, governments encourage the use of brownfield sites, decommissioned industrial land, and areas with strong renewable supply.

 

The Grid Is Under Pressure


AI’s rapid growth is shifting electricity demand in ways grid operators have never encountered. In several regions, data centre loads are reshaping consumption patterns and accelerating demand growth.

 

Examples from the United States include:


  • Ameren Missouri’s average project size rising from 3.2 MW in 2019 to 181.2 MW in 2024.

  • Oklahoma Gas & Electric forecasting more than 20% growth in electricity demand over five years.

  • Evergy reporting 6 gigawatts (GW) of planned AI-related projects.

 

Other high-demand sectors such as semiconductor manufacturing and EV production are adding further strain.



 

Managing AI-Driven Demand


To stabilise the grid, utilities and regulators use tools such as fair rate design, grid investment planning, demand response programmes, and data centre efficiency standards.

 

AI’s Growing Environmental Footprint


Beyond electricity use, AI consumes vast amounts of water for cooling, generates large volumes of e-waste, and accelerates mineral extraction. Without careful planning, AI could undermine environmental and climate goals.

 

The Long-Term Sustainability Challenge


Electricity grids may struggle to keep up, climate targets could be threatened, and resource shortages may emerge if AI continues expanding without mitigation.

 

Innovation to Close the Gap


Solutions include more efficient data centres, AI-optimised workload scheduling, advanced cooling systems, renewable energy microgrids, small modular nuclear reactors, and geothermal plants dedicated to high-density compute loads.
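One of these levers, AI-optimised workload scheduling, can be as simple as shifting deferrable training or batch jobs into the hours when grid carbon intensity is lowest. A minimal sketch of the idea follows; the hourly intensity values (in gCO2/kWh) are invented for illustration, and in practice they would come from a grid operator or carbon-intensity data feed.

```python
# Carbon-aware scheduling sketch: place a deferrable batch job in the
# greenest available window of the day. Intensity values are illustrative.

hourly_intensity = [450, 430, 410, 390, 380, 370, 300, 220,
                    180, 150, 140, 135, 130, 140, 160, 200,
                    280, 360, 420, 460, 470, 465, 455, 450]

def greenest_window(intensity, job_hours):
    """Return (start_hour, avg_intensity) minimising average gCO2/kWh."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(intensity) - job_hours + 1):
        avg = sum(intensity[start:start + job_hours]) / job_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

start, avg = greenest_window(hourly_intensity, job_hours=4)
print(f"Schedule the 4-hour job at hour {start} (avg {avg:.0f} gCO2/kWh)")
```

The same principle underpins real carbon-aware schedulers: the workload is unchanged, but when (or where) it runs is chosen to track clean-energy availability.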

 

Policy Options for Balancing AI and Energy


Governments can support sustainable AI growth by investing in clean energy, modernising grids, improving cost-sharing rules, and coordinating planning across public and private sectors.

 

Practical Steps for Data Centre Operators


Data centre operators can reduce their footprint by improving cooling efficiency, participating in demand response, extending hardware life, increasing server utilisation, and using AI to optimise internal operations.

 

Aligning AI’s Growth With Planetary Limits


AI is transforming society, but its energy demand is rising at unprecedented speed. With coordinated action — from clean energy expansion to better regulation and smarter design — the world can support AI’s growth without overwhelming energy systems or harming the environment.

 

Key Recommendations


- Expand clean, reliable energy sources.

- Modernise electricity grids with smart technologies.

- Streamline approvals for critical infrastructure.

- Ensure fair and transparent cost allocation.

- Improve efficiency in AI data centres.

- Use AI to optimise the energy system.

- Support greener hardware and recycling efforts.

 

Stay Connected


For more insights, subscribe at: www.Georgejamesconsulting.com




Strategy – Innovation – Advice – ©2023 George James Consulting
