How much electricity does AI really use worldwide

As of March 2026, there is still no perfect global meter for AI alone. The strongest public benchmark measures electricity used by data centres, because AI training and inference mostly happen inside those facilities, alongside cloud storage, enterprise software, search, video delivery, and other digital workloads. So the honest answer is not a single perfectly isolated AI number. It is a well-sourced estimate built around total data-centre electricity use, with AI identified as the main force pushing that total higher.

The clearest global estimate comes from the International Energy Agency. The IEA says data centres consumed about 415 terawatt-hours of electricity in 2024, equal to roughly 1.5% of global electricity consumption. In its base case, that rises to around 945 TWh by 2030, just under 3% of global electricity demand. That 2024 figure is not “AI only,” but the report is explicit that AI is the main accelerator of the next growth wave. Electricity use in accelerated servers, driven mainly by AI adoption, is projected to grow about 30% per year and account for almost half of the net increase in global data-centre electricity demand through 2030.
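As a quick sanity check on those two IEA figures, the implied average annual growth rate from 415 TWh in 2024 to 945 TWh in 2030 can be computed directly; it lands close to the roughly 15% per year the IEA itself cites for this period.

```python
# Sanity-check the implied average annual growth rate between the
# IEA's 2024 figure (415 TWh) and its 2030 base-case projection (945 TWh).
twh_2024 = 415.0
twh_2030 = 945.0
years = 2030 - 2024  # six years of compounding

# Compound annual growth rate: (end / start)^(1/years) - 1
cagr = (twh_2030 / twh_2024) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 15% per year
```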

That distinction matters. A lot of public discussion asks how much electricity “AI” uses as if the answer were as neat as a household utility bill. It is not. AI is embedded inside shared infrastructure. A hyperscale facility may run language models, recommendation systems, storage, analytics, and ordinary cloud computing at the same time. Even company-level prompt estimates depend heavily on whether you count only active chips, or also include idle capacity, CPUs, RAM, networking, cooling, and the rest of the data-centre overhead. Google’s own methodology note makes this point clearly: narrow estimates can understate the true operating footprint of AI at production scale.

The best global number is bigger than many people expect

Four hundred fifteen terawatt-hours is already a serious amount of electricity. It is also growing fast. The IEA says data-centre electricity demand rose by about 12% per year over the previous five years, and from 2024 to 2030 it expects average annual growth of roughly 15%, more than four times faster than electricity demand growth from all other sectors combined. Even so, the IEA adds an important dose of perspective: data-centre demand growth still accounts for less than 10% of global electricity demand growth over that period. AI is a powerful new source of demand, but it is not the only story in the electricity system. Industry, electric vehicles, air conditioning, and wider electrification still matter more in absolute terms.

That is why the most responsible answer is neither complacent nor apocalyptic. AI is not a rounding error, and it is not the whole grid either. It is a fast-growing load category that is becoming strategically important because it is concentrated, capital-intensive, and difficult to postpone once demand spikes. The global percentage may still look modest, but the growth rate is unusually sharp.

Why AI pushes electricity demand up so quickly

The reason is not mysterious. Modern AI systems rely on dense clusters of accelerators such as GPUs and TPUs, and those systems draw power not only for computation but also for cooling, power conversion, storage, networking, and redundancy. The IEA says servers account for about 60% of electricity use in modern data centres on average, while cooling can range from about 7% in efficient hyperscale sites to more than 30% in less efficient enterprise facilities. In other words, the power bill is not just “the chips.” The building around the chips matters too.
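The server-share figure gives a simple way to estimate total facility electricity from server electricity alone. The sketch below uses the IEA's 60% average cited above; the 85% figure for a highly efficient site is an assumption for illustration, not an IEA number.

```python
# Rough facility-level estimate: if servers account for a given share of
# total facility electricity, each kWh of server energy implies a larger
# draw at the meter once cooling, power conversion, networking, and other
# overheads are included.
def facility_energy(server_energy_kwh: float, server_share: float) -> float:
    """Scale server electricity up to total facility electricity."""
    return server_energy_kwh / server_share

# At the IEA's 60% average, 1 kWh of server energy implies ~1.67 kWh overall.
print(facility_energy(1.0, 0.60))
# A hypothetical highly efficient site where servers are 85% of the total.
print(facility_energy(1.0, 0.85))
```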

The U.S. picture shows how quickly this can scale in practice. Lawrence Berkeley National Laboratory reported that U.S. data centres used about 176 TWh in 2023, up from 58 TWh in 2014, and could reach 325 to 580 TWh by 2028. Berkeley Lab also notes that between 2017 and 2023, data-centre power demand more than doubled, largely because of growth in AI servers. That is a national example, not a global one, but it captures the broader pattern: AI’s electricity footprint rises fastest where capital, cloud platforms, chips, and transmission capacity can all be assembled at once.

A simple example helps make the numbers tangible

Single-prompt estimates should be treated carefully, because they vary by model, hardware, latency target, utilization rate, cooling system, and accounting method. But they are still useful for building intuition.

Google said in August 2025 that the median Gemini Apps text prompt used about 0.24 watt-hours under its comprehensive methodology. Sam Altman wrote in June 2025 that the average ChatGPT query used about 0.34 watt-hours. These are not interchangeable numbers and they do not describe every prompt, every model, or every future system. Google explicitly says its figure is a point-in-time estimate and not representative of all prompts or future performance. Still, together they show the right order of magnitude for common text interactions: fractions of a watt-hour, not kilowatt-hours, per typical prompt.

Here is a practical illustration. If one billion text prompts were served at 0.24 Wh each, that would equal about 240 megawatt-hours per day, or 87.6 gigawatt-hours per year. That is only an illustration, not a forecast, but it shows why seemingly tiny per-prompt numbers become infrastructure questions at internet scale. What looks small at the level of one interaction can become material when multiplied across millions or billions of users.
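The arithmetic behind that illustration is simple enough to reproduce directly, using the one-billion-prompt assumption and Google's cited 0.24 Wh median.

```python
# Reproduce the back-of-envelope scaling in the text: one billion text
# prompts per day at the 0.24 Wh median Google cited in August 2025.
prompts_per_day = 1_000_000_000
wh_per_prompt = 0.24

wh_per_day = prompts_per_day * wh_per_prompt
mwh_per_day = wh_per_day / 1e6            # watt-hours -> megawatt-hours
gwh_per_year = mwh_per_day * 365 / 1e3    # megawatt-hours -> gigawatt-hours

print(f"{mwh_per_day:.0f} MWh/day, {gwh_per_year:.1f} GWh/year")
# 240 MWh/day, 87.6 GWh/year
```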

The pressure shows up locally before it shows up globally

This is one of the most misunderstood parts of the debate. Global percentages can make AI electricity demand look manageable. From a system-planning perspective, the harder problem is often location. The IEA notes that data centres tend to cluster in specific places, which makes grid integration more challenging than a simple global average would suggest. A new industrial load arriving in one region can require upgrades to substations, transmission lines, backup systems, and water or cooling infrastructure long before the global share looks dramatic.

That local concentration is why policy makers, utilities, and transmission planners are focused on AI even though data centres still represent a minority share of total world electricity demand. A fast-growing load in the wrong place can be far more disruptive than a larger but more evenly distributed one. The friction is geographic before it is planetary.

Efficiency will decide how steep the curve becomes

The future path is not fixed. The IEA’s scenarios show a wide spread. In its Headwinds Case, data-centre electricity demand plateaus around 700 TWh by 2035. In its Lift-Off Case, it rises beyond 1,700 TWh and reaches about 4.4% of global electricity demand. That is a huge gap, and it exists because the outcome depends on three moving targets at once: how fast AI adoption grows, how much hardware and software efficiency improves, and how quickly the energy system can build enough generation and grid capacity.

That is also why efficiency progress matters so much. Google says the energy footprint of its median Gemini text prompt fell by a factor of 33 over a 12-month period, while response quality improved. Company claims should always be read carefully, but the broader point is solid: better models, better chips, better scheduling, and better data centres can sharply reduce electricity per task even while total usage rises. AI can become more efficient and still consume more electricity overall, because demand can grow faster than efficiency gains. That rebound effect is central to the entire debate.
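The rebound effect can be made concrete with a two-line calculation. The 33x efficiency gain is Google's cited figure; the 50x task-growth multiplier is a purely hypothetical assumption chosen to show how growth can outrun efficiency.

```python
# Illustrate the rebound effect with hypothetical numbers: per-task
# energy falls sharply, yet total electricity still rises when the
# number of tasks grows faster than efficiency improves.
energy_per_task_before = 1.0   # arbitrary units
efficiency_gain = 33           # per-task energy falls by a factor of 33
task_growth = 50               # hypothetical: 50x more tasks served

energy_per_task_after = energy_per_task_before / efficiency_gain
total_before = 1 * energy_per_task_before
total_after = task_growth * energy_per_task_after

# Despite a 33x efficiency gain, total usage still rose by about half.
print(total_after / total_before)
```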

So how much electricity does AI use worldwide

The best defensible answer today is this: AI does not yet have a clean standalone global electricity figure, but the infrastructure it runs on consumed about 415 TWh in 2024, and the IEA expects that total to climb to about 945 TWh by 2030 in its base case, with AI as the main driver of the increase.

That answer is more useful than a false precision number. It tells us three things at once. First, AI already sits inside a very large electricity system footprint. Second, the growth is real and fast enough to matter for grids, utilities, and industrial policy. Third, the size of the future problem will depend less on one dramatic headline number than on a continuous race between demand growth and efficiency improvement.

The deeper lesson is that the question is not whether AI uses “a lot” of electricity. It clearly does, and it will use more. The real question is whether the world can make each unit of intelligence cheaper in energy terms faster than it multiplies the number of tasks handed to machines. That is where the next decade will be decided.

Author:
Jan Bielik
CEO & Founder of Webiano Digital & Marketing Agency


Sources

Energy and AI
IEA flagship report on AI, data centres, electricity demand, infrastructure constraints, and future scenarios.
https://www.iea.org/reports/energy-and-ai

Energy demand from AI
IEA section with the core figures used in the article, including global data-centre electricity use in 2024 and projections to 2030.
https://www.iea.org/reports/energy-and-ai/energy-demand-from-ai

Berkeley Lab Report Evaluates Increase in Electricity Demand from Data Centers
Lawrence Berkeley National Laboratory summary of U.S. data-centre electricity growth, including the role of AI servers.
https://newscenter.lbl.gov/2025/01/15/berkeley-lab-report-evaluates-increase-in-electricity-demand-from-data-centers/

How much energy does Google’s AI use? We did the math
Google Cloud explanation of its methodology for estimating AI inference energy use, including the median Gemini Apps text prompt estimate.
https://cloud.google.com/blog/products/infrastructure/measuring-the-environmental-impact-of-ai-inference/

The Gentle Singularity
Sam Altman post cited for the stated average electricity use of a ChatGPT query.
https://blog.samaltman.com/the-gentle-singularity