How much energy will AI really consume? The good, the bad and the unknown

Researchers want firms to be more transparent about the electricity demands of artificial intelligence
The aroma of hay and manure hangs over Culpeper County, Virginia, where there’s a cow for every three humans. “We’ve got big farms, most still family-owned, and a lot of forests,” says Sarah Parmelee, one of the county’s 55,000 residents. “It’s very charming small-town USA,” she adds.
But this pastoral idyll is in the middle of a twenty-first-century shift. Over the past few years, the county has approved the construction of seven large data-centre projects, which will support technology firms in their expansive plans for generative artificial intelligence (AI). Inside these giant structures, rows of computer servers will help to train the AI models behind chatbots such as ChatGPT, and deliver their answers to what might be billions of daily queries from around the world.
In Virginia, the construction will have profound effects. Each facility is likely to consume the same amount of electrical power as tens of thousands of residential homes, potentially driving up costs for residents and straining the area’s power infrastructure beyond its capacity. Parmelee and others in the community are wary of the data centres’ appetite for electricity — particularly because Virginia is already known as the data-centre capital of the world. A state-commissioned review, published in December 2024, noted that although data centres bring economic benefits, their growth could double electricity demand in Virginia within ten years (ref. 1).
“Where is power going to come from?” asks Parmelee, who is mapping the rise of data centres in the state and works for the Piedmont Environmental Council, a non-profit organization headquartered in Warrenton, Virginia. “They’re all saying, ‘We’ll buy power from the next district over.’ But that district is planning to buy power from you.”
Similar conflicts about AI and energy are brewing in many places around the globe where data centres are sprouting up at a record pace. Big tech firms are betting hard on generative AI, which requires much more energy to operate than older AI models, which extract patterns from data but don’t generate fresh text and images. That is driving companies to collectively spend hundreds of billions of dollars on new data centres and servers to expand their capacity.
From a global perspective, AI’s impact on future electricity demand is actually projected to be relatively small. But data centres are concentrated in dense clusters, where they can have profound local impacts. They are much more spatially concentrated than are other energy-intensive facilities, such as steel mills and coal mines. Companies tend to build data-centre buildings close together so that they can share power grids and cooling systems and transfer information efficiently, both among themselves and to users. Virginia, in particular, has attracted data-centre firms by providing tax breaks, leading to even more clustering.
“If you have one, you’re likely to have more,” says Parmelee. Virginia already has 340 such facilities, and Parmelee has mapped 159 proposed data centres or expansions of existing ones in Virginia, where they account for more than one-quarter of the state’s electricity use, according to a report by EPRI, a research institute in Palo Alto, California (ref. 2). In Ireland, data centres account for more than 20% of the country’s electricity consumption — with most of them situated on the edge of Dublin. And the facilities’ electricity consumption has surpassed 10% in at least five US states.
Complicating matters further is a lack of transparency from firms about their AI systems’ electricity demands. “The real problem is that we’re operating with very little detailed data and knowledge of what’s happening,” says Jonathan Koomey, an independent researcher who has studied the energy use of computing for more than 30 years and who runs an analytics firm in Burlingame, California.
“I think every researcher on this topic is going crazy because we’re not getting the stuff we need,” says Alex de Vries, a researcher at the Free University of Amsterdam and the founder of Digiconomist, a Dutch company that explores the unintended consequences of digital trends. “We’re just doing our best, trying all kinds of tricks to come up with some kind of numbers.”
Working out AI’s energy demands
Lacking detailed figures from firms, researchers have explored AI’s energy demand in two ways. In 2023, de Vries used a supply-chain (or market-based) method (ref. 3). He examined the power draw of one of the NVIDIA servers that dominates the generative AI market and extrapolated that to the energy required over a year. He then multiplied that figure by estimates of the total number of such servers that are being shipped, or that might be required for a particular task.
De Vries used this method to estimate the energy needed if Google searches used generative AI. Two energy-analyst firms had estimated that implementing ChatGPT-like AI into every Google search would require between 400,000 and 500,000 NVIDIA A100 servers, which, based on the power demand of those servers, would amount to 23–29 terawatt hours (TWh) annually. Then, estimating that Google was processing up to 9 billion searches daily (a ballpark figure from various analysts), de Vries calculated that each request through an AI server requires 7–9 watt hours (Wh) of energy. That is 23–30 times the energy of a normal search, going by figures Google reported in a 2009 blogpost (see go.nature.com/3d8sd4t). When asked to comment on de Vries’ estimate, Google did not respond.
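The supply-chain arithmetic behind these figures can be sketched in a few lines. The server counts and query volume are the third-party estimates quoted above, not measured values, and the per-server power draw of roughly 6.5 kW for an NVIDIA A100 DGX server is an assumption taken from its published maximum rating.

```python
# Supply-chain ("market-based") estimate of AI search energy,
# using the third-party figures quoted in the article.

HOURS_PER_YEAR = 365 * 24

def annual_twh(num_servers: float, server_kw: float) -> float:
    """Annual energy (TWh) for a server fleet running continuously."""
    kwh = num_servers * server_kw * HOURS_PER_YEAR
    return kwh / 1e9  # 1 TWh = 1e9 kWh

# 400,000-500,000 A100 servers at ~6.5 kW each (assumed full load).
low = annual_twh(400_000, 6.5)   # ~22.8 TWh
high = annual_twh(500_000, 6.5)  # ~28.5 TWh

# Spread the annual energy over ~9 billion searches per day.
searches_per_year = 9e9 * 365
wh_low = low * 1e12 / searches_per_year    # ~6.9 Wh per query
wh_high = high * 1e12 / searches_per_year  # ~8.7 Wh per query
print(f"{low:.1f}-{high:.1f} TWh/yr, {wh_low:.1f}-{wh_high:.1f} Wh/query")
```

The result reproduces the 23–29 TWh and 7–9 Wh figures in the text, which shows how sensitive the estimate is to the assumed server count and power draw.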
This energy calculation felt like “grasping at straws”, de Vries says, because he had to rely on third-party estimates that he could not replicate. And his numbers quickly became obsolete. The number of servers required for an AI-integrated Google search is likely to be lower now, because today’s AI models can match the accuracy of 2023 models at a fraction of the computational cost, as US energy-analyst firm SemiAnalysis (whose estimates de Vries had relied on) wrote in an e-mail to Nature.
Even so, the firm says that the best way of assessing generative AI’s energy footprint is still to monitor server shipments and their power requirements, which is broadly the method used by many analysts. However, it is difficult for analysts to isolate the energy used solely by generative AI, because data centres generally perform non-AI tasks as well.
Bottom-up estimates
The other way to examine AI’s energy demand is ‘bottom-up’: researchers measure the energy demand of one AI-related request in a specific data centre. However, independent researchers can perform the measurements using only open-source AI models that are expected to resemble proprietary ones.
The concept behind these tests is that a user submits a prompt — such as a request to generate an image, or a text-based chat — and a Python software package called CodeCarbon then allows the user’s computer to access technical specifications of the chips that execute the model in the data centre. “At the end of the run, it’ll give you an estimate of how much energy was consumed by the hardware that you were using,” says Sasha Luccioni, an AI researcher who helped to develop CodeCarbon, and who works at Hugging Face, a firm based in New York City that hosts an open-source platform for AI models and data sets.
Luccioni and others found that different tasks require varying amounts of energy. On average, according to their latest results, generating an image from a text prompt consumes about 0.5 Wh of energy, while generating text uses a little less. For comparison, a modern smartphone might need 22 Wh for a full charge. But there is wide variation: larger models require more energy (see ‘How much energy does AI use?’). De Vries says that the numbers are lower than those in his paper, but that might be because the models used by Luccioni and others are at least an order of magnitude smaller than the model underlying ChatGPT — and because AI is becoming more efficient.
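To put those bottom-up measurements in context, a quick comparison, using only the article’s own ballpark numbers, shows how the per-image figure relates to a phone charge and to de Vries’s earlier supply-chain estimate:

```python
# Rough comparisons for the bottom-up, per-task measurements.
IMAGE_GEN_WH = 0.5      # avg. energy to generate one image (measured)
PHONE_CHARGE_WH = 22.0  # energy for one full smartphone charge

# Number of image generations equivalent to one phone charge.
images_per_charge = PHONE_CHARGE_WH / IMAGE_GEN_WH
print(f"~{images_per_charge:.0f} image generations per phone charge")

# De Vries's supply-chain estimate was 7-9 Wh per AI search request,
# so the open-model measurement is more than an order of magnitude lower.
ratio = 7.0 / IMAGE_GEN_WH  # 14x at the low end of his range
```

The gap between 0.5 Wh and 7–9 Wh per request is consistent with the text’s two explanations: smaller open models and improved efficiency.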

Source: HuggingFace AI Energy Score Leaderboard
The numbers are a lower bound, according to Emma Strubell, a computer scientist and collaborator of Luccioni’s at Carnegie Mellon University in Pittsburgh, Pennsylvania. Otherwise, “companies would be coming out and correcting us”, she says. “They’re not doing that.”
What’s more, firms generally withhold the information the software needs to estimate the energy used for data-centre cooling. CodeCarbon also cannot access the energy consumption of some types of chip. This includes Google’s proprietary TPU chips, says Benoît Courty, a data scientist in France who maintains CodeCarbon.
Luccioni has also studied how much energy it takes to train generative AI models — when a model extracts statistical patterns from massive amounts of data. But if models are receiving billions of queries daily, as de Vries assumed for his Google estimates, then the energy used to answer those queries — amounting to terawatt hours of electricity — will dominate AI’s annual energy demands. Training something the size of GPT-3, the model behind the first version of ChatGPT, requires energy of the order of a gigawatt hour.
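The claim that query answering dominates training follows directly from the article’s orders of magnitude. The query volume and per-query energy below are the ballpark figures from de Vries’s Google estimate, and the training figure is an assumed value at the gigawatt-hour scale quoted above.

```python
# Why inference dominates training at scale (order-of-magnitude check).
TRAIN_GWH = 1.3          # assumed GPT-3-scale training energy (~1 GWh)
QUERIES_PER_DAY = 9e9    # ballpark daily query volume (de Vries)
WH_PER_QUERY = 8.0       # mid-range of the 7-9 Wh per-query estimate

inference_gwh_per_day = QUERIES_PER_DAY * WH_PER_QUERY / 1e9  # 72 GWh/day
days_to_match_training = TRAIN_GWH / inference_gwh_per_day    # < 1 day
annual_twh = inference_gwh_per_day * 365 / 1000               # ~26 TWh/yr
```

Under these assumptions, serving queries consumes as much energy as the entire training run in well under a day, and tens of terawatt hours over a year.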
Last month, Luccioni and other researchers launched the AI Energy Score project, a public initiative to compare the energy efficiency of AI models on different tasks, which gives each model a star rating. Developers of proprietary, closed models are also able to upload test results, although only the US software firm Salesforce has participated so far, Luccioni says.
Companies are becoming increasingly tight-lipped about the energy requirements of their latest models, Strubell says. As competition has heated up, “there has been a closing down of information shared outside of the company”, she says. But firms such as Google and Microsoft have reported that their carbon emissions are increasing, which they have ascribed to data-centre construction in support of AI. (Companies including Google, Microsoft and Amazon didn’t address criticisms about a lack of transparency when asked by Nature; instead they emphasized that they were working with local authorities to ensure that new data centres don’t affect local utility supplies.)
Some governments now require more transparency from companies. In 2023, the European Union adopted an Energy Efficiency Directive, which requires operators of data centres that are rated for at least 500 kilowatts of power to report their energy consumption each year.
Global projections
On the basis of supply-chain estimation methods, analysts say that data centres currently account for just a small proportion of the world’s electricity demand. The International Energy Agency (IEA) estimates (ref. 4) that the electricity used by such facilities in 2022 was 240–340 TWh, or 1–1.3% of world demand (if cryptocurrency mining and data-transmission infrastructure are included, this raises the proportion to 2%).
The AI boom will increase this, but with world electricity consumption expected to grow by more than 80% by 2050 owing to the electrification of many industries, the rise of electric cars and greater demand for air conditioning, data centres “account for a relatively small share of overall electricity demand growth”, the IEA reports (ref. 4) (see ‘World electricity growth’).

Source: Ref. 4
Even with approximations of AI’s current energy demands, it’s difficult to project future trends, warns Koomey. “Nobody has any idea what data centres, either AI or conventional, will use even a few years from now,” he says.
The main problem is disagreement over the number of servers and data centres that will be needed, an area in which utilities and tech companies have financial incentives to inflate numbers, he says. Many of their predictions are based on “simple-minded assumptions”, he adds. “They extrapolate recent trends out 10 or 15 years into the future.”
Late last year, Koomey co-authored a report funded by the US Department of Energy (ref. 5), which estimated that US data centres currently use 176 TWh of electricity, or 4.4% of the country’s total, and that this share might double or triple by 2028, to between 7% and 12%.
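The report’s percentages imply a figure for total US electricity demand, which makes the 2028 projection easy to sanity-check. The frozen-demand scenario below is an illustrative simplification, not a figure from the report.

```python
# Implied totals from the DOE-funded report's figures.
DC_TWH_2023 = 176.0    # current US data-centre use (TWh)
DC_SHARE_2023 = 0.044  # 4.4% of US electricity

us_total_twh = DC_TWH_2023 / DC_SHARE_2023  # ~4,000 TWh

# If data-centre use doubles or triples by 2028 while total demand
# stays at today's level (a simplifying assumption), the share becomes:
low_share = 2 * DC_TWH_2023 / us_total_twh   # ~8.8%
high_share = 3 * DC_TWH_2023 / us_total_twh  # ~13.2%
# The report's 7-12% range therefore implies that total US demand
# is also expected to grow over the same period.
```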
Nature 639, 22-24 (2025)
doi: https://doi.org/10.1038/d41586-025-00616-z
This story originally appeared in Nature. Author: Sophia Chen