AI Energy Use, Explained

All we’re doing with artificial intelligence takes power—a lot of it.

A photo taken on December 23, 2023, shows Tencent's largest big data center and cloud computing base in East China, which is situated in the Jiangning Development Zone in Nanjing, Jiangsu Province, China. (Photo by Costfoto/NurPhoto via Getty Images)

Artificial intelligence has made it easy for anyone to do things as wacky as creating mock-ups of presidents as professional wrestlers. But the upcoming AI revolution may be a shock to power grids.

Not even experts can say precisely how much energy AI computing takes. Because AI and the hardware behind it are changing so rapidly, and because AI firms have become more secretive, calculating its energy consumption is a tricky task. But what we do know strongly suggests it will take a lot.

Data centers: where the action is. 

Much of the digital world today runs on computers located in massive facilities known as data centers. The “cloud”—that mystical place where files are stored and computing takes place—suggests a beautiful bank of white cumulus, but it’s actually a generic set of giant warehouse-like buildings. Imagine your local Costco or Sam’s Club, but instead of distracting shelves of bargains, envision orderly racks of servers (computers), routers, storage devices (hard drives), and other equipment. 

Traditionally, data centers have handled the needs of everyday computing: internet searches, video streaming, your Spotify playlist, online gaming, and more.

And all this computing takes energy. According to estimates by the International Energy Agency (IEA), data centers consumed 240-340 terawatt hours (TWh) in 2022, or 1.0 percent to 1.3 percent of global electricity consumption, excluding cryptocurrency mining, which consumed an additional 0.4 percent. (A terawatt hour, or TWh, equals 1 trillion watt hours.) To put the energy consumption of data centers in perspective, New England consumes 120-125 TWh of electricity on average each year; data centers worldwide consume at least double to almost triple that amount.
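
For readers who want to check that comparison, here is a minimal sketch in Python using the article's figures; the New England baseline is taken as the midpoint of the 120-125 TWh range, which is an assumption for illustration.

```python
# Back-of-envelope check of the data centers vs. New England comparison.
data_centers_twh_2022 = (240, 340)   # IEA estimate for 2022, excluding crypto mining
new_england_twh = 122.5              # assumed midpoint of the 120-125 TWh range

low = data_centers_twh_2022[0] / new_england_twh
high = data_centers_twh_2022[1] / new_england_twh
print(f"Data centers used {low:.1f}x to {high:.1f}x New England's annual total")
# Prints roughly 2.0x to 2.8x -- "at least double to almost triple"
```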

Will AI data centers be energy hogs?

AI computing requires much more energy. Training the AI (that is, feeding it the data and asking it to perform the complex calculations that produce the AI model) is the most energy-intensive step. For you, me, and others to use the AI model to create text, images, and deepfake videos; run robots; analyze DNA; or perform other tasks (what techies call “inference”) takes far less energy, relatively speaking. But of course, the more AI is used, the more energy we’ll need to use it.

The amount of energy needed for training depends on the AI model being developed. According to research published in 2022, the energy required for training ranged from 2-3 kilowatt hours (one kWh equals 1,000 watt hours) for small natural language processing and computer vision models to 103,500 kWh for a 6 billion-parameter “transformer” (that’s an AI that takes an input of sequential data, such as strings of DNA or a large text, and produces an output of sequential data—for example, an analysis of the DNA or a brief summary of the large text).*
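
Why does training energy vary so widely? A common back-of-envelope approach multiplies the number of chips by their power draw, the training time, and a facility overhead factor. The sketch below is illustrative only; the chip count, wattage, and overhead factor are assumptions, not figures from the research cited above.

```python
# Rough estimate of training energy: chips x power per chip x hours x overhead.
# PUE (power usage effectiveness) captures cooling and other facility overhead.

def training_energy_kwh(num_gpus: int, kw_per_gpu: float, hours: float, pue: float) -> float:
    return num_gpus * kw_per_gpu * hours * pue

# Hypothetical run: 1,000 GPUs drawing 0.3 kW each for 30 days at a PUE of 1.2.
print(f"{training_energy_kwh(1_000, 0.3, 30 * 24, 1.2):,.0f} kWh")  # ~259,200 kWh
```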

For comparison, some of the same researchers estimated the now somewhat stale GPT-3, a 175 billion-parameter AI model, to have required almost 1,300 MWh (one MWh equals 1,000 kWh) of electricity, roughly equal to the annual power consumption of 130 homes in the U.S.* This is probably much less power than was used to develop GPT-4, which reputedly has 1.8 trillion parameters.
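
The household comparison works out if you assume an average U.S. home uses roughly 10,000 kWh of electricity per year, a commonly cited ballpark; the exact figure varies by source and year.

```python
# Checking the "130 homes" comparison for GPT-3's estimated training energy.
KWH_PER_MWH = 1_000

gpt3_training_mwh = 1_300        # estimate cited above
kwh_per_home_per_year = 10_000   # assumed average annual U.S. household use

homes = gpt3_training_mwh * KWH_PER_MWH / kwh_per_home_per_year
print(f"Roughly the annual electricity use of {homes:.0f} U.S. homes")  # ~130
```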

(A parameter in AI is a value that controls how the machine learning model processes data and generates predictions, according to Pedro Palandrani at investment management firm Global X. “For example, in a text generation model, parameters can control the style, tone, and content of the resulting text … in general, the more parameters a model has, the more accurate and powerful it is.”)
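
To make “parameter” concrete: parameters are the learned numbers stored inside a model, and they add up quickly. The toy sketch below is generic, not any particular model's architecture.

```python
# Counting parameters in a single fully connected layer: one weight per
# input-output connection, plus one bias per output.
inputs, outputs = 1_000, 1_000

weights = inputs * outputs   # 1,000,000
biases = outputs             # 1,000
print(f"{weights + biases:,} parameters in one small layer")  # 1,001,000
# Large language models stack many such layers; GPT-3 totals 175 billion parameters.
```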

A lot of this energy is used for cooling. About 50 percent of the energy used by traditional data centers is consumed by their computing equipment, according to estimates, and anywhere from 25 to 40 percent by HVAC. AI computing is generally done with hotter-running chips called GPUs, or graphics processing units, which demand still more cooling, and therefore more energy.
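
The industry's standard yardstick for this overhead is PUE (power usage effectiveness): total facility energy divided by the energy that actually reaches the computing equipment. A quick sketch using the rough shares above, treating the 50 percent equipment figure as the IT share:

```python
# PUE = total facility energy / energy delivered to IT equipment.
it_share = 0.50      # ~50% of a traditional data center's energy goes to equipment
hvac_share = 0.325   # assumed midpoint of the 25-40% range for HVAC

pue = 1 / it_share
print(f"Implied PUE: {pue:.1f}")                                   # 2.0
print(f"HVAC overhead vs. IT load: {hvac_share / it_share:.0%}")   # ~65%
```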

The coming growth of AI data centers. 

As AI is developed and used around the world, the data centers hosting AI are changing and proliferating, with competing effects on energy use: growing demand pushes consumption up, while greater efficiency pushes it down. How these forces net out is still a bit fuzzy, but here are some facts to ponder.

It’s not certain how many data centers there are worldwide; estimates run from roughly 9,000 to 11,000, with more being designed and built. Their total energy consumption is hard to pin down, since rising demand can be offset by more efficient design and operations. Consumption barely budged from 2018 to 2022, according to the IEA, but is expected to double by 2026 to 1,000 TWh, roughly the amount of electricity Japan consumes in a year.
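
Doubling over four years implies a steep but steady growth rate. A quick sketch, assuming a 2022 baseline of about 500 TWh, a rough figure consistent with doubling to 1,000 TWh:

```python
# Implied compound annual growth rate for data center electricity use, 2022-2026.
base_twh, target_twh, years = 500, 1_000, 4

cagr = (target_twh / base_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.0%}")  # roughly 19% per year
```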

These energy demands will likely narrow down which existing data centers will host AI computing and where data centers designed specifically for AI will be built. Currently, data center hotspots in the U.S. include Northern Virginia, Silicon Valley, Dallas/Fort Worth, and Chicago, according to commercial real estate services and investment firm CBRE. Data centers operate on every continent except Antarctica, with the top locations (in order of megawatt inventory) being Northern Virginia, London, Tokyo, Frankfurt, and Sydney. The high transmission speeds and dependable networks that AI computing requires mean AI data centers will need to be located where power grids are reliable, fiber optic networks are dense, and key points of internet infrastructure are nearby. More broadly, AI data centers will be built in provinces, states, and regions whose governments (and populations) are friendlier to the companies buying the land, getting the permits, and securing the energy draw their data centers require, and where laws are friendlier to data use, as in the U.S. There will also be competition for the economic benefits of AI-related industry, even in countries more wary of development and unfettered data use, such as France, which is dedicated to attracting AI data centers.

There’s already plenty of resistance around the world to data center construction, according to the London School of Economics. Objections include the massive amounts of water data centers require for cooling, broader questions about sustainability and community benefit, and the loss of open space.

What’s being done to lessen the energy demands of AI? 

With so much at stake economically, tech companies are doing a number of things to improve the energy efficiency of AI data centers. The industry is transitioning from data centers dedicated to traditional computing to centers that increasingly handle AI. There are over 900 AI-driven hyperscale data centers, according to Synergy Research Group, almost 40 percent of which are in the U.S.

The hyperscale data centers currently being built by Microsoft, Meta, Alphabet, and other large players take up 1 million to 2 million square feet versus 100,000 square feet for the average cloud data center. Hyperscalers house more equipment and are more energy efficient by design.

Meanwhile, Microsoft may be leaning toward using nuclear power to run its AI data centers, according to The Verge, perhaps with a string of small modular reactors. The company also has agreed to purchase power from Helion, which is attempting to build a fusion power plant.

As demand for AI increases exponentially, it’s difficult to know exactly how these advances in power efficiency will pan out. Google reports that its data centers’ energy use has held stable since 2019, at a little less than 15 percent of the company’s total energy use annually, according to Yale Environment 360, an online publication of the Yale School of the Environment. If humans can’t figure it out, perhaps AI can.

Correction, March 14, 2024: Due to editing errors, an earlier version of this article misstated unit equivalencies. One kilowatt hour (kWh) equals 1,000 watt hours. One megawatt hour (MWh) equals 1,000 kilowatt hours.

Joseph Polidoro is a Sarasota, Florida-based independent science writer. His work has appeared in Scientific American and Science News.
