Volatility Of AI Usage Poses Challenge For Data Centers (2025)

Artificial intelligence is making data center electricity consumption more volatile and harder to predict. Industry executives say that will create problems for the growing number of data centers that produce some or all of their own power.


Data centers are getting bigger and more power-hungry, with the industry’s rising electricity usage straining regional grids and creating infrastructure challenges for utilities. It isn't just the scale of data center electricity demand but also the irregularity of that energy consumption that is emerging as a barrier to potential solutions for the industry’s power woes.

For most of the data center sector’s existence, the amount of electricity consumed by a given facility fluctuated minimally from minute to minute or hour to hour, with changes in power usage generally adhering to predictable patterns. But that is changing as new data centers are built to host the high-performance computing systems needed for AI.

As AI accounts for a larger share of workloads in the data centers being planned, built and delivered in the months ahead, patterns of power consumption are increasingly varied and difficult to anticipate.

This volatility can create reliability challenges for utilities and grid operators, but industry executives speaking last month at Bisnow’s National DICE Power Capacity, Energy and Sustainability event said the growing number of data centers generating some or all of their own electricity on-site face the greatest risk from AI’s power volatility.

Just as utilities must perfectly balance the supply of electricity on the grid with demand in real time to avoid catastrophic system failures, so must operators of data centers with on-site power generation “follow the load” to ensure the amount of power being produced matches the power being consumed. Dramatic shifts in power use make this more challenging, requiring more sophisticated and costly equipment and expertise, as well as some unconventional solutions from the world’s largest tech companies.

“During a situation where there is no grid available and a data center is running completely as an island, the load volatility makes it challenging,” Dick Kramp, sales director at on-site generation provider AB Energy USA, said at the DICE event, held at the Sheraton Dallas Hotel.

“We have seen huge swings now with AI, and it will take a combination of different technologies to follow the volatility of the load.”

AI’s energy demand profiles differ from traditional data center workloads. Although some AI computing, such as training large language models, has energy usage that is predictable, energy usage for the AI inference computing that accounts for a growing share of capacity is far more sporadic.

Inference power usage can fluctuate based on factors that are outside of a data center operator's control, like the number of queries being processed, what users are asking and which applications they use. The result is sudden, unpredictable spikes in energy demand followed by equally unpredictable drop-offs.

For some applications, like high-frequency trading, autonomous vehicle navigation and real-time fraud detection, massive fluctuations can occur in a matter of milliseconds. Making these shifts harder to predict is the nature of graphics processing units and other AI computing hardware, which actively adjust power use based on demand.

Such power fluctuations in dense clusters of high-performance GPUs and other AI computing gear create enormous power management challenges for whoever is responsible for providing electricity to a data center, whether that is a utility, grid operator or the data center’s own power systems.

These are challenges that will only become harder to manage and more consequential as AI data center clusters become larger, according to Clayton Greer, vice president of energy at Dallas-based oil and gas firm Cholla.

“It's just a couple of hundreds of megawatts now, but whenever it becomes thousands of megawatts of AI and somebody's asking a question and you go from zero to 100% usage in fractions of a second, that's something the grid is not designed to handle,” Greer said. “Because the generation has to follow load in real time, you'll have events that can cause instability in the grid.”

While this escalating AI-driven load volatility is making life more difficult for utilities, panelists said they will be up to the challenge. Utilities have been balancing the grid for more than 100 years, CleanArc Data Centers Chief Energy Officer Bill Thomas said, and this is what these organizations are designed to do best.

But data centers fueled by energy generated on-site or purchased behind the meter may be another story.


Bisnow/Ally Araco

VoltaGrid’s Jason Long, Skybox Datacenters’ Rob Morris, AB Energy’s Dick Kramp, Cholla’s Clayton Greer, CleanArc’s Bill Thomas and Enverus’ Carson Kearl speak at Bisnow’s National DICE Power Capacity, Energy and Sustainability event March 19 in Dallas.

Such facilities account for a growing share of the data center construction and planning pipeline. As wait times for grid connections near a decade in many markets, developers are increasingly pursuing projects where grid power is eschewed in favor of self-generation or direct purchase from energy developers.

Typically fueled by natural gas, these can either be a permanent power solution for a data center campus or a bridge until grid power can be acquired. Opinions across the industry vary on how widespread such campuses will be, but one study anticipates that nearly a third of all new data centers will utilize on-site generation by 2030.

There is also a growing likelihood that even future data centers that are connected to the grid will spend more time running on their own generators and other backup power systems.

Due to concerns about grid stability, Texas lawmakers are working to enact legislation this year that would require future data centers to disconnect from the grid at the behest of utilities during periods of peak demand. Texas is home to some of the world’s fastest-growing data center markets, and panelists said similar legislation is likely to emerge in other major data center hotbeds.

This growing cohort of self-powered data centers may struggle with the load volatility caused by AI computing, Kramp said. The generators and power management systems traditionally used in data centers just can’t handle sporadic and unpredictable AI load. Data centers will have to be more like small utilities, utilizing multiple sources of power like gas turbines, reciprocating engines, battery storage and renewables that feed into a “microgrid” governed by sophisticated power management systems.

“The solution is microgrids. When you look at a larger site, you have the combination of turbines to take care of the baseload, and reciprocating engines together with batteries will take care of the volatility of the load,” said Kramp, although he added that there are measures that the data center’s end user can take to reduce energy demand fluctuations.

“It would be great if you could take care of the volatility of the load already at the data center level so that the power plant itself doesn't have to be that complicated and complex to follow the load swings.”
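The dispatch idea Kramp describes can be sketched in a few lines of code: slow-ramping turbines carry a steady baseload, while fast reciprocating engines and a battery absorb the rapid swings. All capacities and the split logic below are illustrative assumptions, not specifications of any real site or vendor equipment.

```python
# Illustrative sketch of microgrid load-following: turbines hold the
# baseload, engines and a battery chase the AI-driven spikes.
# All megawatt figures are hypothetical.

TURBINE_BASELOAD_MW = 80.0   # steady output; turbines ramp too slowly to chase spikes
ENGINE_MAX_MW = 30.0         # fast-ramping reciprocating engines
BATTERY_MAX_MW = 20.0        # battery discharges to cover whatever the engines can't

def dispatch(load_mw):
    """Split an instantaneous load across the three sources."""
    turbine = min(load_mw, TURBINE_BASELOAD_MW)
    residual = load_mw - turbine
    engine = min(residual, ENGINE_MAX_MW)
    battery = min(residual - engine, BATTERY_MAX_MW)
    unserved = residual - engine - battery  # nonzero means the plant can't follow the load
    return turbine, engine, battery, unserved

# A spiky AI load trace: steady baseload punctuated by inference bursts.
for load in [80, 82, 115, 128, 85, 80, 122, 90]:
    t, e, b, u = dispatch(load)
    print(f"load={load:5.1f} MW -> turbine={t:.1f}, engine={e:.1f}, "
          f"battery={b:.1f}, unserved={u:.1f}")
```

Even in this toy version, the division of labor is visible: the turbine line never moves, while the engine and battery lines jump with every spike.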

Kramp and Greer said the most effective way to contend with AI’s power volatility problem may be for end users to use software or other methods to “smooth” the data center’s power demand profile artificially. Panelists pointed to Meta, which published code that made its GPUs perform meaningless calculations during lulls in AI processing.

“[It] effectively just took the volatility out of their data center load by getting all of the AI GPUs to compute fake numbers when they were switching the model weights,” Enverus Senior Energy Transition Analyst Carson Kearl said.

The development of such load-smoothing measures is “a big deal,” according to Greer. While Meta’s approach may use marginally more power, such a software-based approach eliminates demand troughs in the course of operations, reduces risk and means volatility doesn’t have to be addressed at great expense through the design of a data center’s power systems.
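The load-smoothing technique the panelists describe amounts to a simple scheduling rule: when real work dips below a floor, add filler computation so total draw stays flat. The sketch below illustrates that idea only; the floor value and function names are assumptions, not Meta's actual code.

```python
# Illustrative sketch of software load-smoothing: pad low-utilization
# intervals with filler work so power draw never falls below a floor.
# POWER_FLOOR and the scheduling model are hypothetical.

POWER_FLOOR = 0.9  # keep GPU utilization at >= 90% of peak

def smooth(utilization):
    """Return (real, filler) work fractions for one scheduling interval."""
    filler = max(0.0, POWER_FLOOR - utilization)
    return utilization, filler

# Spiky inference utilization, e.g. dips while model weights are swapped.
trace = [0.95, 0.40, 0.10, 0.88, 0.30]
smoothed = [sum(smooth(u)) for u in trace]
print(smoothed)  # every interval now draws at least the floor
```

The trade-off Greer notes shows up directly: the filler work is wasted energy, but the demand troughs that would otherwise require expensive fast-responding generation are gone.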

This isn't a new concept, Skybox Datacenters CEO Rob Morris said. He said the oil and gas industry has for years run high-performance machine learning workloads on GPUs at Texas data centers that have similarly unpredictable energy demand profiles. The petroleum firms and the data center hosts would mine bitcoin during periods of low computing demand to level out their energy consumption curve, he said.

“I'm not saying that's what should be implemented here, but like we saw with Meta, solutions will be developed at a software level to help curtail the impacts at the hardware level,” Morris said.

