Power consumption spikes from AI data centers are making grids more volatile, a senior executive from Hitachi Energy has warned.
“AI data centres are very, very different from these office data centres because they really spike up,” Andreas Schierenbeck told the Financial Times in an interview. “If you start your AI algorithm to learn and give them data to digest, they’re peaking in seconds and going up to 10 times what they have normally used.”
Schierenbeck went on to say: “No user from an industry point of view would be allowed to have this kind of behaviour — if you want to start a smelter, you have to call the utility ahead,” suggesting that governments should impose the same requirements on data center operators.
The artificial intelligence race has raised concerns about electricity supply security because of the sheer scale of AI data centers’ energy consumption. Last month, Rystad Energy said this consumption was straining grids, citing U.S. data showing that data centers’ electricity use has risen from 50 TWh a decade ago to 140 TWh today, or 3.5% of the country’s total consumption.
An additional problem is Big Tech’s eagerness to power its data centers with wind and solar, intermittent sources that generate only when the weather is favorable. Data centers, however, need a reliable, round-the-clock supply of electricity with almost no downtime, so operators are leaning heavily on generation from gas, coal, and nuclear.
Meanwhile, the problem that Hitachi Energy’s Schierenbeck flagged to the FT is about to get more severe. The International Energy Agency forecasts that electricity consumption by data centers will double to 945 TWh by 2030, an amount roughly equal to the entire electricity consumption of Japan, the FT noted.
By Irina Slav for Oilprice.com