AI and (a lot of) Energy

Demand for AI chips and programs has risen exponentially over the past few quarters. The rise in demand portends the start of a new age of technology, with AI becoming a focal point of our innovation economy. The problem? The clamor for AI coincides with an already existing demand boom for cloud data centers. With the proliferation of the cloud over the last decade, a data-center build-out boom was necessary to support cloud infrastructure, and because data centers require a great deal of energy to function, they were built in close proximity to established energy supplies. Now, this double demand growth for cloud data centers and AI is starting to create infrastructure and energy bottlenecks; but also, intriguingly, opportunities for renewable energy. With AI requiring so much energy, companies are being forced to develop more creative solutions to the energy problem, and those solutions often lie in renewables.


Prior to the AI boom, demand for data centers was already roaring. Data center size is measured by power consumption: the more gigawatts (GW) a facility draws, the more computing and storage it houses. In 2022, 4.9 GW of data center capacity was installed globally. As the AI demand boom started in 2023, an estimated 7.4 GW of data centers were expected to come online¹. This growth is expected to continue, with most estimates calling for data center capacity to double by 2030.

In the US, data center power usage is expected to double from around 17 GW in 2022 to 35 GW in 2030². If that usage were a state, it would rank as the 9th largest state by capacity³. Yet that estimate was made in 2023 and does not include the continued strong demand we are seeing in 2024. This growth is creating a bottleneck when it comes to grid connections.

But why would data centers concern themselves with grid connections? Why is energy such a concern for both data centers and AI? For perspective, data centers without an AI focus already use 10 to 50 times as much energy per unit of floor space as a typical commercial office building⁴. Data centers are already major users of energy, and the new AI chips dwarf even that energy usage. For reference, the Nvidia GH200, developed specifically to run AI workloads, consumes two to four times as much energy as a conventional cluster of similar size². This intense need for power has made the availability of power supply the biggest driver of site selection for all new data centers, AI or cloud.

The outcome of this need for power is construction delays of two to four years for new-build data centers, with permitting for grid connections and supplies of electrical infrastructure parts running behind⁵. To address the grid limitations, data centers will likely start to build out microgrids. Some have already started by contracting with nearby solar and wind farms to access cheap renewable energy. Renewables are positioned to benefit. We will likely see these solar and wind farms paired with battery storage systems to put even less stress on the grid. Renewables are the cheapest form of energy, and with the falling cost of batteries, the combination of renewable energy and storage will become the norm.

The final iteration could also involve some type of long-term storage, such as hydrogen or geothermal energy, to provide continuous, uninterrupted power that does not pull from the grid. In Wyoming, Microsoft is currently testing hydrogen fuel cells to replace diesel for backup power at a data center⁶. If the rate of demand growth for data centers continues, a future of microgrids for data centers could arrive sooner than most expect; otherwise, grid stress will limit the growth of AI.
