January 8, 2025

AI’s power gridlock: A collaborative approach to data center energy challenges

By Allan Schurr, Chief Commercial Officer  

The rapid rise of artificial intelligence (AI) applications and their massive energy demands have created an urgent power gridlock problem (pun intended!). As a result, data centers, particularly hyperscale facilities powering AI systems, now require unprecedented electricity consumption rates. At the same time, industry experts, utility providers, and policymakers are grappling with the strain AI workloads are placing on our electrical grid. Recent reports by the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE) have helped shed light on the challenges and propose strategies to meet the growing demand for power.

The two reports share several common threads, spanning both expected and novel recommendations. The expected recommendations include clean energy supplies and more efficient equipment and building systems; the novel one is that adopting flexible, reliable, grid-interactive energy technologies can ensure a sustainable, scalable way to power the data centers of the future. It’s important to note that while the energy demands of AI workloads have grown exponentially, many viable solutions are already within our reach.

The EPRI report: Defining the challenge

Hyperscale data centers, which power cloud giants like Amazon AWS, Google Cloud, and Microsoft Azure, are leading the charge in electricity consumption and efficiency innovations. According to EPRI’s research, these hyperscale centers, along with co-location facilities, account for roughly 60-70% of the total U.S. data center load. This rapid growth is driven largely by AI applications, which are placing unprecedented demands on power infrastructure. 

Take ChatGPT, for instance. In just two months, it garnered 100 million users worldwide, and a single query on the platform is estimated to consume 10 times more electricity than a standard Google search (2.9 Wh vs. 0.3 Wh). These numbers are just a small sample of the vast energy requirements AI applications are now imposing on the grid. And those power demands are only going to increase.
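The "10 times" figure follows directly from the two per-query estimates cited above; a quick sketch of the arithmetic:

```python
# Per-query energy estimates cited in the EPRI discussion (watt-hours).
CHATGPT_QUERY_WH = 2.9   # estimated energy for one ChatGPT query
GOOGLE_SEARCH_WH = 0.3   # estimated energy for one standard Google search

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"A ChatGPT query uses about {ratio:.1f}x the energy of a Google search")
```

The exact ratio works out to roughly 9.7, which the reports round to 10.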

[Figure: Four different modeling scenarios for data center load projections through 2030]

Data centers accounted for approximately 4% of U.S. electricity consumption in 2023, and the modeling scenarios project annual load growth rates ranging from 3.7% to 15% through 2030.

In their report, EPRI outlined three key recommendations to tackle this power challenge:

Operational efficiency: Data centers must focus on optimizing AI algorithms, computing hardware, and cooling technologies to improve energy efficiency.

Collaboration with utilities: As the grid becomes more strained, utilities and data centers must collaborate more effectively to find solutions.

Scalable, clean energy: Scalable and clean energy sources are essential to sustain the growth of AI and data centers.

The EPRI report emphasizes that addressing these energy needs requires not just technological innovation but a systemic collaboration between data centers and utilities. And the path forward involves improving efficiency, developing scalable energy sources, and working together to build a resilient, flexible power infrastructure.

The DOE report: Seeking solutions

While EPRI’s report does a fantastic job of defining the challenges and next steps, the DOE report offers more of a roadmap for solutions. One of the key points in the DOE report is the need to establish a framework for data centers to optimize energy consumption and contribute back to the grid. 

The DOE emphasizes the urgency of real-world solutions to electricity supply bottlenecks. Their report calls for innovation in grid enhancing technologies (GETs) with a focus on cost, performance, reliability, availability, and supply chain. In other words, power solutions must be efficient, scalable, and reliable to meet the growing needs of data centers.

These reports align closely with Enchanted Rock’s current initiatives in grid support, demonstrating how data centers can play an active role in stabilizing the grid during times of peak demand. 

Enchanted Rock has already taken major steps to address this problem with our natural gas microgrids that can power data centers while providing flexibility for utilities to firm up supply. Through our Bridge-to-Grid to Backup solution, data centers can have access to reliable power while supporting the grid in times of stress. And since speed to market is essential, our microgrids provide a proven rapid deployment solution, supporting not just data centers but broader economic development in the regions they serve.

The 500-hour capacity problem

A critical issue identified by both EPRI and the DOE is the need for data centers to secure around 500 hours of dispatchable capacity to ensure continuous operations. Diesel generators, commonly used for emergency backup power, are increasingly limited to emergency use only and cannot provide ongoing dispatchable grid services. Some have suggested renewable diesel as a low-emission alternative, but this option has significant limitations, including air permit restrictions and fuel supply chain constraints that make it unsuitable for sustained grid support.
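To put the 500-hour figure in context, a rough sketch, assuming those hours are spread across a single year (an assumption for illustration; the reports do not specify the period):

```python
# Rough context for the 500-hour dispatchable-capacity figure.
HOURS_PER_YEAR = 8760      # 365 days * 24 hours
DISPATCHABLE_HOURS = 500   # capacity the reports suggest data centers secure

share = DISPATCHABLE_HOURS / HOURS_PER_YEAR
print(f"500 dispatchable hours is about {share:.1%} of a year")
```

Even under that assumption, roughly 6% of the year is far beyond what emergency-only diesel permits typically allow, which is why both reports point toward alternative dispatchable resources.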

Data centers must explore alternatives that are scalable and sustainable. Natural gas remains the cleanest and most reliable option that’s cost effective now. The infrastructure for natural gas is already in place, and it can provide grid support during periods of stress or emergencies. Furthermore, renewable natural gas (RNG) offers a cleaner, carbon-neutral fuel option that can leverage the same infrastructure to reduce emissions.

The DOE report also highlights that the timeline for deploying new energy technologies like nuclear power at scale could take decades. This is particularly concerning given the ongoing decommissioning of coal plants and the slow development of new transmission infrastructure. Immediate, proven solutions like natural gas microgrids are critical to filling the gap – both for the long term and for the 500-hour dispatchable capacity need – while ensuring reliable power for data centers and the grid.

A collaborative path forward

Both the EPRI and DOE reports take a pragmatic approach to the data center power gridlock, and their recommendations align closely with the solutions Enchanted Rock has been providing to critical infrastructure over the last 15 years. Our collective challenge now is scaling these solutions and fostering collaboration between data centers, utilities, and technology providers.

Enchanted Rock has been focused on supporting customers and evolving our technology offerings for over a decade. We’re providing backup power and grid support across the Gulf Coast, for entire communities in California, and in large wholesale markets like ERCOT, MISO, and CAISO. And we have the data to prove the efficacy of our technology. Our extensive historical record can contribute to further research and innovation in grid support, resiliency, and emissions reduction. And by working together with stakeholders like the DOE and EPRI, we can advance best practices for ensuring reliable, scalable power for data centers as AI workloads continue to grow.

Now, as the need for scalable, reliable power solutions for hyperscale and co-location data centers becomes increasingly urgent, we see it as an opportunity to step forward and raise awareness about flexible power generation and resiliency. Through collaboration and continued innovation, we can ensure that the power gridlock becomes merely a waystation toward a more sustainable and reliable energy future for data centers.
