Technology Risk | Insights, Resources & Best Practices

As Artificial Intelligence Gains Steam, the Power Grid Heats Up

Written by David Weldon | May 2, 2025

The costs of artificial intelligence are rising. It’s not only the investments in business transformation and Big Tech companies’ capital spending, now climbing into the trillions of dollars. AI is also power-hungry.

For electricity, that is. Demand from all those hyperscale computing clusters and data centers is pushing the limits of power grids. Aside from land use and environmental impacts, energy suppliers and consumers alike face critical operational risk, reliability and resilience issues.

“Access to power is the main determinant of this industry’s growth,” L. Lynne Kiesling, an American Enterprise Institute nonresident senior fellow, said at a recent AEI event on energy demands of the data-driven future. “Utilities and regulators are being forced to plan for substantial new power infrastructure, which can take six to 10 years to build.”

Data centers account for 4.4% of U.S. electrical energy consumption, equivalent to that of the State of New York, said Kiesling, who also has affiliations with Northwestern University and the University of Colorado Denver. That share could double or triple over the next few years, a pace she said is not currently sustainable.

McKinsey & Co. projects that data centers will require $6.7 trillion to meet worldwide demand for compute power by 2030 – $5.2 trillion in AI capital expenditures and the rest for traditional IT applications. The AI estimate is based on a mid-range scenario, between $7.9 trillion on the high end and $3.7 trillion on the low end.

Measuring the Burden

The consumption spike begins with large language models (LLMs). By one rule of thumb, a generative AI or chatbot query uses 10 times the electricity of a common Google search.

An Association of Data Scientists “deep dive” describes the “extensive computational resources” for training the models, which have grown exponentially.

“Larger models require more computational power both for training and inference,” it explains. “For example, training GPT-3, which has 175 billion parameters, consumed an estimated 1,287 megawatt-hours of electricity, which is roughly equivalent to the energy consumption of an average American household over 120 years. In contrast, smaller models like GPT-2, with 1.5 billion parameters, consumed significantly less energy during training.”
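The household comparison can be sanity-checked with back-of-envelope arithmetic. Assuming an average U.S. household uses roughly 10,700 kilowatt-hours per year (an assumed figure in the range commonly cited by the U.S. Energy Information Administration, not one stated in the article):

```python
# Back-of-envelope check of the GPT-3 training-energy comparison above.
GPT3_TRAINING_MWH = 1_287          # estimated energy to train GPT-3 (from the article)
HOUSEHOLD_KWH_PER_YEAR = 10_700    # assumed average annual U.S. household consumption

household_mwh_per_year = HOUSEHOLD_KWH_PER_YEAR / 1_000   # convert kWh to MWh: 10.7 MWh/year
household_years = GPT3_TRAINING_MWH / household_mwh_per_year

print(f"{household_years:.0f} household-years")  # ~120, matching the article's comparison
```

Dividing 1,287 MWh by about 10.7 MWh per household-year does indeed land on roughly 120 years of average household consumption.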

The power draw for training state-of-the-art AI models, measured in watts, is increasing. Source: Stanford University Human-Centered Artificial Intelligence.

Jensen Huang, CEO of GPU (graphics processing unit) chip producer Nvidia, said at his company’s “Super Bowl of AI” conference in March that 100 times more computing power will be needed for advanced AI than was considered necessary a year earlier. Nvidia subsequently drew praise from President Donald Trump for committing $500 billion to build AI infrastructure in the U.S., matching the potential total of the Stargate venture of Oracle, OpenAI and SoftBank.

Finding the Energy

A plan that OpenAI chief executive officer Sam Altman reportedly pitched last year in the United Arab Emirates called for raising $7 trillion to build 36 semiconductor plants and additional data centers. He expressed hope for nuclear fusion as an alternative, non-fossil-fuel energy source; OpenAI’s participation in Stargate implied a more U.S.-centric data center strategy.

In a January blog, Microsoft vice chair and president Brad Smith said his company in fiscal 2025 was “on track to invest approximately $80 billion to build out AI-enabled data centers to train AI models and deploy AI and cloud-based applications around the world. More than half of this total investment will be in the United States, reflecting our commitment to this country and our confidence in the American economy.”

Like other Big Tech companies, Microsoft is turning to nuclear power. Last September it signed a 20-year agreement with Constellation Energy that would restart the Three Mile Island Unit 1 in Pennsylvania. (That facility was shuttered in 2019 for economic reasons. The adjacent Unit 2 reactor partially melted down in 1979 and is being decommissioned.)

Although Microsoft retreated from some new data center leasing deals because of a potential “oversupply position,” according to a TD Cowen analyst, a Microsoft spokesperson said the $80 billion budget was not rolled back. Analysts at another firm, Wells Fargo, said Amazon has similarly paused some plans for leasing large-scale facilities, mainly outside the U.S.

“The rise of artificial intelligence workloads is transforming the data center industry,” according to CBRE’s most recent North America Data Center Trends report. Supply in primary data center markets, such as those in Northern Virginia, Texas and California, increased by 34% last year, to 6,922.6 megawatts, after growing 26% in 2023. A record 6,350 megawatts were under construction in the primary markets at year-end 2024, more than double the year-earlier 3,077.8.
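The CBRE figures are internally consistent, as a quick check shows (the implied year-end 2023 supply is derived here, not stated in the report excerpt):

```python
# Quick consistency check on the CBRE primary-market figures cited above.
supply_2024_mw = 6_922.6      # primary-market supply at year-end 2024
growth_2024 = 0.34            # 34% growth during 2024

# Implied supply a year earlier, backed out of the growth rate
supply_2023_mw = supply_2024_mw / (1 + growth_2024)
print(f"Implied year-end 2023 supply: {supply_2023_mw:,.0f} MW")   # ~5,166 MW

under_construction_2024 = 6_350.0
under_construction_2023 = 3_077.8
ratio = under_construction_2024 / under_construction_2023
print(f"Construction pipeline growth: {ratio:.2f}x")   # ~2.06x, i.e. "more than double"
```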

Primary data center markets inventory and under construction. Source: CBRE.

“Despite record construction activity,” CBRE concluded, “the data center market will struggle to keep pace with demand, leading to higher utilization rates in existing facilities and tighter vacancy rates” – while construction and equipment costs are rising.

Nvidia’s Huang envisions “AI factories” becoming so massive – multiple interconnected million-GPU-cluster data centers – that they will be visible from space.

Managing for ROI

The costs, for both technological and human resources, will have to be justified by the return on investment, says Ian Holmes, global lead for enterprise fraud solutions at AI and analytics leader SAS. In this as in other use cases, AI deployment drives demand for data. Digitization means more transaction data of different modalities is being captured across various customer access points – more than was thought possible before.

But it is only when AI’s relevance to the organization is understood that a full AI strategy should be developed, Holmes suggests, and this runs contrary to many executives’ initial expectations.


Potentially value-adding AI initiatives thus can be “unfeasible to start,” he explains. “Many organizations are holding out until a focused problem is defined, and the business has appropriately adapted to the opportunity. Pilot projects and growth within the company often ensure the best long-term returns by allowing AI momentum to develop organically.”

Nevertheless, as demands on data centers increase “by leaps and bounds,” cloud computing enables elasticity and ease of access. “Pay-as-you-process,” as Holmes puts it, allows organizations “to more easily assess AI workloads and flexibly adjust.”

A SAS survey of 1,600 business leaders last year found that 86% were planning to invest in generative AI in the year ahead. “Acceleration in AI and GenAI will have dramatic impacts on cloud processing demand, eventually outstripping supply,” Holmes says. “This mismatch will shift the market’s point of equilibrium and increase costs.”

Challenging the Grid

At the University of Florida, “every discipline teaches and uses AI,” notes Mark A. Jamison, director and Gunter Professor, Public Utility Research Center, and an AEI nonresident senior fellow. The university’s Nvidia AI Technology Center (NVAITC-UF) is the first such Nvidia research and education collaboration in North America.

Jamison points out that the U.S. hosts 37% of the world’s data centers and is home to cloud computing leaders Amazon, Google and Microsoft, but maintaining global leadership will depend on regulatory reforms.


“Data centers are energy-intensive, with their power demand potentially doubling U.S. electricity consumption within the next decade,” Jamison wrote in a MarketWatch article. “This rapid growth comes with significant challenges for the U.S. electricity grid. Without decisive action, this surge in demand could lead to significant challenges for the grid, including congestion, higher costs and reduced reliability, threatening America’s competitive edge not just in AI, but in all industries.”

Jamison, who has a utility regulation background and served on President-elect Trump’s Federal Communications Commission transition team, would like to see the administration and state lawmakers ease limitations on utilities’ investments and pricing. He calls for streamlined permitting, clarified Federal Energy Regulatory Commission (FERC) rules, and relaxing ownership restrictions on generation.

Jamison mentioned separately that the permitting process is a “major roadblock for expanding power availability,” and “timelines can be staggering.” In one case where legal appeals are playing out, FERC blocked an Amazon Web Services colocation proposal with a Pennsylvania nuclear plant because of concerns over grid reliability and costs borne by the public.

“The time required to get new power connections for data center sites” also has international implications, said a 2024 McKinsey report. “Locations outside of the United States, such as Amsterdam, Dublin and Singapore, have placed moratoriums on many new data center builds in recent years primarily because they lack the power infrastructure to support them.”

At the Stargate announcement in January, Trump pledged to remove bureaucratic obstacles with “emergency declarations” if necessary “to get this stuff built. So they have to produce a lot of electricity. And we'll make it possible for them to get this production done easily, at their own plants if they want.”

Hunger for Data

The credit scores and decision support provided by FICO rely on extensive machine learning trained using diverse data across many financial institutions. The vast quantities of data are processed efficiently – and data center resources are less taxed – when the data is well targeted for the task. That means identifying and leveraging the “right data,” as opposed to just throwing AI at a huge data set and hoping for good outcomes, asserts FICO chief analytics officer Scott Zoldi.

Holmes at SAS agrees that it is a best practice to reduce the amount of data that AI needs to sift through, along with increasing the quality of that data.

Organizations come to realize that “AI algorithms tuned with better data can outperform the more complex ones . . . whilst attempting to ignore the bad data issues that plague most businesses,” Holmes comments. “That is, a simple AI model tuned on high-quality data can achieve results nearly as accurate as a complex model tuned on poor-quality data, and with far less investment in terms of cost, effort and processing power.”

“Keep in mind, too, that the data volume required to support GenAI models far exceeds that of traditional AI models,” he adds. “As such, AI and GenAI advances will bring new opportunities, especially for companies holding large volumes of raw data. Converting libraries of data into insights will bring high benefits for pre-tuned AI that can make sense in a data-driven world.”

Big Data Is Big – Again

“Big Data seems to have come back alive when we look at large language models, many of which want to consume all data that is available in hopes of being aware of the world’s entire knowledge,” Zoldi says.

FICO builds its own Focused Language Models (FLMs) that are “suited for the problem statement, controlled, and auditable,” he adds. “This approach reduces the training-data scope from ‘use all the world’s data’ to ‘use the data appropriate to the problem.’

“As LLMs get even larger and more unwieldy, we can expect AI to cause organizations to get surgical in their use of data, which may change the sprawling data center into the ‘research, insight, and right size’ center.”

Political Element

The same Trump administration that celebrates investment in AI and data-center infrastructure has reversed previous course on climate change and environmental, social and governance issues. That doesn’t necessarily make energy efficiency and sustainability any less of an industry priority.

“As industry clamors for AI to drive greater speed, automation and productivity, AI demands GPUs to satisfy the higher levels of processing,” Holmes says. “Banks and other businesses face the paradox of consuming more energy, while facing pressure to enhance their own environmental sustainability and promote ESG measures. The sustainability-related disclosures mandated by [International Financial Reporting Standards] will only increase scrutiny of energy consumption as a major contributor to global warming.”

Quantum computing will place additional demands on resources and, Holmes warns, the lack of a sustainable energy supply can hinder progress and innovation.

On April 25, quantum technology company IonQ announced a development partnership with Chattanooga, Tennessee-based utility EPB and its EPB Quantum Center. Installed in what they called “the first quantum computing and networking hub in the U.S.” will be IonQ’s Forte Enterprise system, “a powerful, data center-ready quantum computer designed with a low energy profile, rack-mounted form factor and minimal environmental isolation requirements.”

Indicating gains in machine-learning hardware efficiency, the 2.5 trillion FLOP (floating point operations) per watt of the Nvidia B100, released in March 2024, is 33.8 times more energy efficient than the P100 released in April 2016 (74 billion per watt). Source: Stanford University Human-Centered Artificial Intelligence.
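The efficiency multiple in the caption above follows directly from the two per-watt figures:

```python
# Reproducing the hardware-efficiency comparison from the Stanford HAI figures.
B100_FLOP_PER_WATT = 2.5e12   # Nvidia B100 (released March 2024): 2.5 trillion FLOP/watt
P100_FLOP_PER_WATT = 74e9     # Nvidia P100 (released April 2016): 74 billion FLOP/watt

ratio = B100_FLOP_PER_WATT / P100_FLOP_PER_WATT
print(f"B100 is {ratio:.1f}x more energy efficient than the P100")  # 33.8x
```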

Public and Private Participation

Both government and industry have pivotal roles in ensuring sustainable data center development, says Carson Kearl, data center lead at Enverus, an energy-dedicated software-as-a-service company and parent of Enverus Intelligence Research: “The government should streamline permitting processes for data center development, support grid infrastructure modernization, and create policies that encourage sustainable practices.”

Significant issues with AI growth and data center availability going forward include massive capital requirements, power demand and infrastructure constraints, and physical infrastructure limitations, Kearl observes.

Through 2030, the projected compound annual growth rate of U.S. data center electricity demand is 23%, according to McKinsey & Co.’s How Data Centers and the Energy Sector Can Sate AI’s Hunger for Power.

In a base-case scenario, “the data center industry will need to spend $317 billion annually just on maintenance capital by 2035,” Kearl continues. “By 2030, major tech companies will need to reinvest 31% of their operating cash flow into data center capital expenditure.

“Additionally, data center power consumption is projected to increase from 180 TWh [terawatt-hours; 1 terawatt equals 1 million megawatts] in 2023 to over 400 by 2030, necessitating 99.995% power reliability, which allows only 28 minutes of downtime annually.”
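The downtime budget implied by an availability target follows from simple arithmetic: the unavailable fraction of the year, times the minutes in a year. A minimal sketch (99.995% of a 365-day year leaves roughly 26 minutes, the same ballpark as the figure Kearl cites):

```python
# Converting an availability target into an annual downtime budget.
MINUTES_PER_YEAR = 365 * 24 * 60      # 525,600 minutes in a non-leap year

def downtime_minutes(availability: float) -> float:
    """Allowed downtime per year, in minutes, for a given availability target."""
    return (1 - availability) * MINUTES_PER_YEAR

print(f"{downtime_minutes(0.99995):.1f} min/yr")   # ~26.3 minutes at 99.995%
print(f"{downtime_minutes(0.9999):.1f} min/yr")    # ~52.6 minutes at "four nines"
```

Whatever the exact rounding, the point stands: a 99.995% target leaves well under half an hour of downtime per year.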

A further challenge is finding suitable sites with adequate available withdrawal capacity (AWC) and transmission infrastructure.

“The industry should invest in efficiency improvements and innovative cooling technologies, form strategic partnerships with power generators and utilities, and continue developing more energy-efficient computing solutions,” Kearl contends. “The success of these efforts will require coordinated actions between technology companies, power providers and regulatory bodies to ensure the sustainable growth of AI infrastructure while managing resource constraints.”


Jeffrey Kutler of GARP contributed reporting for this article.