In the race for dominance in artificial intelligence, American tech giants hold the financial resources and the computing chips, but they are now facing a significant new obstacle: power.
“The biggest issue we are now having is not a compute glut, but it’s the power and… the ability to get the builds done fast enough close to power,” Microsoft CEO Satya Nadella admitted during a recent podcast with OpenAI’s chief Sam Altman.
“So if you can’t do that, you may actually have a bunch of chips sitting in inventory that I can’t plug in,” Nadella added, highlighting the bottleneck that is hindering the rapid expansion of AI infrastructure.
This power dilemma mirrors the dotcom boom of the 1990s, when companies spent vast sums to build internet infrastructure. Today’s tech giants are similarly investing unprecedented amounts of money to construct the silicon backbone of the AI revolution.
Google, Microsoft, Amazon Web Services (AWS), and Meta (formerly Facebook) are drawing on their massive cash reserves, with plans to spend roughly $400 billion in 2025 and even more in 2026. For now, they are bolstered by the enthusiasm of investors keen to fuel the next big technological leap.
This financial might has helped ease one major obstacle: acquiring the millions of chips needed to fuel the computing power race. In response, the tech giants are ramping up their in-house processor production as they seek to challenge global leader Nvidia in the AI chip market.
However, these chips will need to be housed in the racks of colossal data centres, which are themselves massive consumers of electricity and water for cooling.
Energy Infrastructure Strain
Building these data centres is no easy task. On average, it takes about two years to build a data centre in the United States, but bringing new high-voltage power lines into service can take five to ten years.
The “hyperscalers” — as the largest tech companies are known in Silicon Valley — saw this energy wall coming. A year ago, Dominion Energy, Virginia’s main utility provider, had already received data-centre orders totalling 40 gigawatts of capacity — roughly the output of 40 nuclear reactors.
Since then, the capacity required in Virginia, which is the world’s largest cloud computing hub, has risen to 47 gigawatts, the company recently announced.
Data centres in the US, already blamed for driving up household electricity bills, are expected to account for 7 to 12 percent of the nation’s total electricity consumption by 2030, up from 4 percent today, according to various studies.
Some experts, however, caution that these projections could be inflated. Jonathan Koomey, a renowned expert from UC Berkeley, warned in September that “both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use.” He added that, like the late 1990s internet bubble, “many data centres that are talked about and proposed and in some cases even announced will never get built.”
The Energy Crisis
If the forecasted growth in data centres does materialise, it could create a 45-gigawatt energy shortfall by 2028 — the equivalent of the electricity consumption of 33 million American households, according to Morgan Stanley.
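Morgan Stanley’s household equivalence can be sanity-checked with back-of-the-envelope arithmetic. The figures below assume the standard rough benchmark that an average American household draws on the order of 1.2 to 1.4 kilowatts of continuous power (about 10,500 to 12,000 kWh per year); the variable names are illustrative, not from the report.

```python
# Sanity check: does a 45 GW shortfall plausibly equal the consumption
# of 33 million US households? (Assumes ~1.2-1.4 kW average draw per home.)
shortfall_gw = 45
households = 33_000_000

# Convert gigawatts to kilowatts, then divide by the household count.
implied_kw_per_household = shortfall_gw * 1_000_000 / households

print(f"{implied_kw_per_household:.2f} kW per household")  # ~1.36 kW
```

The implied figure of roughly 1.36 kW per household sits inside the typical range for US homes, so the comparison is internally consistent.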
To cope, several US utilities have already delayed the closure of coal plants, despite coal being the most polluting energy source. Natural gas, which powers around 40 percent of data centres worldwide, according to the International Energy Agency, is also experiencing a resurgence as it can be deployed more quickly.
In Georgia, where data centres are proliferating, one utility has sought permission to install 10 gigawatts of gas-powered generators to meet the growing demand.
Some companies, including Elon Musk’s startup xAI, are also turning to international sources for quick solutions, such as purchasing used turbines from overseas. Even the recycling of aircraft turbines, an older niche solution, is gaining attention as a means to build power generation capacity quickly.
“The real existential threat right now is not a degree of climate change. It’s the fact that we could lose the AI arms race if we don’t have enough power,” argued US Interior Secretary Doug Burgum in October.
Alternative Energy Solutions
Faced with these challenges, tech giants are quietly backtracking on their climate commitments. Google, for example, removed its pledge to achieve net-zero carbon emissions by 2030 from its website in June, even though the company had previously made it a key part of its environmental strategy.
Instead, tech companies are focusing on long-term energy projects. Amazon, for instance, is championing the revival of nuclear power through Small Modular Reactors (SMRs), a new and experimental technology that is expected to be easier to construct than traditional nuclear reactors. Google plans to restart a reactor in Iowa by 2029, while the Trump administration announced in late October an $80 billion investment to begin building ten conventional nuclear reactors by 2030.
At the same time, hyperscalers are investing heavily in solar power and battery storage, particularly in California and Texas; Texas alone plans to add approximately 100 gigawatts of capacity from these renewable technologies by 2030.
Finally, both Elon Musk’s Starlink program and Google have proposed launching chips into space, where they would be powered by solar energy. Google plans to begin testing this concept in 2027.
The Path Forward
The AI revolution is undoubtedly transforming the technological landscape, but as it advances, it faces significant challenges in energy infrastructure. The race to harness the power needed for AI could be as decisive as the race for AI itself. Whether these tech giants can overcome their power problem will likely shape the future of AI and the broader global economy.
