Microsoft CEO Satya Nadella says the AI boom has hit a new kind of ceiling... and it’s not GPUs.
Speaking on the BG2 podcast alongside OpenAI CEO Sam Altman, Nadella said Microsoft is no longer "chip supply constrained". The real problem, he explained, is finding enough powered, fully built-out data centres ("warm shells" close to grid capacity) to actually switch all those accelerators on.
In his words, you can “have a bunch of chips sitting in inventory” that simply can’t be plugged in.
It’s a notable shift in tone from the last couple of years, when the industry obsessed over Nvidia GPU shortages and supply-chain snags. Now, Nadella says, the real brakes are local grid limits, planning and permitting delays, and power-delivery bottlenecks that can stall or even kill AI projects long after the hardware has been ordered.
That power crunch has knock-on effects. AI data centres are already drawing as much electricity as small cities, pushing cloud giants to lock in long-term energy deals, look at on-site generation and even explore small modular nuclear reactors to keep future clusters running.
The message for investors and regulators is blunt: the next phase of the AI race won’t just be about who can buy the most GPUs, but who can secure the most reliable, scalable power to feed them.