Google Sends Chips to Space as AI’s Power Crunch Deepens
Tariq Al-Saidi
Senior Analyst
Published Jan 16, 2026
AI’s problem is not electricity. It’s time.
Silicon Valley’s Energy Panic Meets Market Apathy
Microsoft CEO Satya Nadella recently admitted that “many GPUs are sitting idle because there isn’t enough power.”
Google’s answer? Launch its chips into orbit. The company’s new Project Suncatcher aims to power Tensor Processing Units (TPUs) directly from the sun in space. It sounds wild, but it’s real. Yet despite these grand moves, energy stocks have barely budged. Since early November, China’s A-shares and the Nasdaq energy sector are both flat. The biggest U.S. energy stock rose just 0.77 percent.
If tech giants are screaming for power, why does the market not care? Maybe because investors sense that this “AI power crisis” isn’t what it seems.
AI’s Power Problem Is a Timing Problem
OpenAI’s Sam Altman puts it simply: “Yes and no.” Yes, there is a short-term shortage. But no, because AI itself will soon outgrow its power hunger. He believes that within six years, AI’s demand will cool as efficiency soars and growth stabilizes. The shortage is real, but temporary.
Google’s Suncatcher plan captures sunlight from space and runs computation above Earth. Eighty-one satellites, orbiting 650 kilometers up, will act as a solar-powered compute cluster. Each satellite communicates via optical links with projected bandwidth of up to 10 terabits per second. Data is processed in orbit, and only final results are sent down. The power never needs to return to Earth.
That’s clever—but heat is still the nightmare. In space there’s no air for convection. Google says it’s using advanced thermal interface materials to radiate heat away, but details remain thin.
Rivals are racing too. Startup Starcloud has already launched satellites equipped with NVIDIA H100 chips, targeting a five-gigawatt orbital data center. SpaceX says it will build one. In China, the “Three-Body Constellation” project launched 12 computing satellites back in May. Everyone’s chasing the same thing: more power, even if it means leaving the planet.
GPUs Are Power-Hungry by Design
NVIDIA is both the hero and the villain here. From its Ampere to its Blackwell architecture, GPU power draw has multiplied several times over in four years. A Hopper-based rack eats 10 kilowatts; a Blackwell rack pushes 120 kilowatts, a twelvefold jump. Multiply that by thousands.
Each GPU links to its peers through high-power NVLink connections and NVSwitch hubs. A 10,000-GPU cluster can consume 700 kilowatts to over a megawatt just for interconnects. Cooling adds another 180 kilowatts.
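Those figures make for a quick back-of-envelope calculation. In the sketch below, the per-rack, interconnect, and cooling numbers come from the paragraphs above, while the GPUs-per-rack counts (8 for a Hopper-era rack, 72 for a Blackwell-era rack) and the 1-megawatt Blackwell interconnect budget are illustrative assumptions, not published configurations:

```python
# Back-of-envelope cluster power estimate using the figures quoted above.
# GPUs-per-rack and the Blackwell interconnect budget are assumptions.

def cluster_power_kw(num_gpus: int,
                     gpus_per_rack: int,
                     rack_kw: float,
                     interconnect_kw: float,
                     cooling_kw: float) -> float:
    """Total draw in kilowatts: compute racks + interconnect + cooling."""
    racks = -(-num_gpus // gpus_per_rack)  # ceiling division
    return racks * rack_kw + interconnect_kw + cooling_kw

# Hopper-era rack (~10 kW) vs. Blackwell-era rack (~120 kW), 10,000 GPUs.
hopper = cluster_power_kw(10_000, gpus_per_rack=8, rack_kw=10,
                          interconnect_kw=700, cooling_kw=180)
blackwell = cluster_power_kw(10_000, gpus_per_rack=72, rack_kw=120,
                             interconnect_kw=1_000, cooling_kw=180)

print(f"Hopper-era cluster:    ~{hopper / 1_000:.1f} MW")    # ~13.4 MW
print(f"Blackwell-era cluster: ~{blackwell / 1_000:.1f} MW")  # ~17.9 MW
```

Even with generous rounding, a single 10,000-GPU cluster lands in the low tens of megawatts, which is why the industry now talks in gigawatts.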
Big Tech’s new metric is not FLOPS—it’s gigawatts. OpenAI and Meta each plan to add more than 10 gigawatts of compute capacity in coming years. One gigawatt powers a million U.S. homes. The International Energy Agency says AI energy use could double by 2030, four times faster than the grid itself grows.
Goldman Sachs projects global data center demand to reach 92 gigawatts by 2027, up 50 percent. In the U.S., data centers’ share of total electricity will rise from 4 percent in 2023 to 10 percent by 2030.
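It is worth unpacking what that projection implies. The sketch below back-derives the baseline and the annual growth rate from the quoted figures; treating “up 50 percent” as measured over a 2024-2027 window is my assumption, not Goldman’s:

```python
# Implied baseline and growth rate behind "92 GW by 2027, up 50 percent".
# The 3-year window (2024-2027) is an assumption about the measurement period.

target_gw, growth = 92.0, 0.50
baseline_gw = target_gw / (1 + growth)   # back-derived starting point
years = 3
cagr = (1 + growth) ** (1 / years) - 1   # equivalent compound annual growth

print(f"Implied baseline: ~{baseline_gw:.0f} GW")       # ~61 GW
print(f"Implied annual growth: ~{cagr:.1%} per year")   # ~14.5% per year
```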
And yet, energy companies lag. NextEra Energy is up only 11.6 percent over the past year. The utilities ETF XLU gained 14.8 percent, while the S&P 500 rose nearly 20 percent. If AI truly faced a power crisis, utilities should be soaring. They’re not.
The reason, again, is time. Nadella said it himself: grid connection approvals take five years, and new transmission lines 10 to 17 years. GPUs ship every quarter. Data centers go live within two years. The mismatch is the real shortage.
Nuclear’s Second Act: Small Reactors for Big Tech
Microsoft and Google are turning to nuclear. Small Modular Reactors (SMRs) are the hot new thing—compact, factory-built, low-carbon power sources that can be shipped like Lego blocks. Each SMR generates 50 to 300 megawatts, small but flexible and cheap.
Google signed a deal with Kairos Power for 500 megawatts of SMR capacity—the first tech firm to invest directly. Microsoft hired a nuclear director from Ultra Safe Nuclear Corporation to pursue SMR and Micro-Modular Reactor projects.
These are not experiments in power. They are experiments in timing—ways to shrink the decade-long delay between energy need and supply.
As Nadella said, Microsoft doesn’t lack electricity. It lacks time.
Smarter Chips, Cooler Data Centers
AI’s next breakthrough may come from efficiency, not energy. Altman notes that the cost per unit of intelligence drops 40-fold each year. The leap from GPT-4 to GPT-4o cut token costs by 150 times in one year. That means less electricity per task, not more.
Stanford’s AI Index confirms it. Reaching GPT-3.5-level accuracy now costs roughly $0.07 per million tokens, down from $20 in 2022, a 280-fold drop. Hardware is improving too. Meta’s Athena X1 chip hits 32 TOPS per watt, doubling efficiency. NVIDIA’s H200 is 1.4 times more efficient than the H100.
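The quoted multiples can be put on a common per-year footing. In the sketch below, the time spans (roughly two years for the 280-fold Stanford figure, one year for the 150-fold GPT-4 to GPT-4o figure) are my assumptions:

```python
# Converting the quoted cost-decline multiples to an annualized rate.
# The time spans assigned to each figure are assumptions, not sourced.

def annualized_drop(total_drop: float, years: float) -> float:
    """Equivalent per-year cost reduction factor for a total drop."""
    return total_drop ** (1 / years)

print(f"280x over ~2 years ~= {annualized_drop(280, 2):.0f}x per year")  # ~17x
print(f"150x over 1 year    = {annualized_drop(150, 1):.0f}x per year")  # 150x
```

Both figures land within an order of magnitude of Altman’s quoted 40-fold annual drop, which is consistent for numbers measured over different model families and periods.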
Data centers have cut their Power Usage Effectiveness (PUE) from 2.5 a decade ago to 1.5 today. Google’s latest facilities hit 1.1. Liquid cooling and AI-based energy management systems keep squeezing waste.
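PUE is total facility power divided by IT power, so everything above 1.0 is pure overhead. A minimal sketch of what the improvement means, assuming an illustrative 10-megawatt IT load:

```python
# Overhead implied by a given PUE: PUE = total facility power / IT power,
# so non-IT load = IT load * (PUE - 1). The 10 MW IT load is illustrative.

def overhead_mw(it_load_mw: float, pue: float) -> float:
    """Non-IT load (cooling, power delivery, lighting) implied by a PUE."""
    return it_load_mw * (pue - 1)

it_mw = 10.0
for pue in (2.5, 1.5, 1.1):
    print(f"PUE {pue}: {overhead_mw(it_mw, pue):.1f} MW of overhead "
          f"per {it_mw:.0f} MW of IT load")
```

Dropping from a PUE of 2.5 to 1.1 cuts the overhead on that load from 15 megawatts to 1.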
So yes, AI is consuming more power. But it is also learning to consume it better.
When the dust settles, energy markets will not just serve AI—they will be rebuilt around it. Whether AI’s demand peaks or cools, the infrastructure built for it will power the next industrial wave.
Disclaimer: This document is intended for informational and entertainment purposes only. The views expressed in this document are not, and should not be taken as, investment advice or recommendations. Recipients should do their own due diligence, taking into account their specific financial circumstances, investment objectives and risk tolerance, which are not considered here, before investing. This document is not an offer, or the solicitation of an offer, to buy or sell any of the assets mentioned.