Growth Ritual #82
📋 In This Issue:
The AI Cold War Isn't About Chips, It's About Plugs
Why Your Next Million-Dollar Idea Is Invisible to Google Trends — 🔒
The $73 Billion Pivot: How Top AI Agencies Will Stop Building and Start Minting Money — 🔒
The AI Training Gym: How a "Fake" Internet Creates a Smarter Brain — 🔒
The Tangibility Edge: The Warren Buffett Strategy AI Can't Touch — 🔒
Know someone who’d love this? Invite them and you’ll both win—your friend discovers the newsletter, and you unlock 1 month of premium access.
The AI Cold War Isn't About Chips, It's About Plugs
For the last two years, we've been hypnotized by a magic show. OpenAI, Google, xAI—they've been pulling rabbits out of hats, dazzling us with models that can write poetry and code.
We've all been staring at the magician's hands.
While we were watching the show, the real action was happening backstage, where they're building the theater. And it turns out the theater is the size of a city and requires a private power plant to keep the lights on.
The race isn't about the magic trick anymore. It's about who can build the biggest stage.
Microsoft just broke ground on "Fairwater," a 1.2 million square foot AI data center in Wisconsin they’re packing with hundreds of thousands of next-gen NVIDIA GB200s. Not to be outdone, Elon Musk's xAI is building "Colossus 2" in Memphis, aiming to create the world's first gigawatt-scale data center.
A gigawatt. The output of a nuclear power plant. Just to train the next version of Grok.
I spent four hours this week plugged into the Stepchange Show with Ben Shwab Eidelson and Anay Shah, and their deep dive on the history of data centers confirmed my suspicion:
We've been so obsessed with the magic on the screen that we've completely forgotten that the "cloud" isn't a cloud at all. It's a physical empire of 11,800 buildings, and right now, that empire is having a violent collision with the laws of physics.
The brutal truth is this: the winners of the next decade won't be the ones with the best algorithm.
They'll be the ones with the best power contract.
I warned you back in issue #78 that the hum from the desert was getting louder; today, we're diving into the engine room to see what’s fueling it: a bare-knuckle brawl for every last electron on the grid.
The Beautiful Lie We All Believed
Let's be clear: AWS is one of the most brilliant businesses ever conceived. On track for over $111 billion in annual revenue, they created a utility that let a generation of founders (myself included) build global products with a credit card.
They abstracted away the hardware. The cooling. The networking. We stopped thinking about servers and started thinking purely in code.
But this abstraction was so perfect, it became a lie. We forgot that behind every API call is a whirring rack of servers sucking down megawatts of power. The hyperscalers never forgot.
While we were busy optimizing our customer acquisition costs, Google was obsessing over their PUE (Power Usage Effectiveness). They weren't just building a search engine; they were building a "warehouse-scale computer."
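PUE is a simple ratio: total facility power divided by the power that actually reaches the IT equipment. A perfect 1.0 would mean zero overhead; everything above that is cooling, power conversion, and lighting. Here's a minimal sketch — the example numbers are illustrative, not figures from any specific facility:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.

    1.0 would mean every watt goes to compute; everything above
    that is overhead (cooling, power conversion, lighting).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Illustrative numbers (not from any real facility): a legacy
# enterprise data center vs. a tuned hyperscale one.
legacy = pue(total_facility_kw=16_000, it_equipment_kw=10_000)
hyperscale = pue(total_facility_kw=11_000, it_equipment_kw=10_000)
print(f"legacy PUE: {legacy:.2f}, hyperscale PUE: {hyperscale:.2f}")
```

The gap between 1.6 and 1.1 sounds small, but at hundreds of megawatts it's tens of millions of dollars a year — which is exactly why Google treated it as a first-class engineering metric.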
Meta didn't just build a social network; they launched the Open Compute Project to commoditize their hardware supply chain, turning a cost center into a competitive weapon.
They knew the game was physical.
We were just renting space in their world.
The AI Heat Death Is Here
Then came the AI boom. And everything changed.
This isn't just "more cloud." The shift from traditional cloud computing to AI workloads is a step-function change in physical reality. A standard server rack in the AWS era might draw 5-10 kilowatts of power.
An AI training rack, packed with next-gen NVIDIA GPUs, can draw up to 90 kilowatts.
Let that sink in. We're talking about an order-of-magnitude increase in power density. The physics of air cooling are breaking down. The industry is rapidly moving to direct-to-chip liquid cooling, running water through pipes directly on top of the processors just to stop them from melting.
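You can see the air-cooling problem in a back-of-envelope calculation. The heat a rack sheds has to be carried away by mass flow (Q = ṁ · cp · ΔT), so the airflow required scales linearly with power. The assumptions below are mine, not from the article — a 20 °C intake-to-exhaust temperature rise and standard air properties:

```python
# Back-of-envelope check on why air cooling breaks down at AI rack
# densities. Assumed values (mine, not from the article):
CP_AIR = 1005.0        # specific heat of air, J/(kg*K)
RHO_AIR = 1.2          # air density, kg/m^3
DELTA_T = 20.0         # K, intake-to-exhaust temperature rise
M3S_TO_CFM = 2118.88   # cubic meters/second -> cubic feet/minute

def airflow_cfm(rack_kw: float) -> float:
    """Airflow needed to carry away a rack's heat: Q = m_dot * cp * dT."""
    mass_flow = rack_kw * 1000 / (CP_AIR * DELTA_T)   # kg/s
    return mass_flow / RHO_AIR * M3S_TO_CFM

for kw in (8, 90):
    print(f"{kw:>3} kW rack -> ~{airflow_cfm(kw):,.0f} CFM of air")
```

Under these assumptions an 8 kW rack needs on the order of 700 CFM, while a 90 kW rack needs nearly 8,000 CFM — a hurricane through a single cabinet. Water carries heat roughly 3,500 times better per unit volume, which is why the industry is plumbing pipes straight to the chips.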
This is no longer a software problem. This is a thermodynamics problem. And it leads to the single biggest bottleneck that will define the next 5 years...
The Real Bottleneck Is the Grid Connection
Forget GPU shortages. The real scarcity in AI is a grid connection.
The game has shifted from securing chips to securing a multi-year contract with a utility that can deliver hundreds of megawatts of stable power. And the desperation is palpable.
Look at Elon's playbook for Colossus 2. After facing pushback from the Tennessee utility, his team simply acquired a former power plant across the border in Mississippi and got approval to run massive gas turbines for 12 months without permits.
Read that again. They are literally creating their own private, unregulated power grid to get ahead. This isn't business as usual; this is an infrastructure arms race.
And while US tech giants are pulling off these clever, brute-force maneuvers state by state, it's crucial to zoom out and see the global power game. Because our scramble for megawatts is happening while China is executing one of the most audacious energy plays in modern history.
Recent data from Ember shows that in the first half of this year alone, China installed more than twice as much solar capacity as the rest of the world combined (256 GW vs. 124 GW).
Let’s be crystal clear: this isn’t about saving the planet. This is a geopolitical power grab.
Just as America leveraged its control of oil in the 20th century, China is cornering the market on cheap, abundant solar to fuel its 21st-century ambitions—including a limitless supply of energy for its own AI factories.
The battle for AI supremacy isn't happening on GitHub. It's happening in zoning board meetings and backroom deals with regional power utilities.
The most valuable piece of paper in tech right now isn't a patent; it's a signed Power Purchase Agreement.
So, What's the Play for You?
"Okay, Selim, this is terrifying. I'm not going to go build a nuclear power plant. What does this mean for me, the ambitious founder/hacker/marketer?"
This physical constraint is your biggest opportunity. While everyone else is distracted by the model-of-the-week, you can build a defensible business in the unsexy guts of it all.
1. Build the Picks & Shovels (The Physical + Digital Stack)
The biggest opportunities are in solving the new, painful problems of heat, power, and water. This isn't just a software game anymore; there's a fortune to be made in atoms, not just bits.
The Hardware Play (The Atoms):
Modular Liquid Cooling: The industry is desperate for solutions to cool 90kW racks. Design and manufacture containerized, "drop-in" liquid cooling units that can be attached to existing data centers. You're not just selling a product; you're selling them a way to upgrade to AI capacity in months, not years.
On-site Power Systems: Package battery storage (BESS) and next-gen generators into a productized service. Sell it as "Power-as-a-Service" to data centers stuck in a 3-year queue for a grid connection. You become their lifeline to getting online now.
Water Recycling Tech: xAI is spending $80 million on a wastewater facility. Develop and sell smaller, more efficient closed-loop water recycling systems for cooling. You're selling a technical solution to a political problem, and the ROI is measured in approved building permits.
The Software Play (The Bits):
SaaS for Power Management: Build the operating system for these hybrid-powered data centers. Create tools that intelligently switch between grid, battery, and on-site generation based on real-time energy prices and grid stability.
Cooling & Water Optimization: Use AI to model the thermodynamics of a data center, minimizing the energy and water needed for cooling. This is a pure opex play that sells itself.
Infrastructure Finance & Procurement: Create a platform that's like Zillow for data center construction, connecting builders with pre-vetted sites that have available power, water rights, and fiber.
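The core of the power-management idea above is a dispatcher: given the current price and available capacity of each source, fill the load from the cheapest electrons first. Here's a minimal sketch — every name, price, and capacity is hypothetical, and a real system would also model battery state of charge, ramp rates, and grid-export rules:

```python
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    price_per_kwh: float   # current marginal cost, $/kWh
    available_kw: float    # capacity this source can deliver right now

def dispatch(load_kw: float, sources: list[Source]) -> dict[str, float]:
    """Greedy dispatch: fill the load from the cheapest sources first."""
    plan: dict[str, float] = {}
    remaining = load_kw
    for s in sorted(sources, key=lambda s: s.price_per_kwh):
        if remaining <= 0:
            break
        draw = min(remaining, s.available_kw)
        if draw > 0:
            plan[s.name] = draw
            remaining -= draw
    if remaining > 0:
        raise RuntimeError(f"short {remaining:.0f} kW of capacity")
    return plan

# Hypothetical snapshot: grid price spiking, battery charged, gensets idle.
plan = dispatch(
    load_kw=50_000,
    sources=[
        Source("grid",    price_per_kwh=0.42, available_kw=60_000),
        Source("battery", price_per_kwh=0.11, available_kw=20_000),
        Source("genset",  price_per_kwh=0.25, available_kw=40_000),
    ],
)
print(plan)
```

In this snapshot the dispatcher drains the battery first, covers the rest with gensets, and avoids the spiking grid entirely. The defensible product is everything around this loop: price forecasting, battery degradation models, and the utility integrations.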
2. Rethink Your Pitch
If you're building an AI startup, your infrastructure strategy is now as important as your product strategy. Don't just show VCs your slick UI; show them your plan for managing compute costs. Do you have a unique way to access cheaper, more efficient inference? That's a real moat.
3. Market Efficiency, Not Just Intelligence
The marketing narrative is about to shift. "Our AI is smarter" is table stakes. "Our AI uses 50% less energy to deliver the same result" is a powerful, cost-saving differentiator that CFOs will actually care about.
The digital world we've lived in for 20 years is slamming back to earth. The abstractions are cracking, and the brutal physics are showing through.
The biggest fortunes won't be made by those asking an AI to write a poem. They'll be made by those who understood that the AI runs on a power grid, and they got to the front of the line to plug in.