How are we going to power all this AI Stuff??

The Infrastructure Reality Nobody Wants to Talk About

So everyone’s obsessing over AI and who’s going to win the model wars, but there’s this massive problem that’s way more fundamental than anyone’s discussing. We can’t actually power all this stuff.

I keep hearing founders talk about their AI strategies like infrastructure is just some detail they’ll figure out later. But the math is getting really ugly really fast. Data centers that used to need 30 megawatts now need 300 or more. Some of these new AI facilities are pulling as much power as a small city.

Here’s what’s actually happening on the ground. Amazon just spent $650 million to buy a data center right next to a nuclear plant in Pennsylvania. Not because they wanted to make some environmental statement, but because they literally couldn’t get enough power anywhere else. Then the feds blocked them from getting more juice from the reactor. Six hundred and fifty million dollars, and they still can’t get the power they need.

Mark Zuckerberg basically said it out loud on a podcast recently. Energy constraints are now the biggest bottleneck to building AI data centers. Not chips, not talent, not even money. Power. The thing we’ve taken for granted since forever is now the limiting factor for the entire industry.

And it’s not just the big guys hitting these walls. The whole data center development process has gone completely sideways. Projects that used to take 18 months now take three years, assuming you can even get a spot in line. Transformers for power distribution? Eighteen-month wait times. Backup generators? Almost two years. These aren’t software delivery timelines we’re talking about; these are massive industrial components you can’t just spin up in AWS.

The really wild part is what’s happening in Northern Virginia, which is basically the data center capital of the world. They’ve got a 0.9% vacancy rate. That’s not a typo. Less than one percent of data center space is available, and the power grid literally can’t handle any more load. Dominion Energy had to tell new projects they can’t get connected until 2026, maybe longer.

But here’s where it gets interesting from a business perspective. The economics are starting to break down in ways that should make everyone nervous. Goldman Sachs is basically asking whether AI can solve a trillion dollar problem when we need to spend a trillion dollars just on the infrastructure to run it. That’s a fair question.

Look at what the hyperscalers are spending. Amazon’s planning $100 billion in infrastructure next year. Microsoft, $80 billion. Google, $75 billion. Meta, $65 billion. These numbers are so big they’re almost abstract, but they represent like 20% of these companies’ revenue going just to infrastructure. That’s not R&D or acquisitions or anything that directly creates products. Just keeping the lights on and the servers running.

The power situation is even weirder when you dig into it. AI training runs can pull 25 megawatts continuously for three months. That’s enough to power 20,000 homes, just for one training job. And these things create massive power fluctuations that the grid wasn’t designed for. Engineers are literally writing code with variables called things like “pytorch_no_powerplant_blowup” to manage the power load. That’s not a joke, that’s actual infrastructure management now.
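The homes-equivalent claim checks out with some quick math. Here's a sketch using the post's own figures plus one assumption: an average US household drawing roughly 1.2 kW (the actual figure varies by region and season).

```python
# Back-of-the-envelope check on the training-run numbers above.
# Assumptions: 25 MW continuous draw for ~3 months (90 days), and an
# average US household drawing roughly 1.2 kW (my assumption, not from the post).

POWER_MW = 25        # continuous draw of one training run
DAYS = 90            # ~three months
AVG_HOME_KW = 1.2    # rough average US household demand

energy_mwh = POWER_MW * 24 * DAYS       # total energy for one job
homes = POWER_MW * 1000 / AVG_HOME_KW   # homes powered continuously

print(f"{energy_mwh:,.0f} MWh over {DAYS} days")   # 54,000 MWh
print(f"equivalent to ~{homes:,.0f} homes")        # ~20,833 homes
```

So one training job consumes about 54,000 MWh, and 25 MW really does map to roughly 20,000 homes running continuously.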

Texas has 108 gigawatts of data centers wanting to connect to their grid. For context, the entire US peaks at about 745 gigawatts total. So Texas alone has requests for like 15% of total US power capacity, just for data centers. They approved 3 gigawatts last year out of those requests. Three out of 108.
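The ratios in that paragraph are worth double-checking, since the numbers are so extreme they sound made up. A quick sanity check using the post's figures:

```python
# Sanity-check the Texas queue math: 108 GW of requested data-center
# connections vs. a ~745 GW US peak, with only 3 GW approved last year.
TEXAS_QUEUE_GW = 108
US_PEAK_GW = 745
APPROVED_GW = 3

share_of_us_peak = TEXAS_QUEUE_GW / US_PEAK_GW   # ~0.145
approval_rate = APPROVED_GW / TEXAS_QUEUE_GW     # ~0.028

print(f"queue is {share_of_us_peak:.1%} of US peak demand")     # 14.5%
print(f"{approval_rate:.1%} of requested capacity was approved") # 2.8%
```

One state's data center queue alone is about 14.5% of the entire country's peak load, and under 3% of it got approved.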

What’s happening is we’re hitting the physical limits of how fast you can build power infrastructure. You can’t just download more electricity. Grid upgrades take five to seven years. Data centers take two. So even if you start planning today, you’re still going to be constrained for years.

And the costs are getting shifted in ways that nobody’s really talking about. Harvard Law did this analysis showing how utilities are basically subsidizing data center growth by raising rates on regular customers. Virginia residents are looking at $14 to $37 more per month on their electric bills by 2040, mostly because of data center growth.

The companies know this is a problem. Microsoft just signed a 20-year deal to restart Three Mile Island. Amazon’s buying up every piece of land next to power plants they can find. Google and Meta are throwing billions at nuclear startups betting on small modular reactors that won’t be ready until the 2030s. These aren’t the moves of companies that think the power situation is going to resolve itself.

But here’s what’s really interesting. Utility companies are starting to say the demand projections might be inflated. Project cancellation rates are running around 30%. Georgia Power just removed 5,400 megawatts from their demand forecasts in 18 months because projects kept getting canceled or delayed.

So you’ve got this weird situation where everyone’s planning for massive AI buildouts, but the infrastructure to support it doesn’t exist and won’t exist for years. And the economics are getting questionable enough that a bunch of projects are getting canceled even after companies put down massive deposits.

The edge computing story might be part of the solution, but it’s not like you can just distribute a ChatGPT-sized model across a bunch of small servers. Some workloads need the massive centralized compute. And the efficiency gains from new chip architectures and better software are real, but they’re not keeping pace with the growth in AI model sizes and usage.

I think what’s going to happen is we’re going to see a lot more geographic arbitrage. Companies are going to end up building in places they wouldn’t normally consider, just because that’s where the power is available. Rural locations with good grid connections are going to become really valuable. International locations with better power infrastructure are going to get a lot more interesting.

But the fundamental problem isn’t going away. We’ve got this exponentially growing demand for compute power hitting the linear constraints of physical infrastructure. And unlike most software problems, you can’t just architect your way around the laws of physics.

The companies that figure out how to be more power-efficient are going to have a huge advantage. The ones that can get creative about where they build and how they structure power deals are going to be the ones that actually get to deploy their AI at scale. Everyone else is going to be waiting in line.
