
We’re living through a technological revolution. Artificial intelligence is evolving at a breakneck pace, promising to reshape the U.S. economy, bolster national security, and unlock scientific breakthroughs we’ve only dreamed of. But behind the curtain of these increasingly capable digital minds lies a very physical, very power-hungry reality. What if America’s AI leadership isn’t threatened by a lack of innovation, but by a lack of electricity?
A recent, eye-opening report from the AI company Anthropic, titled “Build AI in America,” lays out a stark challenge. To maintain its edge in AI, the United States is on track to require at least 50 gigawatts (GW) of new electric capacity by 2028. To put that in perspective, U.S. electricity demand has grown by less than 1% per year for the last two decades. AI is now driving growth several times faster than that, and the U.S. energy ecosystem simply isn’t built for that kind of surge.
This isn’t just about keeping the lights on; it’s an economic and national security imperative. The question is no longer whether the U.S. needs this power, but whether it can possibly build the infrastructure to deliver it in time.
Understanding AI’s Two-Sided Energy Appetite
Not all AI work is created equal, and the energy demands are split across two very different tasks: training and inference.
AI Training is the herculean process of creating a new model. Think of it like building a universal library of knowledge from scratch. It involves feeding a model mountains of data so it can learn patterns, relationships, and concepts. This process is the foundation for every AI application, but it requires an astonishing, concentrated amount of energy. To stay at the forefront, leading AI developers expect that training a single, state-of-the-art model in 2028 will require a dedicated data center with a 5-gigawatt capacity. That’s the power output of several large nuclear power plants, all for a single training run.
AI Inference, on the other hand, is the act of using a model that has already been trained. This is like looking up a specific fact in the library. While a single inference task is far less energy-intensive than training, these tasks will be happening billions of times a day across the entire economy. To be effective, especially for real-time applications, this requires a widespread, distributed network of data centers located close to users to reduce lag time (latency).
The challenge is twofold: we need to build massive, centralized power hubs for training, and a broad, decentralized network for inference.
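A quick back-of-envelope calculation helps make these scales concrete. The sketch below uses the report’s 50 GW and 5 GW figures and assumes roughly 1.1 GW of capacity per large nuclear reactor (that per-reactor figure is an illustrative assumption, not a number from the report):

```python
# Back-of-envelope scale check for the report's headline numbers.
# Assumption (not from the report): a large U.S. nuclear reactor
# supplies roughly 1.1 GW of capacity.
GW_PER_LARGE_REACTOR = 1.1

new_capacity_needed_gw = 50   # report: new capacity needed by 2028
single_training_run_gw = 5    # report: one frontier training data center

reactors_for_buildout = new_capacity_needed_gw / GW_PER_LARGE_REACTOR
reactors_for_one_run = single_training_run_gw / GW_PER_LARGE_REACTOR

print(f"~{reactors_for_buildout:.0f} large reactors' worth of new capacity by 2028")
print(f"~{reactors_for_one_run:.1f} large reactors to feed a single training run")
```

In other words, under this rough assumption, a single frontier training run would occupy the output of four to five large reactors, and the full 2028 buildout would equal dozens of them.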
What’s Stopping Us? The Three-Headed Monster of Infrastructure Delays
So, why can’t the U.S. just build what it needs? It has the technical know-how and economic strength. The problem lies in a tangled web of regulatory, logistical, and supply chain bottlenecks that can delay critical projects for years, if not kill them entirely.
Building AI infrastructure boils down to three physical components:
- The Data Center Itself: Housing the computers and IT equipment.
- Electricity Generation: The power plants (solar, nuclear, gas, etc.) that produce the electricity.
- Transmission Infrastructure: The high-voltage lines that move power from the plant to the data center.
Each of these components faces a gauntlet of regulatory hurdles that can stretch timelines to the breaking point.
- Permitting Hell: Projects face years-long delays from a patchwork of federal, state, and local permits. Federal reviews under the National Environmental Policy Act (NEPA) can take years on their own. On top of that, state and local zoning processes are one of the most common reasons energy projects fail.
- Transmission Gridlock: Getting approval to build new transmission lines is a nightmare. State-level approvals for interstate lines are so arduous that the average completion time for a new project has been over 10 years. In 2023, the U.S. built just 55 miles of high-voltage lines, a fraction of the 1,700-mile annual average from a decade earlier.
- The Interconnection Queue: Before a new power plant can even connect to the grid, it has to get approval from the utility, a process that typically takes 4-6 years. Thousands of projects are currently stuck in this queue, waiting for their turn.
While the U.S. is mired in this red tape, its competitors are not. China, for instance, is building its own AI infrastructure at a blistering pace. Last year, China brought over 400 GW of new power capacity online, compared to just a few dozen gigawatts in the United States. While the U.S. shouldn’t adopt China’s methods, the stark contrast highlights the urgent need to streamline its own processes.
This is a complex web of challenges, and it’s easy to feel like the problem is insurmountable. The world of AI infrastructure is dense, but understanding it is critical to America’s future. If you want to stay ahead of the curve and get expert breakdowns of reports like this delivered straight to your inbox, join the “Everything in AI” newsletter. We cut through the jargon and deliver the insights that matter.
A Blueprint for Building AI in America
The situation is critical, but Anthropic’s report argues that it’s not hopeless. The executive branch has the tools to unlock America’s potential without even needing new laws from Congress. The plan is broken down into two main pillars.
Pillar 1: Building the Behemoths for Frontier AI Training
The first priority is enabling the massive, gigawatt-scale data centers needed to train the world’s most capable AI models. The strategy is a multi-pronged attack on the key bottlenecks.
- A Clever Land-Use Hack: One of the most brilliant proposals is to lease federal lands from the Department of Defense (DOD) or Department of Energy (DOE) for data center construction. This would allow developers to bypass the treacherous and time-consuming state and local zoning permits that kill so many projects. The project would still need an environmental review under NEPA, but the federal government has far more control over speeding up its own process.
- Slashing Federal Red Tape: The government can pre-approve the general environmental impact of data centers through a “programmatic review,” saving immense time on individual site applications. It can also create “categorical exclusions” for certain components, waiving further review entirely.
- Fast-Tracking Transmission: The DOE has powerful authorities to partner with private developers to plan and build transmission lines, effectively overriding lengthy state approval processes for critical projects. This would be used for targeted, mission-critical lines, not a complete federal takeover.
- Fixing the Connection Queue: The government can work with utilities on reforms, like allowing auctions for queue positions or using AI to speed up grid reliability tests. As a last resort for projects critical to national security, the report suggests the President could even use the Defense Production Act (DPA) to require a utility to provide a grid interconnection.
Pillar 2: Powering the People for Nationwide AI Innovation
Beyond the handful of giant training centers, the U.S. needs a broader buildout to support AI’s deployment across the entire country. This requires unlocking energy and data center construction everywhere.
- Accelerating All Forms of Power: This means speeding up permitting for all energy sources. There are 40 GW of accessible geothermal power, mostly on federal lands, that could be unlocked by fixing the “double NEPA” review process it currently faces. It also means continuing to streamline permitting for advanced nuclear and natural gas.
- Strengthening Critical Supply Chains: The U.S. faces a shortage of critical grid components like transformers, with lead times running up to three years. The report suggests creating strategic reserves of these components and offering loan guarantees to domestic manufacturers to help them scale up production.
- Investing in the Workforce: You can’t build infrastructure without skilled labor. The plan calls for expanding financial support for apprenticeship programs for electricians, engineers, and construction workers who will be on the front lines of this buildout.
The Clock is Ticking
Building the physical foundation for America’s AI leadership is a monumental task, riddled with complex regulatory, financial, and logistical hurdles. Every challenge—from permitting a solar farm to financing a transmission line to training an electrician—must be addressed to succeed.
The strategies outlined by Anthropic provide a bold but practical roadmap. They show that while the task is not easy, it is possible. The actions the U.S. takes today, from reforming permitting laws to investing in its workforce, will determine whether the next generation of world-changing AI is built in America or offshored to its competitors. The future is calling, and it’s asking for more power. The U.S. needs to be ready to answer.