Securing Full Stack U.S. Leadership in AI

“The U.S. possesses all components across the full AI stack, including advanced semiconductor design, frontier algorithms, and, of course, transformational applications. Now the computing power this stack requires is integral to advancing AI technology, and to safeguard America’s advantage, the Trump administration will ensure that the most powerful AI systems are built in the U.S. with American-designed and manufactured chips.”

– Vice President JD Vance, Paris, February 11, 2025

Today, the United States leads the world in generative AI. Its frontier labs set the pace in model development, U.S. firms control more than half of the world’s AI accelerators, and U.S. capital markets are poised to rapidly scale investment in data center infrastructure. Lasting U.S. advantage in AI, however, is not guaranteed.

The global race for compute is intensifying as competitors—adversaries and allies alike—are maneuvering to catch up. Beyond the recent breakthrough with DeepSeek, China is building massive data centers, expanding its power sector, and developing domestic AI chips to reduce Western dependence. France aims to leverage surplus nuclear power to attract data centers and support AI research centers across the country. Japan seeks to overcome space and energy constraints by powering highly efficient data centers with idled nuclear plants. The United Arab Emirates is creating AI-focused economic zones and incentives to attract international companies, with nuclear power as part of its strategy.

To stay ahead in the AI race, the United States should put meaningful distance between itself and competitors across all components of the AI stack: frontier models, data centers, advanced chips, and energy. These constitute the fundamentals of AI competitiveness. While all components are important, by far the most pressing need today is ensuring rapid access to the electricity needed to power large data centers. Simply put, failing to secure energy means surrendering U.S. leadership in AI. At stake for the United States are long-term growth and productivity, market security, and national security.

Findings

The central message of this CSIS Economic Security and Technology Department report is that, while the AI revolution is digital in nature, its binding constraint is physical infrastructure. The AI race will be won by whoever scales investment and delivers infrastructure the fastest, most reliably, and in ways that generate maximum positive spillovers for the broader economy.

Our team has developed a range of scenarios to assess the semiconductor, energy, and capital needs for leadership across the AI stack. These scenarios correspond to the surges in business investment seen during the dot-com boom, the PC revolution, and the Second Industrial Revolution. Given the nearly inexhaustible demand for compute for both model training and inference (i.e., AI applications), our analysis focuses on the supply side: the delivery of compute to the AI sector. A long-term research priority should be to forecast the economy-wide trajectory of compute demand, based on the sectoral adoption of AI, and to identify policy options that reduce sectoral barriers to AI uptake, innovation, and growth.

Based on this supply-constrained framework, we estimate that data center expansion will require 80–160 million leading-edge GPUs (measured in H100 equivalents) and 40–90 gigawatts (GW) of new energy demand, at a total capital expenditure of $2 trillion, by 2030. We explore the policy implications of these findings, which are summarized in the table below:
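As a rough sanity check on these figures, the stated GPU and energy ranges can be back-calculated into an implied average power draw per chip. The per-GPU wattage below is not an independent estimate from the report; it is simply derived from the 80–160 million GPU and 40–90 GW ranges above, under the assumption that the GW figures represent total facility-level load.

```python
def implied_watts_per_gpu(total_gw: float, gpu_count_millions: float) -> float:
    """Average facility-level watts per GPU implied by a gigawatt total."""
    return total_gw * 1e9 / (gpu_count_millions * 1e6)

# Low end of the report's ranges: 40 GW spread across 80M H100 equivalents.
low = implied_watts_per_gpu(40, 80)
# High end: 90 GW spread across 160M H100 equivalents.
high = implied_watts_per_gpu(90, 160)

print(f"implied average power: {low:.0f}-{high:.1f} W per H100 equivalent")
# -> implied average power: 500-562.5 W per H100 equivalent
```

The implied 500–560 W per chip sits below an H100's peak rated power plus cooling overhead, which is consistent with fleet-average utilization assumptions rather than all GPUs running at full load simultaneously.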

https://www.csis.org/analysis/securing-full-stack-us-leadership-ai