Ranked: The Companies That Sell the Most AI Chips
Key Takeaways
- Nvidia supplied nearly two-thirds of AI compute capacity in Q4 2025, far ahead of all rivals combined.
- Google ranked a distant second, with less than one-third of Nvidia’s output.
- AMD, Amazon, and Huawei form a smaller second tier, highlighting how concentrated AI compute remains.
Nvidia’s grip on the AI boom remains overwhelming.
In Q4 2025, the company shipped nearly two-thirds of all measured AI compute capacity—more than its closest competitors combined. While Google, Amazon, and others are scaling up their own chips, the gap between first and second place remains striking.
This visualization, part of Visual Capitalist’s AI Week sponsored by Terzo, ranks the world’s largest AI chip designers using data from Epoch AI’s Chip Sales database, which estimates compute capacity across leading architectures.
The Biggest AI Chip Sellers
Even as more companies entered the AI chip market, one still towered over the rest in Q4 2025: Nvidia.
To make different chips comparable, the data is converted into “H100 equivalents”—a standardized measure based on Nvidia’s flagship AI GPU.
| Rank | Manufacturer | Q4 2025 Chip Sales (H100 equivalents) |
|---|---|---|
| 1 | Nvidia | 2,957,362 |
| 2 | Google | 976,313 |
| 3 | AMD | 226,485 |
| 4 | Amazon | 221,354 |
| 5 | Huawei | 131,964 |
Nvidia didn’t just lead—it dominated. Its 2.96 million H100-equivalent shipments in Q4 2025 exceeded the combined total of every other company in this ranking.
AMD (226k) and Amazon (221k) formed a much smaller second tier, followed by Huawei (132k). Together, the rankings show that while the market is broadening, AI compute shipments remain highly concentrated at the top.
As demand for AI infrastructure accelerates, the key question is whether competitors can meaningfully close this gap or whether Nvidia’s early lead will translate into long-term dominance of the AI stack.
What H100 Equivalent Compute Measures
This chart measures compute capacity, not units sold or revenue. Epoch AI defines H100e as H100-equivalent compute capacity, converting each chip’s peak dense 8-bit operations into the equivalent number of Nvidia H100 GPUs.
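The conversion described above can be sketched in a few lines. This is a minimal illustration, not Epoch AI's actual code; the H100 baseline figure used here (~1,979 dense 8-bit TOPS, the commonly cited spec for the SXM variant) is an assumption for illustration and does not come from the article.

```python
# Illustrative sketch of the H100-equivalent (H100e) conversion:
# a chip's peak dense 8-bit throughput divided by the H100's,
# multiplied by the number of units shipped.

# Assumed baseline: ~1,979 dense 8-bit TOPS for an H100 SXM (illustrative, not from the article).
H100_DENSE_8BIT_TOPS = 1979.0

def h100_equivalents(chip_peak_8bit_tops: float, units_shipped: int) -> float:
    """Convert shipments of a given chip into H100-equivalent units."""
    return (chip_peak_8bit_tops / H100_DENSE_8BIT_TOPS) * units_shipped

# A chip with half the H100's peak 8-bit throughput counts as 0.5 H100e per unit,
# so 200 such chips would register as 100 H100 equivalents.
print(h100_equivalents(989.5, 200))
```

By construction, a hypothetical chip matching the H100's peak 8-bit throughput contributes exactly one H100e per unit, which is what makes the measure easy to read across different chip families.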
Epoch AI uses this measure because it is more intuitive than citing raw operations per second across different chip families.
Still, the firm notes that H100e is an imperfect proxy, since real-world performance also depends on factors like memory bandwidth, software ecosystems, and how chips are networked into servers and clusters.
Inside the Methodology
These figures are estimates rather than exact reported sales. Epoch AI says chipmakers do not consistently disclose precise volumes, and most of its uncertainty ranges span roughly a factor of 2x around the median estimate.
The dataset also does not track all AI chip production. Instead, it focuses on the largest designers of dedicated AI accelerators—Nvidia, Google, Amazon, AMD, and Huawei—which Epoch AI says account for the large majority of global AI compute capacity.
https://www.visualcapitalist.com/ranked-the-companies-that-sell-the-most-ai-chips/


