Nvidia CEO Warns of China’s Rapid AI Infrastructure Build as Open‑Source Models Capture 30% of Global Usage

Key Points
- Nvidia CEO Jensen Huang says China can build AI data centers far faster than the United States.
- China’s energy capacity is described as twice that of the United States, supporting large AI workloads.
- OpenRouter and Andreessen Horowitz report Chinese open‑source LLMs now account for about 30% of global AI token usage.
- Key Chinese models include DeepSeek V3, Alibaba Qwen, and Moonshot AI Kimi K2.
- Nvidia claims its chip technology remains generations ahead of Chinese competitors.
- The rapid growth of Chinese AI infrastructure and models intensifies the global AI competition.

Nvidia chief executive Jensen Huang cautioned that China can construct AI data centers and even hospitals far faster than the United States, citing the country's expansive energy resources and swift construction capabilities. At the same time, a report from OpenRouter and Andreessen Horowitz shows Chinese open‑source large language models now account for roughly 30% of global AI token usage, up from just over 1% a year earlier. While Huang affirmed Nvidia’s chip technology remains ahead of China, the rapid growth of Chinese AI models and the nation’s infrastructure advantages highlight an intensifying competitive landscape.
China’s Rapid AI Infrastructure Development
Nvidia CEO Jensen Huang warned that China can build AI‑related facilities dramatically faster than the United States. He contrasted the roughly three‑year timeline for a U.S. data center, from groundbreaking to a fully operational AI supercomputer, with China’s capacity to build a hospital in a weekend. Huang emphasized that China’s energy capacity, which he described as “twice as much” as that of the United States, together with its larger economy, gives the country a clear advantage in powering large‑scale AI workloads.
China’s Growing Share of Global AI Usage
A separate analysis compiled by OpenRouter in partnership with venture‑capital firm Andreessen Horowitz found that Chinese open‑source large language models now generate roughly 30% of global AI token usage. The study examined 100 trillion tokens, the basic units of data processed by AI models. While closed‑source Western models such as ChatGPT still account for the remaining 70%, the surge in Chinese open‑source activity marks a steep rise from just over 1% a year earlier.
Key Players in China’s Open‑Source AI Ecosystem
The report highlighted several Chinese models driving this growth, including DeepSeek’s V3 series, Alibaba’s Qwen models, and Moonshot AI’s Kimi K2. Together, these models have made Chinese the second‑largest language by prompt token volume worldwide, trailing only English. Open‑source usage from China now averages about 13% of weekly token volume, nearly matching the 13.7% share from the rest of the world’s open‑source community.
Nvidia’s Position and Outlook
Despite Huang’s concerns, he asserted that Nvidia remains “generations ahead” of China in AI chip technology. He cautioned against complacency, noting that the rapid infrastructure and model development in China could reshape the competitive dynamics of the AI industry. Huang also expressed optimism about U.S. policy efforts aimed at boosting domestic AI investment and manufacturing.
Implications for the Global AI Race
The convergence of China’s fast‑track construction capabilities, expanding energy capacity, and rising open‑source model usage creates a formidable challenge for U.S. firms and policymakers. While Nvidia’s chip leadership offers a counterbalance, the accelerating pace of Chinese AI development suggests a continued intensification of the global AI competition.