Nvidia’s CEO Jensen Huang made a pointed request to TSMC during a rare public appearance in Taiwan this weekend, framing the conversation in terms that left little room for ambiguity. The company’s relentless AI-driven growth—fueled by record sales of its Blackwell and H100 GPUs—has created a bottleneck at the world’s largest semiconductor foundry, where Huang now finds himself in the unusual position of needing more capacity than TSMC can currently deliver.

The exchange took place outside a Taipei restaurant hosting what Huang playfully dubbed a ‘trillion-dollar dinner,’ a nod to the combined market caps of the tech giants in attendance. With TSMC’s C.C. Wei present but not engaging with reporters, Huang’s remarks carried weight: ‘TSMC needs to work very hard this year because I need a lot of wafers.’ The quip, delivered with a laugh, underscored a reality few in the industry are ignoring—the AI boom has turned Nvidia into a voracious consumer of TSMC’s most advanced nodes.

Huang’s urgency reflects a broader industry shift. TSMC’s role as the backbone of global chip production has never been more critical, yet its packaging and testing facilities are under strain. While the company has already committed over $165 billion to expanding U.S. manufacturing—including a massive Arizona hub—demand for AI accelerators, data center GPUs, and even consumer-grade chips is outpacing even its accelerated timelines. Analysts suggest TSMC’s capacity may need to more than double over the next decade to absorb the influx of orders, particularly from Nvidia’s data center and AI divisions.


Memory Crunch: The AI Effect on Consumer Hardware

The pressure isn’t limited to wafers. Huang acknowledged the broader supply chain strain, particularly in memory, where AI servers have siphoned off DRAM and GDDR6 supplies that once flowed to gaming PCs and laptops. ‘We need a lot of memory this year,’ he noted, hinting at no immediate relief for consumers facing inflated RAM and GPU prices. The so-called ‘RAMpocalypse’ shows no signs of abating, and no major price corrections are in sight as data centers continue to prioritize AI workloads over traditional markets.

For Nvidia, the outlook is brighter than ever. Despite geopolitical tensions—including years of restricted sales to China—the company has secured conditional approvals for high-end AI GPUs like the H200, with reports indicating Chinese firms such as DeepSeek now in line for shipments. This marks a potential thaw in Nvidia’s long-standing export limitations, though the broader impact on consumer hardware remains unclear. With AI driving nearly half of Nvidia’s revenue, the company has little incentive to divert resources toward balancing the scales for gamers and creators.

What This Means for Gamers and Tech Buyers

The gap between enterprise and consumer priorities is widening. While TSMC’s expansion in the U.S. and Taiwan—including plans for four new advanced packaging plants—could eventually ease some bottlenecks, the timeline for trickle-down benefits is uncertain. For now, those hunting for GPUs or memory upgrades should brace for sustained high prices, as Nvidia’s AI juggernaut ensures the focus remains firmly on data centers.

The irony isn’t lost on industry watchers: Huang’s ‘trillion-dollar dinner’ guests may have enjoyed a lavish evening, but for the average tech buyer, the year ahead promises more of the same—long waits, premium pricing, and the occasional reminder that the chips are stacked against them.