The AI economy has driven memory chips to the forefront of technological value, with the Philadelphia Stock Exchange Semiconductor Index soaring 60% in just six weeks. Micron, a leading memory chipmaker, saw its stock surge 38% last week, marking its best performance since 2008. Retail traders have joined the frenzy, pushing trading volumes to their highest levels in a year.

The Memory Market Under Pressure

Every computing device—whether a laptop, smartphone, or AI server—relies on memory chips to function. These chips store data and instructions, enabling processors to operate efficiently. While DRAM (dynamic random-access memory) has traditionally been the standard for consumer devices, the AI boom has created unprecedented demand for high-bandwidth memory (HBM), used in AI accelerators like Nvidia’s GPUs. AI workloads require far more memory than traditional computing: a single AI server needs eight to ten times the DRAM of a standard server.
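To put that multiplier in perspective, here is a rough back-of-the-envelope sketch. The 512 GB baseline for a standard server is an illustrative assumption, not a sourced figure; only the eight-to-ten-times multiplier comes from the reporting above.

```python
# Sketch of the DRAM multiplier described above.
# STANDARD_SERVER_DRAM_GB is an assumed baseline for illustration;
# only the 8x-10x range is taken from the article.

STANDARD_SERVER_DRAM_GB = 512  # hypothetical typical enterprise server

low = 8 * STANDARD_SERVER_DRAM_GB    # lower bound of the range
high = 10 * STANDARD_SERVER_DRAM_GB  # upper bound of the range

print(f"AI server DRAM estimate: {low} GB to {high} GB "
      f"({low // 1024} TB to {high // 1024} TB)")
```

Under that assumed baseline, a single AI server would consume roughly 4 to 5 TB of DRAM—capacity that would otherwise supply dozens of laptops or phones, which is why the demand shift squeezes consumer devices so hard.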

‘Anytime people show me these curves that just go to the sky with no end, that never continues forever,’ said Willy Shih, a Harvard Business School professor and semiconductor expert. ‘This too will pass.’

AI’s Impact on Pricing and Supply

Historically, memory chips followed a predictable trend of becoming more affordable and efficient over time, akin to Moore’s Law. However, AI-driven demand has disrupted this pattern. DRAM contract prices are projected to rise 58% to 63% in Q2 2024, the steepest increase in a decade. Samsung, the world’s largest memory maker, reported a 90% price hike in Q1 alone. Manufacturers have shifted production toward HBM, the most profitable memory type, leaving consumer electronics companies scrambling for dwindling DRAM supplies.

The competition for memory has already begun to impact consumer markets. Nintendo raised prices for its Switch 2 console, Sony increased PlayStation 5 prices by up to $150, and prices of entry-level 5G phones in India have risen 30% since October. Analysts predict a 31% year-over-year decline in global smartphone shipments over the next 12 months, a contraction unprecedented outside the pandemic.

Nvidia’s Role in the Crisis

Adding to the supply crunch, Nvidia announced plans to use low-power DRAM (LPDDR5) for its inference GPUs by 2026, citing its superior power efficiency. This decision pits Nvidia against consumer electronics giants like Apple and Samsung in the battle for memory resources. The shift has already doubled server memory prices, further straining the market. As the AI boom continues to reshape global supply chains, the ripple effects on consumer industries and American workers remain a critical concern.