Brace yourself for a memory crunch: the shortage is turning out to be a bigger headache than anyone anticipated. Expect significant price hikes for DRAM and NAND flash memory in the first quarter of 2026, driven by a perfect storm of AI-hungry hyperscalers, cloud service providers (CSPs), and an unexpected surge in PC shipments.
In early January, industry analysts at TrendForce warned of a 55-60% sequential increase in DRAM contract prices, with NAND flash memory, crucial for solid-state storage, expected to rise 33-38%. It has since gotten worse: TrendForce's latest estimates forecast a whopping 90-95% surge in DRAM contract pricing and a 55-60% jump in NAND prices during the current quarter.
AI's insatiable demand is a major culprit, but there's more. According to TrendForce, higher-than-expected PC shipments in Q4 2025 have exacerbated the shortage. OEMs like Dell and HP, known for their bulk memory purchases a year in advance, have kept pre-build pricing steady, but as their inventories deplete and they restock, system prices are set to soar.
TrendForce predicts that PC DRAM prices will nearly double from the holiday quarter, with similarly steep increases for the LPDDR memory used in notebooks, other soldered-RAM systems, and smartphones. LPDDR4x and LPDDR5x pricing is expected to climb a staggering 90% QoQ, the steepest increase in its history.
And here's the part most people miss: how large language models (LLMs) drive memory usage. During LLM inference, the model's conversational state is stored in the key-value (KV) cache, which acts as its short-term memory. This cache is computed during active use and typically held in HBM. To cut compute requirements and keep responses snappy, inference providers increasingly retain these precomputed KV caches, but that comes at a cost: a massive memory footprint.
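To see why those footprints add up, here is a minimal back-of-the-envelope sketch of KV-cache size for a transformer. The formula (2 tensors per layer, keys and values, each of size heads × head_dim per token) is standard; the specific model dimensions below are assumptions for illustration, loosely modeled on a large grouped-query-attention model, not figures from TrendForce or any vendor.

```python
def kv_cache_bytes(num_layers: int, num_kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int = 2, batch: int = 1) -> int:
    """Estimate KV-cache size in bytes for one transformer model.

    The factor of 2 covers the separate key and value tensors kept
    per layer; dtype_bytes=2 assumes fp16/bf16 storage.
    """
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * dtype_bytes * batch

# Assumed dimensions for a ~70B-class GQA model: 80 layers,
# 8 KV heads, head_dim 128, fp16, one 128k-token conversation.
gib = kv_cache_bytes(80, 8, 128, seq_len=128_000) / 2**30
print(f"{gib:.1f} GiB")  # prints "39.1 GiB"
```

A single long conversation can therefore occupy tens of gigabytes of cache; multiply by thousands of concurrent users and it is easy to see why providers spill these caches from HBM out to DRAM and SSDs.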
NAND flash pricing is also set to surge as hyperscalers and CSPs rush to deploy SSDs for AI inference workloads. TrendForce highlights the unprecedented demand for high-performance storage, driven by the growth of AI applications. Leading North American CSPs have been rapidly increasing their procurement, leading to a surge in enterprise SSD orders.
As AI infrastructure shifts from training to inference-dominated workloads, demand for additional DRAM and storage is inevitable. But here's the catch: while memory vendors have the funds for new fabs, those facilities will take years to come online. So if you were hoping for an end to the memory winter, it seems we're in for a prolonged freeze: DRAM prices are expected to remain high through 2028, with no sign of returning to normal anytime soon.
So, what's your take on this memory crisis? Do you think AI's impact on memory usage is overstated, or is it a legitimate concern? Share your thoughts in the comments; we'd love to hear your perspective on this memory-intensive future!