Samsung’s memory chief, Kim Jaejune, warned in the company’s full earnings report on April 30, 2026, that “significant shortages” across memory products will likely continue through at least 2027. The company says that demand fulfillment rates have dropped to record lows as customers rush to secure future supply, reports SCMP. This warning echoes comments made by rival SK Hynix during its earnings call just a week earlier.
Together with U.S.-based Micron Technology, Samsung and SK Hynix control over 90% of the global DRAM market. When two of the world’s three biggest memory suppliers both warn of multi-year shortages, it’s reasonable to be concerned.
The shortages are mainly caused by the huge demand for artificial intelligence infrastructure. Modern AI systems need massive amounts of high-speed memory to constantly feed data to GPUs and accelerators. At the heart of this demand surge is HBM (high-bandwidth memory), a type of DRAM that is stacked vertically to provide extremely high speeds while staying very close to the processors.
HBM has become crucial for AI accelerators. However, this technology is hard and expensive to make, requiring advanced stacking, precise bonding, and complex packaging methods. As a result, supply is limited, and demand is growing faster than manufacturers can build new capacity.
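To illustrate why that wide, vertically stacked interface matters, here is a back-of-envelope bandwidth estimate. The figures are typical published HBM3E-class specs (a 1024-bit interface and roughly 9.2 Gb/s per pin), not numbers from the article, so treat this as a rough sketch:

```python
# Back-of-envelope HBM bandwidth estimate. The 1024-bit bus width and
# ~9.2 Gb/s per-pin speed are typical HBM3E-class figures (assumed for
# illustration): a very wide interface multiplies a moderate per-pin
# rate into enormous aggregate bandwidth per stack.

BUS_WIDTH_BITS = 1024   # interface width per HBM stack (bits)
PIN_SPEED_GBPS = 9.2    # per-pin data rate (gigabits per second)

def stack_bandwidth_tbps(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Theoretical per-stack bandwidth in terabytes per second."""
    # bits/s -> bytes/s (divide by 8) -> TB/s (divide by 1000)
    return bus_width_bits * pin_speed_gbps / 8 / 1000

bw = stack_bandwidth_tbps(BUS_WIDTH_BITS, PIN_SPEED_GBPS)
print(f"~{bw:.2f} TB/s per stack")  # ~1.18 TB/s
```

Multiply that by the half-dozen or more stacks packaged around a modern AI accelerator and the appeal, and the manufacturing difficulty, of HBM becomes clear.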
While HBM demand is the primary driver of the shortage, its effects are starting to spread to the wider memory market. Because HBM is a type of DRAM, manufacturers are increasingly shifting their production capacity, engineering resources, and investments toward these high-profit AI memory products. This shift risks tightening the supply for more common DRAM products used in servers, PCs, and mobile devices. Demand for enterprise SSDs is also rising as AI data centers need huge storage systems alongside their computing hardware.
Ironically, the industry is also looking for alternatives because current memory designs use a lot of power. We recently reported on efforts to develop next-generation memory technologies like 3D X-DRAM and ZAM (Z-Angle Memory), which aim to use less power and overcome scaling limits. Yet, despite massive investment into future alternatives, demand for the existing memory technologies remains overwhelming.
Samsung reportedly stated that some customers have already secured their memory supply through 2027. Earlier this year, SK Group chairman Chey Tae-won suggested that AI-related memory demand pressure might even last until 2030. These shortages aren't necessarily bad news for the memory companies themselves. Samsung's semiconductor division reported 53.7 trillion won ($36.1 billion) in operating profit during the first quarter of 2026, reportedly accounting for about 94% of the company's total operating profit. SK Hynix, for its part, posted revenue of $35.5 billion and an operating profit of 37.6 trillion won ($27.8 billion), largely fueled by strong HBM sales for AI infrastructure.
Part of the problem is cyclical. The memory industry has historically swung between periods of oversupply and shortage. However, analysts increasingly believe the current cycle is different because AI infrastructure build-outs are consuming hardware at unprecedented rates.
To deal with this crisis, companies are aggressively expanding production capacity and increasing investment in advanced packaging and memory fabrication. According to the Korea Times, recent regulatory filings show that Samsung Electronics invested 465.4 billion won in its Xi’an memory chip plant in 2025, a 67.5% increase from the previous year. SK Hynix also significantly increased spending, investing 581.1 billion won into its Wuxi facilities and 440.6 billion won into its Dalian operations.
However, new semiconductor fabs and advanced memory packaging facilities take years to build and bring online, which means supply growth cannot keep pace with AI-driven demand. The memory crunch thus joins a growing list of resource shortages emerging from the AI boom.
GPU shortages have already become severe in parts of the industry. Earlier this month, we reported Intel’s confirmation that demand had become so intense that customers were even buying chips that might previously have been thrown away or considered low-value.
Power is becoming another major bottleneck. AI data centers are using enormous amounts of electricity, forcing technology companies to look for increasingly unusual energy solutions. Earlier this month, Meta Platforms supported plans for space-based solar power systems that could theoretically send solar energy back to Earth to help power future AI infrastructure needs.