DRAM revenue will surge to $98 billion this year, up 88%

After a difficult downturn, the memory industry is expected to set revenue records in 2025, driven by demand from high-performance computing and generative artificial intelligence applications.

The boom in generative artificial intelligence has sharply increased data-center demand for advanced DDR5 DRAM and HBM, and has also lifted demand for the enterprise solid-state drives that support AI servers. In addition, the first smartphones and personal computers with on-device generative AI features are entering the market. Because of the size of large language models, these devices need substantially more memory and storage, which will further drive demand in the mobile and consumer segments.

The memory industry is now recovering faster than previously expected, and revenue forecasts for the coming years have risen significantly.

The latest data from Yole shows DRAM revenue surging to $98 billion in 2024 (up 88% year on year) and NAND revenue surging to $68 billion (up 74% year on year). Both figures are expected to keep rising and reach new peaks in 2025, at $137 billion for DRAM and $83 billion for NAND. Driven by strong demand, especially from data centers, revenues are expected to remain at high levels through 2029, with DRAM reaching $134 billion and NAND $93 billion, corresponding to 2023-2029 CAGRs of 17% and 16%, respectively.
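As a sanity check on those growth rates, the CAGR implied by two endpoint revenues is simply (ending value / starting value)^(1/n) - 1 over n years. The short derivation below infers the 2023 baselines from the 2024 figures quoted above (an assumption, since Yole's exact 2023 numbers are not given here):

```latex
% CAGR from endpoint revenues over n years: (V_end / V_start)^{1/n} - 1
% 2023 baselines inferred from the 2024 figures above (an assumption):
%   DRAM: 98 / 1.88 ~ $52.1B,   NAND: 68 / 1.74 ~ $39.1B
\[
\mathrm{CAGR}_{\mathrm{DRAM}} = \left(\tfrac{134}{52.1}\right)^{1/6} - 1 \approx 17\%,
\qquad
\mathrm{CAGR}_{\mathrm{NAND}} = \left(\tfrac{93}{39.1}\right)^{1/6} - 1 \approx 16\%
\]
```

Both results line up with the 17% and 16% CAGRs quoted above.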

In addition, surging demand for HBM has driven a significant increase in overall DRAM demand.

HBM (High Bandwidth Memory) is a type of DRAM that offers high bandwidth, high capacity, low latency, and low power consumption. It vertically stacks multiple DRAM dies and packages the stack alongside the GPU, increasing bandwidth and expanding memory capacity so that larger models and more parameters sit closer to the compute cores, thereby reducing the latency introduced by memory and storage. In the training and inference of large AI models, HBM accelerates data processing, making it well suited to high-performance computing workloads such as ChatGPT.
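To make the bandwidth advantage concrete: a stack's bandwidth is just its interface width times the per-pin data rate. The sketch below uses the standard 1024-bit HBM interface and the commonly quoted 6.4 Gb/s per-pin rate of HBM3; the six-stack total is illustrative, as actual GPUs vary in stack count and speed bin:

```latex
% Per-stack bandwidth = interface width x per-pin data rate
% HBM3: 1024-bit interface, 6.4 Gb/s per pin (commonly quoted; varies by part)
\[
BW_{\mathrm{HBM3}}
  = \frac{1024 \times 6.4\ \mathrm{Gb/s}}{8\ \mathrm{bits/byte}}
  \approx 819\ \mathrm{GB/s\ per\ stack}
\]
% An accelerator carrying six such stacks would see roughly
% 6 x 819 GB/s ~ 4.9 TB/s of aggregate memory bandwidth.
```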

From the perspective of technology iteration, since the world's first TSV (Through-Silicon Via) HBM product was introduced in 2014, the technology has evolved from HBM through HBM2 and HBM2E to the fourth-generation HBM3 and fifth-generation HBM3E. In 2023, the mainstream product in the HBM market was HBM2E, the specification used by NVIDIA's A100/A800, AMD's MI200, and most CSPs' in-house accelerator chips. To keep pace with evolving AI accelerator requirements, memory manufacturers launched HBM3E products in 2024, and HBM3 and HBM3E are expected to become the market mainstream this year. TrendForce expects the higher-specification HBM4 to launch in 2026.
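For context, the commonly quoted per-pin data rates across these generations, all on the same 1024-bit interface, work out to the following approximate per-stack bandwidths (figures vary by vendor and speed bin, so treat them as indicative):

```latex
% Approximate per-stack figures, 1024-bit interface throughout
\begin{tabular}{lcc}
\hline
Generation & Per-pin rate (Gb/s) & Per-stack bandwidth (GB/s) \\
\hline
HBM   & 1.0        & 128         \\
HBM2  & 2.0--2.4   & 256--307    \\
HBM2E & up to 3.6  & up to 461   \\
HBM3  & 6.4        & 819         \\
HBM3E & up to 9.6  & up to 1{,}229 \\
\hline
\end{tabular}
```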

With surging demand from AI and related applications, the HBM market is also growing rapidly, and several consulting agencies expect a period of fast development. One forecast puts global HBM revenue at $4.976 billion in 2025, up 148.2% from 2023; another projects the market growing from about $2.52 billion in 2024 to $7.95 billion in 2029, a CAGR of 25.86% over the forecast period. The global HBM market is set to keep expanding, with production capacity and HBM's share of the overall memory market continuing to rise.

At present, the global HBM market is highly concentrated, dominated by three memory manufacturers: SK Hynix, Samsung Electronics, and Micron, with 2022 market shares of roughly 50%, 40%, and 10%, respectively. In 2023, SK Hynix's share reached 53%, while Samsung Electronics and Micron held 38% and 9%, respectively.

SK Hynix's early exploration of memory for artificial intelligence made it the pioneer in HBM technology. Since launching its first HBM product in 2014, the company has continuously deepened its investment, relying on innovations such as Advanced MR-MUF and HKMG. Its HBM products deliver significant performance advantages and meet the explosive demand for high-bandwidth memory in AI, giving it the leading position in the global market.

In the first quarter of 2024, SK Hynix launched its HBM3E product and achieved a yield of nearly 80%. The company is now working with TSMC on the next-generation HBM4 and plans to bring its launch forward to 2025 to meet rapid market growth. SK Hynix's ambitions do not stop there: on June 30 it announced plans to invest up to 82 trillion won in HBM R&D and production through 2028, an investment that underscores its long-term commitment to AI memory.

In addition, SK Hynix plans to invest $3.87 billion to build an advanced packaging plant in West Lafayette, Indiana, focused on AI memory products. Production is expected to begin in 2028, making the plant an important part of the company's global strategy.

SK Hynix also recently announced that it expects shipments of its high-performance HBM to more than double next year. The projection reflects both the company's continued innovation in memory solutions and the direction of future data-center technology: thanks to its high-speed transfers and low latency, HBM has become central to GPUs and AI computing, and one of the key enablers of high-performance computing and data-center infrastructure.

As a leading global memory chip maker, Samsung Electronics holds a strong position in both technological innovation and market influence. Although its HBM market share trails SK Hynix's, Samsung has held firmly to second place with its own technology path and market strategy.

In February 2024, Samsung launched the industry's first 12-layer HBM3E product, with markedly improved performance that demonstrates its depth in stacking technology and materials science. Samsung's TC-NCF technology lets the new product deliver a significant performance gain while maintaining high consistency. The company also plans to launch a 16-layer HBM4 product in 2026 to further consolidate its position in the high-end memory market.

To narrow the gap with SK Hynix, Samsung is making a series of strategic adjustments, including reorganizing its HBM R&D team and expanding capacity on a large scale. Under the plan, HBM output in 2024 will rise to nearly three times the 2023 level, and will reach 13.8 times and 23.1 times that level by 2026 and 2028, respectively.

Micron Technology has taken a differentiated path, skipping HBM3 and moving directly to HBM3E development. The strategy has made Micron the industry leader in power efficiency, and its HBM3E products began shipping in the second quarter of 2024, supporting NVIDIA's H200 Tensor Core GPU systems.

The rapid growth of the AI market has brought substantial returns to Micron, which expects HBM3E revenue of several hundred million dollars in fiscal 2024 and over one billion dollars in 2025. Micron has completed sampling of its 12-layer HBM3E and plans volume production in 2025, with an HBM4 product to follow between 2026 and 2027.
