SK Hynix HBM is sold out, and memory semiconductors are on the rise
Author: hkw6c68a0 | Published: 2024-02-27

Driven by artificial intelligence and high-performance computing (HPC), HBM (High Bandwidth Memory), regarded as one of the pillars of AI computing, has grown rapidly over the past two years, with both shipments and prices rising. According to TrendForce data, the global HBM market is expected to reach approximately $15 billion by 2025, a growth rate of more than 50%.

As NVIDIA's high-bandwidth memory partner, SK Hynix holds a leading position in the HBM market. Recently, Kim Ki-tae, Vice President of SK Hynix, stated that the company's HBM for this year has already sold out and that preparations have begun for 2025. Although external uncertainties remain, the memory semiconductor industry has begun to trend upward this year.

At the market level, demand from large global technology customers is recovering. As artificial intelligence spreads into more application areas, including personal computers, smartphones, and other devices with built-in AI, demand for products such as DDR5, LPDDR5T, and HBM3E is expected to increase.

Data show that SK Hynix is expanding investment in HBM production facilities to meet the growing demand for high-performance AI products. The company plans to more than double its spending on through-silicon via (TSV) facilities compared with 2023, aiming to double production capacity, and to begin producing its fifth-generation high-bandwidth memory product, HBM3E, in the first half of 2024.

In addition, SK Hynix is forming an AI alliance with TSMC. According to industry insiders, SK Hynix and TSMC have adopted a "One Team" strategy that includes collaborating on the development of the sixth-generation HBM, HBM4.

Compared with earlier HBM generations, the HBM4 stack will move away from the 1024-bit interface design in use since 2015 and adopt a 2048-bit interface, addressing the "wide but slow" nature of the 1024-bit memory interface. This doubling of interface width is the biggest change since HBM memory technology was introduced. Because of the complex wiring the 2048-bit interface requires on the package, TSMC's more advanced packaging technology may be needed to validate the layout and speed of the HBM4 chips.
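
As a rough, back-of-the-envelope illustration (not from the article; the per-pin data rates below are assumptions chosen only for the arithmetic), a stack's peak bandwidth is approximately its interface width multiplied by the per-pin data rate, so doubling the width from 1024 to 2048 bits doubles bandwidth at the same pin speed:

def stack_bandwidth_gb_s(interface_bits: int, pin_rate_gbps: float) -> float:
    # Peak per-stack bandwidth in GB/s = (number of pins * Gb/s per pin) / 8 bits per byte
    return interface_bits * pin_rate_gbps / 8

# 1024-bit stack at an assumed 9.2 Gb/s per pin (roughly HBM3E-class): ~1178 GB/s
print(stack_bandwidth_gb_s(1024, 9.2))
# Hypothetical 2048-bit HBM4 stack at the same assumed pin rate: ~2355 GB/s
print(stack_bandwidth_gb_s(2048, 9.2))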

There are also reports that HBM4 chips will be used in the second iteration of NVIDIA's Blackwell-architecture GPUs.