Samsung Unveils Highest-Capacity Chip to Boost AI Tech Leadership

Samsung Electronics has announced the development of HBM3E 12H, the industry's first 12-stack HBM3E DRAM, as it moves to stay ahead of competitors in the expanding artificial intelligence chip sector. The world's largest memory chip maker has begun providing samples of HBM3E 12H to customers, with mass production scheduled for the first half of this year.

"The industry's AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need", Bae Yong-cheol, Executive Vice President of Memory Product Planning at Samsung Electronics said. "This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era", he said.

High Bandwidth Memory is a low-power memory chip with ultra-wide communication lanes that stacks memory dies vertically to break the processing bottlenecks of conventional memory chips. "In light of the exponential growth of the AI market, the advanced, high-capacity memory chip would be an optimal solution for future systems that require more memory," Samsung said.

In AI applications, Samsung projects that HBM3E 12H could boost the average speed of AI training by 34 percent compared with HBM3 8H and expand the number of simultaneous users for inference services by more than 11.5 times. "Its higher performance and capacity will especially allow customers to manage their resources more flexibly and reduce the total cost of ownership for data centers," the company said.

Samsung said its HBM3E 12H offers an unprecedented bandwidth of up to 1,280 gigabytes per second (GB/s) and an industry-leading capacity of 36 GB, improvements of more than 50 percent over the 8-stack HBM3 8H. The HBM3E 12H uses advanced thermal compression non-conductive film (TC NCF), which allows the 12-layer product to meet the same height specification as 8-layer stacks and remain compatible with current HBM package requirements.
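The quoted figures line up with how HBM stacks are typically specified. As a rough sanity check, and assuming the standard 1,024-bit HBM interface and 24 Gb (3 GB) DRAM dies (details the article does not state), the numbers can be reproduced with simple arithmetic:

```python
# Back-of-the-envelope check of the quoted HBM3E 12H figures.
# Assumptions not stated in the article: a 1,024-bit HBM interface
# and a stack built from 24 Gb (3 GB) DRAM dies.

BUS_WIDTH_BITS = 1024   # standard HBM interface width (assumption)
PIN_SPEED_GBPS = 10.0   # per-pin data rate implied by 1,280 GB/s
DIE_CAPACITY_GB = 3     # one 24 Gb die = 3 GB (assumption)
STACK_HEIGHT = 12       # "12H" = 12 stacked dies

# Total bandwidth: bits per second across the bus, converted to bytes.
bandwidth_gbs = BUS_WIDTH_BITS * PIN_SPEED_GBPS / 8

# Total capacity: per-die capacity times the number of stacked dies.
capacity_gb = DIE_CAPACITY_GB * STACK_HEIGHT

print(bandwidth_gbs)  # 1280.0 (GB/s)
print(capacity_gb)    # 36 (GB)
```

Under the same assumptions, the 8-stack HBM3 8H at 24 GB and roughly 820 GB/s puts both the capacity and bandwidth gains at about 50 percent, matching the article's claim.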

As the industry works to address chip die warping associated with thinner dies, Samsung said the technology is expected to offer additional advantages, particularly for higher stacks. By continually reducing the thickness of its NCF material, the company has achieved the industry's smallest gap between chips, at seven micrometers, and has eliminated voids between layers. The result is a vertical density more than 20 percent higher than that of the HBM3 8H, according to the chipmaker.

Chip manufacturers worldwide are vying for dominance in the HBM market as demand escalates with the widespread adoption of generative AI. Micron Technology, a US-based semiconductor company, announced that it had begun mass production of HBM3E, outpacing rivals Samsung and SK Hynix. Micron said its latest chip, with a capacity of 24 GB, will be integrated into Nvidia's H200 Tensor Core GPUs.

Last year, the HBM market accounted for roughly 1 percent of the total memory chip market by volume, and it is expected to more than double in size this year. Samsung Electronics and SK Hynix are competing head-to-head in this emerging market, each holding roughly 45 percent of the share, while Micron ranks third with approximately 10 percent of all orders.
