SK Hynix begins mass production of next-generation AI server memory modules

Published: April 20, 2026, 10:21

This photo provided by SK hynix Inc. shows the company's 192GB SOCAMM2 next-generation AI server memory module. [YONHAP]

SK Hynix announced on Monday that it has begun mass production of next-generation memory modules designed for AI servers, aiming to strengthen its position in the AI infrastructure market.


The 192GB SOCAMM2 module is based on 6th generation 10-nanometer-class LPDDR5X low-power DRAM technology, the company said. This module was specifically designed for use with Nvidia Corp.’s Vera Rubin AI platform.

The product adapts mobile-oriented low-power memory for server environments, serving as the primary memory solution for next-generation AI servers.

According to SK Hynix, the product delivers more than twice the bandwidth of traditional RDIMMs (Registered Dual Inline Memory Modules) and improves power efficiency by more than 75 percent, making it suitable for high-performance AI workloads.

The South Korean semiconductor giant said the new product is expected to resolve memory bottlenecks in training and running inference on large language models with hundreds of billions of parameters, significantly improving overall system performance.

“With the supply of 192GB SOCAMM2, SK Hynix has established a new standard in AI memory performance,” said Kim Joo-sung, president and head of AI infrastructure.
