Friday, April 25th 2025

SK hynix Showcases HBM4 to Highlight AI Memory Leadership at TSMC 2025 Technology Symposium
SK hynix showcased groundbreaking memory solutions, including HBM4, at the TSMC 2025 North America Technology Symposium held in Santa Clara, California, on April 23. The TSMC North America Technology Symposium is an annual event at which TSMC shares its latest technologies and products with global partners. This year, SK hynix participated under the slogan "Memory, Powering AI and Tomorrow," highlighting its technological leadership in AI memory through exhibition zones including HBM Solutions and AI/Data Center Solutions.
In the HBM Solution section, SK hynix presented samples of its 12-layer HBM4 and 16-layer HBM3E products. The 12-layer HBM4 is a next-generation HBM capable of processing over 2 terabytes (TB) of data per second. In March, the company announced it had become the first in the world to supply HBM4 samples to major customers, and it plans to complete preparations for mass production within the second half of 2025. The B100, NVIDIA's latest Blackwell GPU equipped with 8-layer HBM3E, was also exhibited in the section alongside 3D models of key HBM technologies such as TSV and Advanced MR-MUF, drawing significant attention from visitors.
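As a rough sanity check on that bandwidth figure, the short sketch below estimates per-stack throughput from an assumed 2,048-bit HBM4 interface running at roughly 8 Gbps per pin; both parameters are illustrative assumptions, not figures confirmed in this announcement.

# Rough HBM4 per-stack bandwidth estimate (assumed parameters, not official SK hynix specs)
io_width_bits = 2048        # assumed HBM4 interface width per stack
pin_rate_gbps = 8.0         # assumed per-pin data rate in Gbps

aggregate_gbps = io_width_bits * pin_rate_gbps   # total gigabits per second
bandwidth_tb_s = aggregate_gbps / 8 / 1000       # convert to terabytes per second

print(f"Estimated bandwidth: {bandwidth_tb_s:.2f} TB/s per stack")  # ~2.05 TB/s

Under these assumptions the estimate lands just above 2 TB/s, consistent with the "over 2 terabytes of data per second" claim.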
In the AI/Data Center Solutions section, SK hynix displayed its lineup of server memory modules, including RDIMM and MRDIMM products. The section featured various high-performance server modules based on DDR5 DRAM built using the 1c node, the sixth generation of the 10 nm-class process technology.
Notably, SK hynix exhibited a range of modules designed to enhance AI and data center performance while reducing power consumption. These included the MRDIMM lineup with a speed of 12.8 gigabits per second (Gbps) and capacities of 64 GB, 96 GB, and 256 GB; RDIMM modules with a speed of 8 Gbps in 64 GB and 96 GB capacities; and a 256 GB 3DS RDIMM.
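For context on those module speeds, a back-of-the-envelope comparison of peak per-module throughput is sketched below, assuming a standard 64-bit DDR5 data path per module (ECC lines excluded); the data-path width is an assumption for illustration, not a detail from the announcement.

# Peak per-module throughput comparison (assumes a 64-bit data path, ECC excluded)
data_path_bytes = 8            # 64-bit data width per DDR5 module

mrdimm_rate_gtps = 12.8        # MRDIMM data rate cited in the article
rdimm_rate_gtps = 8.0          # RDIMM data rate cited in the article

mrdimm_gb_s = mrdimm_rate_gtps * data_path_bytes   # 102.4 GB/s
rdimm_gb_s = rdimm_rate_gtps * data_path_bytes     # 64.0 GB/s

print(f"MRDIMM: {mrdimm_gb_s:.1f} GB/s, RDIMM: {rdimm_gb_s:.1f} GB/s")

On these assumptions, the 12.8 Gbps MRDIMM delivers roughly 1.6 times the peak throughput of the 8 Gbps RDIMM.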
At the TSMC 2025 North America Technology Symposium, SK hynix's next-generation solutions such as HBM4 drew great attention from industry officials. By successfully mass-producing its HBM lineup through continued technological collaboration with partners such as TSMC, the company aims to expand the AI memory ecosystem and further solidify its industry leadership.
Source: SK hynix