SK hynix Unveils World's First HBM4 at TSMC 2025 Symposium, Showcasing 2.0 TB/s Bandwidth and 16-Hi Stack Design

At the TSMC 2025 North America Technology Symposium held in Santa Clara on April 23, SK hynix made headlines by publicly unveiling its next-generation HBM4 technology for the first time. The Korean memory leader showcased a range of cutting-edge memory innovations across its HBM Solutions and AI/Data Center Solutions zones under the banner “Memory, Powering AI and Tomorrow.”

The most anticipated reveal came in the form of SK hynix’s 12-layer HBM4, which supports data processing speeds of over 2 terabytes per second (TB/s) — a record-breaking milestone in high-bandwidth memory design. SK hynix announced in March that it had become the first company in the world to supply HBM4 samples to major customers, with plans to complete preparations for mass production in the second half of 2025.
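For readers curious how a 2 TB/s figure comes about, a back-of-envelope sketch follows. It assumes (these numbers are not stated in the article) that an HBM4 stack exposes a 2,048-bit interface running at roughly 8 Gbps per pin, in line with the publicly discussed JEDEC HBM4 direction:

```python
# Rough per-stack bandwidth estimate for HBM4.
# Assumptions (not from the article): 2,048-bit interface per stack,
# ~8 Gbps per pin. Peak bandwidth = bus width * pin rate / 8 (bits -> bytes).

def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak per-stack bandwidth in gigabytes per second."""
    return bus_width_bits * pin_rate_gbps / 8

hbm4 = stack_bandwidth_gbs(bus_width_bits=2048, pin_rate_gbps=8.0)
print(f"HBM4 per-stack: {hbm4:.0f} GB/s (~{hbm4 / 1000:.1f} TB/s)")
```

Under those assumptions the arithmetic lands right at about 2 TB/s per stack, which matches the headline figure.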

Alongside the HBM4 samples, SK hynix also exhibited its 16-layer HBM3E modules — the memory family already powering NVIDIA’s Blackwell GPUs — offering a glimpse at how top-tier GPUs are integrating advanced memory solutions to support the AI boom. The booth also featured 3D models of critical technologies like Through-Silicon Via (TSV) and Advanced MR-MUF, both essential in enabling the extreme stacking and thermal performance of modern HBM.

Advanced Manufacturing Technologies Showcased

  • TSV (Through-Silicon Via): Micro-scale vertical connections between chip layers to enable high-performance stacking.

  • Advanced MR-MUF (Mass Reflow-Molded Underfill): An enhanced packaging process enabling thinner, warp-free chip stacks with improved heat dissipation.

AI & Data Center DRAM Modules Take the Spotlight

In the AI/Data Center Solutions zone, SK hynix presented a comprehensive portfolio of server memory modules based on its 1c DRAM node, the sixth generation of its 10nm-class technology. The display included high-speed MRDIMM and RDIMM products designed to meet increasing AI processing demands with a focus on power efficiency and density.

Highlighted specifications include:

  • MRDIMM (DDR5-based):

    • Up to 12.8 Gbps speed

    • Capacities: 64 GB, 96 GB, and 256 GB

  • RDIMM (DDR5-based):

    • 8 Gbps speed

    • Capacities: 64 GB, 96 GB, and 256 GB (3DS)
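To put the listed pin speeds in module terms, here is a rough sketch assuming a standard 64-bit DDR5 data path per module (ECC bits excluded); this is illustrative arithmetic, not a vendor-published figure:

```python
# Rough peak per-module bandwidth from the listed pin speeds.
# Assumption (not from the article): a standard 64-bit DDR5 data path,
# ECC lanes excluded. Bandwidth = pin rate * data bits / 8 (bits -> bytes).

def module_bandwidth_gbs(pin_rate_gbps: float, data_bits: int = 64) -> float:
    """Peak per-module bandwidth in gigabytes per second."""
    return pin_rate_gbps * data_bits / 8

print(f"MRDIMM @ 12.8 Gbps: {module_bandwidth_gbs(12.8):.1f} GB/s")
print(f"RDIMM  @  8.0 Gbps: {module_bandwidth_gbs(8.0):.1f} GB/s")
```

On those assumptions, the 12.8 Gbps MRDIMM works out to roughly 1.6x the peak bandwidth of the 8 Gbps RDIMM, which is the gap the MRDIMM design exists to open.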

These modules underscore SK hynix’s commitment to delivering memory solutions that scale alongside next-gen AI workloads while balancing power and thermal efficiency.

Strategic Partnership with TSMC: Driving the AI Ecosystem Forward

SK hynix's decision to showcase its memory innovations at TSMC’s symposium is no coincidence. The company has been deepening its partnership with TSMC in anticipation of expanding the AI memory ecosystem, a strategy that positions both firms at the core of the industry's move toward vertically integrated, high-performance computing.

By pushing the boundaries of memory bandwidth and stack density, SK hynix is not only setting new industry benchmarks but also strengthening its competitive edge against rivals like Samsung and Micron in the HBM and server DRAM segments.


What are your thoughts on SK hynix’s 2TB/s HBM4 breakthrough—do you see it as a major leap for AI infrastructure? Let us know how you think this will affect GPU and data center evolution.

Angel Morales

Founder and lead writer at Duck-IT Tech News, dedicated to delivering the latest news, reviews, and insights in the world of technology, gaming, and AI. With experience in the tech and business sectors, he combines a deep passion for technology with a talent for clear and engaging writing.
