Samsung unveiled its next-generation HBM3E Shinebolt DRAM at the company’s Memory Tech Day 2023 this week. Designed for AI applications, the memory is intended to improve total cost of ownership (TCO) and speed up both AI-model training and inference in the data center.
Boasting a blazing-fast 9.8 gigabit-per-second (Gbps) per-pin speed, the memory achieves transfer rates exceeding 1.2 terabytes per second (TBps). Samsung optimized its non-conductive film (NCF) technology to eliminate gaps between chip layers and maximize thermal conductivity, enabling taller layer stacks and improved thermal characteristics. Samsung’s 8H and 12H HBM3 products are currently in mass production, with samples shipping to customers.
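As a rough, back-of-the-envelope illustration (not part of Samsung's announcement), the two headline figures are consistent if one assumes the standard 1024-bit interface width of an HBM stack; the short Python sketch below works through the arithmetic.

```python
# Sketch: relate the 9.8 Gbps per-pin speed to the >1.2 TB/s per-stack figure.
# Assumes the standard 1024-bit (1024 data pin) HBM stack interface width.

PIN_SPEED_GBPS = 9.8          # per-pin data rate, in gigabits per second
INTERFACE_WIDTH_BITS = 1024   # assumed HBM stack interface width

aggregate_gbits = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS  # gigabits per second
aggregate_tbytes = aggregate_gbits / 8 / 1000            # terabytes per second

print(f"Aggregate bandwidth per stack: {aggregate_tbytes:.2f} TB/s")  # ~1.25 TB/s
```

Under that assumption the per-stack bandwidth works out to roughly 1.25 TB/s, in line with the "exceeding 1.2 TBps" claim.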

“The new era of hyperscale AI has brought the industry to a crossroads where innovation and opportunity intersect, presenting a time with potential for great leaps forward, despite the challenges. Through endless imagination and relentless perseverance, we will continue our market leadership by driving innovation and collaborating with customers and partners to deliver solutions that expand possibilities,” said Jung-Bae Lee, President and Head of Memory Business at Samsung Electronics.