SK Hynix has published a promotional piece for its next-generation HBM3 memory, which will feature heavily in Nvidia's Hopper server architecture and was introduced somewhat indirectly alongside it. HBM3 memory, which was presented in a session at the GTC 2022 conference, is breaking new records.
What does HBM3 memory offer?
HBM memory arrived on the market as something of a revolution thanks to its high data transfer rates. However, because of its high cost and the wide adoption of GDDR memory, it has mostly been limited to servers and other performance-oriented systems.
Nvidia Hopper redefines the rules
The HBM3 standard, the latest step in HBM memory, is billed as the highest-performing and fastest type of DRAM. This fourth-generation memory brings significant improvements over HBM2E.
Each die carries approximately 8,000 in-silicon connections, and with TSVs an HBM3 stack can be built up to 12 dies high, which puts roughly 100,000 TSVs in a 12-high package. Such a stack reaches a total capacity of 24GB and runs at 6.4Gbps per pin across its 16-channel architecture, delivering 819GB/s of bandwidth per stack. With two extra channels, up to 64 virtual channels can be offered in total.
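As a quick sanity check on these figures, the sketch below recomputes the per-stack bandwidth, TSV count and capacity. The 1024-bit interface width (16 channels of 64 bits) and the 2GB per-die capacity are assumptions for illustration, not figures taken from SK Hynix's announcement.

```python
# Back-of-the-envelope check of the HBM3 figures quoted above.
# The 1024-bit interface (16 channels x 64 bits) and the 2 GB per-die
# capacity are assumptions, not numbers from SK Hynix's announcement.

PIN_SPEED_GBPS = 6.4       # per-pin data rate quoted for HBM3
BUS_WIDTH_BITS = 16 * 64   # 16 channels, 64 bits each = 1024 bits (assumed)
TSVS_PER_DIE = 8000        # approximate in-silicon connections per die
STACK_HEIGHT = 12          # 12-high stack
DIE_CAPACITY_GB = 2        # assumed per-die capacity

bandwidth_gbs = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8  # bits -> bytes
total_tsvs = TSVS_PER_DIE * STACK_HEIGHT
total_capacity_gb = DIE_CAPACITY_GB * STACK_HEIGHT

print(f"Per-stack bandwidth: {bandwidth_gbs:.1f} GB/s")  # ~819.2 GB/s
print(f"Approximate TSV count: {total_tsvs}")            # ~96,000
print(f"Stack capacity: {total_capacity_gb} GB")         # 24 GB
```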
HBM memory is also the preferred choice for Level 4 and Level 5 autonomous driving systems. HBM3 is expected to play an even larger role in high-performance systems, artificial intelligence, machine learning and advanced driving systems.
On the other hand, memory products are said to account for around 20 percent of a server's power consumption, a share expected to rise to 35 percent by 2025. It is also claimed that replacing GDDR memory with HBM3 would cut carbon emissions by 830 thousand tons by 2030.