DoNews, August 21 — SK Hynix announced on the 21st that it has successfully developed HBM3E, a new ultra-high-performance DRAM product for artificial intelligence, and has begun providing samples to customers for performance verification.
HBM (High Bandwidth Memory) is a memory technology that significantly improves data processing speed by vertically stacking and interconnecting multiple DRAM dies. HBM DRAM products have progressed through five generations in order: HBM (first generation), HBM2 (second generation), HBM2E (third generation), HBM3 (fourth generation), and HBM3E (fifth generation). HBM3E is an extended version of HBM3.
SK Hynix said the new product can process up to 1.15 TB (terabytes) of data per second, equivalent to 230 full-HD (FHD) movies of 5 GB each in one second. Mass production of HBM3E is scheduled to begin in the first half of 2024.
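As a quick sanity check of those figures, the following minimal sketch works through the arithmetic, assuming decimal units (1 TB = 1,000 GB) and the 5 GB movie size cited above:

```python
# Back-of-the-envelope check of the reported HBM3E throughput claim.
# Assumes decimal units (1 TB = 1,000 GB), as is typical for bandwidth figures.
bandwidth_tb_per_s = 1.15   # claimed HBM3E bandwidth in TB/s
movie_size_gb = 5           # size of one full-HD movie in GB, per the article

movies_per_second = bandwidth_tb_per_s * 1000 / movie_size_gb
print(f"{movies_per_second:.0f} movies per second")  # prints: 230 movies per second
```

The result matches the 230-movies-per-second figure quoted by SK Hynix.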
SK Hynix’s engineering team applied its latest Advanced MR-MUF technology to the product, improving heat dissipation by 10% compared with the previous generation. HBM3E is also backward compatible, meaning customers running systems built on HBM3 can adopt the new product directly without modifying their designs or structures.
It is reported that the new product will be used in NVIDIA’s next generation of AI computing products.