• Samsung's arch-rival presents HBM3E memory chip that could power

    From TechnologyDaily@1337:1/100 to All on Fri Feb 2 22:45:05 2024
    Samsung's arch-rival presents HBM3E memory chip that could power Nvidia's Blackwell B100 AI GPU with 16 layers, 48GB and 10.24Gbps transfer rate, this may well be the key to make ChatGPT 6 live

    Date:
    Fri, 02 Feb 2024 22:34:59 +0000

    Description:
    SK hynix is set to debut a new HBM3E chip on the same stage where Samsung will be showcasing its memory products.

    FULL STORY ======================================================================

    Samsung is set to showcase numerous new products at the forthcoming International Solid-State Circuits Conference (ISSCC), including a superfast DDR5 memory chip, 280-layer QLC NAND flash memory, and the world's fastest GDDR7.

    But while Samsung will certainly draw a lot of attention, it's not the only game in town, as its South Korean rival SK hynix is also set to reveal its new HBM3E DRAM straight after Samsung finishes talking about its 3D-NAND flash memory at the High-Density Memory and Interfaces session.

    HBM3E (High Bandwidth Memory gen 3 Extended) is a groundbreaking memory technology that offers a significant leap in performance and power efficiency, and it is designed to meet the escalating demands of high-performance computing, AI, and graphics applications.

    Nvidia has a choice

    HBM3E is the 5th generation of HBM, and interconnects multiple DRAM chips vertically, significantly increasing data processing speed, capacity, and
    heat dissipation.
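
    The headline figures give a rough feel for what that stacking buys. Here is a quick back-of-the-envelope sketch (Python; it simply assumes the quoted 48GB stack is split evenly across the quoted 16 layers, which is not an official breakdown):

        # Rough per-die capacity implied by the headline 16-layer, 48GB stack
        stack_capacity_gb = 48                  # headline stack capacity (GB)
        layers = 16                             # headline number of stacked DRAM dies
        per_die_gb = stack_capacity_gb / layers
        print(f"~{per_die_gb:.0f} GB (~{per_die_gb * 8:.0f} Gb) per stacked die")
        # prints: ~3 GB (~24 Gb) per stacked die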

    According to SK hynix, its new memory chip can process data up to 1.15 terabytes per second, equivalent to processing over 230 Full-HD movies of 5GB each in a second. It also boasts a 10% improvement in heat dissipation,
    thanks to the implementation of the cutting-edge Advanced Mass Reflow Molded Underfill technology (or MR-MUF).
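
    Those figures are easy to sanity-check. A minimal sketch (Python; the 1024-bit bus is the standard per-stack HBM interface width and is assumed here rather than taken from the article):

        # Relate the headline per-pin rate and the quoted bandwidth figures
        pin_rate_gbps = 10.24                   # headline per-pin transfer rate (Gb/s)
        bus_width_bits = 1024                   # standard HBM stack interface (assumed)
        raw_bw_gbs = pin_rate_gbps * bus_width_bits / 8
        print(f"raw bandwidth ~{raw_bw_gbs / 1000:.2f} TB/s")       # ~1.31 TB/s peak

        effective_bw_tbs = 1.15                 # SK hynix's quoted processing rate
        movie_size_gb = 5                       # article's "Full-HD movie" size
        print(f"~{effective_bw_tbs * 1000 / movie_size_gb:.0f} movies per second")  # ~230

    The raw number lands a little above the 1.15TB/s SK hynix quotes, as peak interface figures usually sit above sustained processing rates.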

    SK hynix sees its HBM3E as the driving force behind AI tech innovation, but it could also power Nvidia's most powerful GPU ever, the B100 Blackwell AI GPU. Micron has stated it won't release the next generation of its high-bandwidth memory, HBM4, until 2025. This has led to speculation that Nvidia may seek an alternative supplier for the B100 Blackwell.

    Although Samsung, with its new Shinebolt HBM3E memory, was considered the most likely contender for this role, SK hynix is well-positioned to step in with its new product.

    There's no official word on this yet, but it likely won't be long until we find out which of the Korean companies Nvidia chooses.

    More from TechRadar Pro
    - Samsung to showcase the world's fastest GDDR7 memory next month
    - Nvidia teases its most powerful GPU ever
    - Samsung's new HBM3E memory technology hits 1.2TB/s



    ======================================================================
    Link to news story: https://www.techradar.com/pro/samsungs-arch-rival-presents-hbm3e-memory-chip-that-could-power-nvidias-blackwell-b100-ai-gpu-with-16-layers-48gb-and-1024gbps-transfer-rate-this-may-well-be-the-key-to-make-chatgpt-6-live


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)