ISLAMABAD, AUGUST 7: Nvidia has approved the use of a variant of Samsung Electronics’ fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, in its artificial intelligence (AI) processors, according to three people briefed on the test results.
The qualification clears a significant hurdle for the world’s largest memory chipmaker, which has been battling to catch up with regional rival SK Hynix in the race to produce cutting-edge memory chips capable of handling generative AI workloads.
The sources said that although Samsung and Nvidia have not yet signed a supply agreement, they expect to do so shortly, with shipments of the certified eight-layer HBM3E chips anticipated to begin in the fourth quarter of 2024.
However, the South Korean tech giant’s 12-layer HBM3E chips have not yet passed Nvidia’s testing, said the sources, who asked not to be named because the information remains confidential. Nvidia and Samsung both declined to comment.
HBM is a type of dynamic random-access memory (DRAM), first produced in 2013, in which chips are vertically stacked to save space and reduce power consumption. It is an essential component of the graphics processing units (GPUs) used for AI, helping to process the vast volumes of data generated by complex applications.
Samsung has been attempting since last year to pass Nvidia’s tests for HBM3E and the earlier fourth-generation HBM3 chips, but has struggled with heat and power consumption issues, Reuters reported in May, citing sources.
The people briefed on the matter said the company has since revised its HBM3E design to address those issues.
After Reuters reported in May that Samsung’s chips had failed Nvidia’s tests because of heat and power consumption problems, Samsung said the claims were false.
The latest approval follows a Reuters report last month that Nvidia had certified Samsung’s HBM3 chips for use in less advanced processors designed for the Chinese market.
Nvidia’s approval of Samsung’s latest HBM chips comes as the generative AI boom drives skyrocketing demand for advanced GPUs, which Nvidia and other AI chipset makers are struggling to meet.
HBM3E chips are likely to become the mainstream HBM product on the market this year, with shipments concentrated in the second half, according to research firm TrendForce. Market leader SK Hynix projects that overall demand for HBM memory chips could grow at an annual rate of 82% through 2027.
In July, Samsung forecast that HBM3E chips would account for 60% of its HBM chip sales by the fourth quarter, a target many analysts believe it can meet if its latest HBM chips receive final certification from Nvidia by the third quarter.
Samsung does not provide revenue breakdowns for specific semiconductor products. According to a Reuters survey of 15 analysts, Samsung’s total DRAM chip revenue for the first half of this year was estimated at 22.5 trillion won ($16.4 billion), with some estimating that up to 10% of that could have come from HBM sales.
HBM is primarily produced by just three companies: Samsung, Micron, and SK Hynix.
SK Hynix, Nvidia’s primary HBM chip supplier, began shipping HBM3E chips to a customer in late March; earlier sources said that customer was Nvidia. Micron has also said it will supply HBM3E chips to Nvidia.