Samsung steps up to fill gap in AI memory chip manufacturing for Nvidia

Yurim Lee and Ian King

Samsung Electronics is beginning to make headway in closing the gap on rival SK Hynix after suffering a series of setbacks in the development of a type of memory chip crucial to the artificial intelligence market.

Samsung has made important strides toward a comeback, including winning long-awaited approval from AI giant Nvidia for a version of its high-bandwidth memory chip HBM3, according to people familiar with the matter. The company also expects approval of the next-generation HBM3E within two to four months, according to the people, who asked not to be identified discussing internal developments.


The progress comes after months of stumbles, including development setbacks that allowed the smaller SK Hynix to take a commanding lead in the fast-growing sector. Falling behind is unusual and humbling for South Korea's largest company, which has historically leveraged its size and engineering expertise to lead the memory chip market. Amid the HBM struggles, Samsung took the rare step of replacing the head of its semiconductor unit in May.

“We've never seen Samsung in a situation like this before,” said Jim McGregor, an analyst at Tirias Research. “The industry and Nvidia need Samsung more than anyone, but it's going to take Samsung at full throttle to get there.”

The company declined to comment on specific partners but said it is generally working closely with customers and testing is progressing well.

Samsung's recent gains are likely to position the company to take advantage of a surge in demand for AI products: The HBM market is expected to grow from $4 billion last year to $71 billion by 2027, according to Morgan Stanley. The sooner Samsung gets approval from Nvidia, the leading maker of AI accelerators, the more it stands to profit from that increase.

“Investor perception of Samsung may change soon,” Morgan Stanley analysts Sean Kim and Duan Liu said in a research note this month. “The situation is improving rapidly.”

The researchers named Samsung the top stock in their report because they see the company growing its HBM market share by at least 10% in 2025, leading to roughly $4 billion in additional revenue. While it will lag SK Hynix in this area, progress could change investor perceptions and boost the stock price.

Samsung is likely to be asked about its HBM strategy when it reports its full second-quarter financial results on Wednesday, though it's unclear how much detail the company will provide.

Samsung is expected to win Nvidia's approval by November, but the company is still ironing out some kinks. Given the complexity of AI chips, the outcome is unpredictable, and the timeline could slip into 2025, according to people familiar with the matter.

Samsung's missteps come at an unusual time for the company: Chairman Jay Y. Lee has been battling prosecutors for years over bribery and corruption allegations, and during that time senior executives didn't see HBM as a priority. Indeed, the market was in the red until OpenAI launched ChatGPT in late 2022, sparking a surge in demand for Nvidia chips used to train AI models.

While SK Hynix was primed for the surge, Samsung struggled with the complex engineering challenges of its new chips. An HBM chip is a stack of DRAM dies, eight layers in the latest generation. Each layer generates significant heat, as does the Nvidia graphics processing unit (GPU) packaged alongside it, which can reach 100 degrees Celsius on its own. Without the right heat-dissipating and cooling materials, the entire stack risks melting.

“As you go up in layers, it gets harder to get reasonable yields,” said Jake Silverman, an analyst at Bloomberg Intelligence. “The problem is heat. You're stacking DRAM on top of each other, so it gets hot, and it's in such close proximity to the GPU that the GPU gets even hotter.”

Samsung has struggled to resolve this so-called thermal coupling, according to one of the people, who asked not to be identified because the matter is confidential. In May, the company took a dramatic step, announcing that its semiconductor head, Kyung Kye-hyun, would step down and be replaced by Jun Young-hyun.

Jun, who joined Samsung in 2000 to develop DRAM and flash memory chips, quickly stepped up the pressure to find a solution. The 63-year-old convened a series of meetings to sift through the technical details and get to the root of the problem. In one meeting that lasted hours without a break, he lamented that HBM could be part of a broader problem, according to a person familiar with the matter.

Samsung was in danger of falling behind not just in the technology of its memory chips, but also in the urgency to innovate. To improve collaboration, Jun restructured the team dedicated to HBM and appointed a new leader.

Samsung employs a thermal management technique called thermo-compressed non-conductive film (TC-NCF) to insulate each layer of DRAM, while SK Hynix has pioneered an alternative, known as mass reflow molded underfill (MR-MUF), that improves heat dissipation and production yields.

But rather than explore other approaches, Samsung chose to stick with and improve TC-NCF. A company spokesman said TC-NCF is a “well-proven technology” that will be included in future products.

The company ultimately revised the HBM design to address heat and power consumption issues, which led to Nvidia's approval of HBM3, according to the people.

Samsung said that since Jun took over, he has prioritized the company's culture of group discussion and persistence in solving problems, adding that “our HBM products have not experienced any heat or power consumption issues” and that “we have not made any design changes for any specific customers.”

The saving grace for Samsung is that AI has yet to mature: Tech companies including Microsoft, Google parent Alphabet, Amazon.com Inc., Apple Inc. and Meta Platforms Inc. are all investing heavily in developing their AI capabilities.

Samsung has been producing HBM3 chips since late last year, according to its quarterly filings. Companies such as Google that design their own chips are expected to continue using HBM3 this year. Samsung has also begun supplying HBM3 for Nvidia's H20, a chip customized for China to comply with U.S. export controls.

HBM3E first came to market this year when Nvidia paired SK Hynix chips with its H200. Nvidia will continue to use HBM3E in nearly all of its products through 2025, and chip rivals will likely stick with it into 2026, analysts at Sanford C. Bernstein said in a July report.

“Although Samsung is lagging behind, the HBM3E window is still open for Samsung to catch up,” wrote the analysts, led by Mark Lee.

In a symbolic sign of Samsung's delays, Micron Technology Inc. earlier this year announced HBM3E chips that Nvidia had approved for use in its AI hardware. Micron, which has long trailed its South Korean rivals in scale, now claims leadership in some areas of memory manufacturing and product introductions, a further erosion of Samsung's dominance.

But Samsung's big advantage is its financial and manufacturing muscle: If it meets Nvidia's approval standards, Samsung can rapidly ramp up production and help ease the shortages that have hamstrung Nvidia and other AI players.

“Micron and Hynix don't have the capacity to support the entire market yet,” said Bloomberg Intelligence's Silverman. Nvidia Chief Executive Jensen Huang “wants to encourage them” because he needs more supply, he added.

SK Hynix isn't slowing down: The company is in a rare position to steal the spotlight from its better-known rival, and its shares have soared more than 150% since the start of 2023, more than triple Samsung's gain.

SK Hynix said last week that it was accelerating production of its HBM3E products, aiming for growth of more than 300%. The company also said it plans to mass-produce its next-generation 12-layer HBM3E chips this quarter and start delivering them to customers in the fourth quarter, likely a sign that certification from Nvidia is imminent.

Under Jun's leadership, Samsung has made progress: It has developed its own 12-layer HBM3E technology and is working to win Nvidia's approval for both that chip and its 8-layer HBM3E.

“This represents a $71 billion revenue opportunity by 2027 (our estimates) – growth that did not exist two years ago,” the Morgan Stanley analysts wrote. “The key debate for Samsung is whether it can perform as a strong second supplier to Nvidia.”


