Micron Initiates Full-Scale Production of Memory Chips for Nvidia’s AI Semiconductors

  • Micron Technology has initiated mass production of high-bandwidth memory (HBM) semiconductors for Nvidia’s latest AI chips.
  • The HBM3E chips promise a 30% reduction in power consumption compared to competitors.
  • Nvidia plans to integrate these chips into its forthcoming H200 graphics processing units, which are expected to surpass the current H100 chip.
  • Analysts anticipate significant market opportunities for Micron due to increasing demand for HBM chips in AI applications.
  • With SK Hynix’s 2024 inventory already sold out, Micron’s role as an additional supplier could help GPU makers such as AMD, Intel, and Nvidia ramp up production.

Main AI News:

Micron Technology has commenced mass production of its high-bandwidth memory (HBM) semiconductors for use in Nvidia’s latest artificial intelligence chip, sending its shares up more than 5% on Monday.

According to Micron, the HBM3E (High Bandwidth Memory 3E) consumes 30% less power than competing offerings, positioning it to meet the escalating demand for chips that power generative AI applications.

Nvidia has earmarked the chip for its upcoming H200 graphics processing units, slated to hit the market in the second quarter. The H200 is expected to surpass the current H100, which has driven substantial revenue growth for the chip designer.

Anshel Sag, an analyst at Moor Insights & Strategy, remarked, “I see this as a significant opportunity for Micron, especially given the escalating preference for HBM chips in AI applications.”

Surging demand for HBM chips, a market led by Nvidia supplier SK Hynix, has buoyed investor optimism about Micron’s ability to weather a sluggish recovery in its other market segments.

Sag emphasized the significance of diversifying suppliers, stating, “With SK Hynix already depleting its 2024 inventory, having an additional supplier catering to the market can facilitate GPU makers such as AMD, Intel, or NVIDIA in ramping up their GPU production capabilities.”

HBM is one of Micron’s most lucrative products, owing in part to the technical complexity involved in its manufacture.

Previously, the company had projected “several hundred million” dollars in HBM revenue for fiscal 2024, with further expansion anticipated in 2025.

Conclusion:

Micron’s move into mass production of HBM chips for Nvidia’s AI semiconductors marks a pivotal moment in the market. With lower power consumption than rival offerings and soaring demand for AI memory, Micron is well positioned to capitalize on the expanding AI semiconductor landscape. The ramp also strengthens Micron’s position and underscores the value of a diversified supplier base in helping GPU makers scale production amid evolving technological demands.
