Micron Is Not A Bargain Despite The AI-Driven HBM Sensation (NASDAQ:MU) (2024)


Micron (NASDAQ:MU) has been a key beneficiary of ongoing AI momentum, as its HBM technology continues to play a critical role in the chips underpinning the transition to accelerated computing. The stock has more than doubled since the advent of OpenAI’s ChatGPT, recouping the losses incurred during the extended cyclical downturn in consumer-facing end markets.

Nvidia’s (NVDA) latest introduction of the next-generation Rubin AI platform provides further reinforcement to this secular tailwind for Micron. Specifically, Micron’s product roadmap already incorporates next-generation HBM4 technology, which management expects to begin volume shipments around the same time the Nvidia Rubin platform goes to market. This will be key to reinforcing revenue visibility for Micron, compensating for the company’s inherently elevated exposure to cyclical risks in the consumer-facing memory market.

Meanwhile, near-term capacity constraints continue to be a double-edged sword for Micron. On one hand, limited supply helps maintain its price advantage in the HBM market. Yet constrained production capacity could also hamper its share of the rapidly expanding market fuelled by ongoing AI developments.

While our last coverage discussed the gives and takes for Micron coming out of its strong fiscal Q2 earnings, the following analysis dives further into the stock’s anticipated outlook heading into the next earnings season. We have also performed a sensitivity analysis on Micron’s growth outlook to gauge the durability of its valuation at current levels, as well as the stock’s prospects for further upside considering management’s expectations for stronger fundamentals in the back half of FY 2024.

Heading into its upcoming earnings update, we believe investor focus will remain on HBM3E contributions to gauge the technology’s long-term implications for Micron’s financial prospects. In addition to robust volumes, we will also be looking for signs of stronger-than-expected ASP to back the potential for another upward revaluation given the underlying business’ supply-constrained environment.

AI-Driven Memory Momentum

CPU-based servers are gradually becoming obsolete as accelerated computing takes over to meet the performance and power-efficiency requirements of the data- and AI-first era. As repeatedly emphasized by Nvidia CEO Jensen Huang, the replacement cycle for traditional CPU-based general-purpose data centers will create an annualized TAM of $250 billion over the next four years, representing a trillion-dollar opportunity.

GPU-Attached Memory

As discussed in our previous coverage, Micron remains a key beneficiary of the secular transition, as AI becomes an accelerant for memory demand:

The industry expects up to 1 terabyte of additional DRAM demand for every server AI processor, or 30x more than what is required in a high-end PC workstation. For instance, Nvidia’s latest Blackwell B200 GPUs require 33% more HBM3E content than their predecessor.

Source: “Micron Fiscal Q2 Quick Takes: Beware of the Capacity Chokehold”

Specifically, Micron’s HBM technology is a critical component of the server GPUs that make up accelerated data centers. HBM utilizes 3D through-silicon via-based stacking technology that essentially allows it to pack more memory into a single chip. This not only reduces the size of the overall chip, but also minimizes the distance that data needs to travel between the memory and the processor, thus lowering the power required to operate.

The technology is accordingly optimized for increasingly complex high-performance computing and accelerated computing workloads. Specifically, Micron’s HBM3E technology comes in 8- and 12-stack configurations, accommodating the large data requirements of AI workloads. Produced on Micron’s 1-beta process technology, HBM3E delivers 50% more memory capacity than its predecessor, enabling training at “higher precision and accuracy”. Micron’s HBM3E also boasts a 2.5x improvement in performance per watt over the preceding generation, and delivers up to 30% greater power efficiency than competing products. This accordingly addresses the industry’s growing demand for competitive total cost of ownership (“TCO”) without compromising performance:

And as the world is now suffering from computing cost and computing energy inflation because general-purpose computing has run its course, accelerated computing is really the sustainable way of going forward. So, accelerated computing is how you're going to save money in computing, is how you're going to save energy in computing. And so, the versatility of our platform results in the lowest TCO for their data center.

Source: Nvidia F1Q25 Earnings Call Transcript

CPU-Attached Memory

In addition to opportunities in the accelerated computing replacement cycle, Micron is also facilitating an emerging upgrade cycle in CPU-based general-purpose computing. Admittedly, accelerated computing will likely overtake CPU-based general-purpose computing eventually, as the former addresses sustainability considerations in facilitating large-scale data processing. But CPU-based data centers nonetheless still represent an opportunity for Micron, while server CPUs also play a critical role in accelerated computing.

Specifically, accelerated data centers typically combine both CPUs and GPUs to “dramatically speed up work”. CPUs process computing tasks in a “serial fashion” – or one task at a time – which can be slow and increase latency in processing massive workloads like AI training and inferencing. Meanwhile, data center GPUs have been a common accelerator used to complement server CPUs in creating accelerated computing platforms. Unlike CPUs, GPUs execute computing tasks in a “parallel fashion” – or multiple tasks across multiple processors at a time. This accordingly reduces latency in processing massive workloads like AI/ML.

By combining accelerators like GPUs with the CPU in accelerated computing, the former effectively takes on the “highly complicated calculations” in parallel, while delegating the remainder of tasks to the latter in sequential formation. This results in faster execution times, while also enabling improved energy efficiency and reducing idle time.

And Micron’s market share gain prospects in the AI era are further reinforced by its supply of industry-leading CPU-attached memory. Micron started volume shipments of its 32Gb monolithic die-based 128GB DDR5 RDIMM memory for CPUs in early May. The product will go to market through Micron’s broader channel of distributors and resellers starting in June, which should drive further accretion to the compute and networking segment’s performance in the back half of FY 2024.

The next-generation CPU-attached memory is also based on Micron’s 1-beta process technology to enhance capacity and address increasingly demanding data center workloads, including AI/ML. The latest 128GB DDR5 RDIMM memory delivers “more than 45% improved bit density, up to 22% improved energy efficiency, and up to 16% lower latency” over competing products.

Micron’s industry-leading performance across the two prominent types of memory for facilitating AI workloads – namely, HBM and DDR – makes it well-positioned for the upcoming accelerated computing replacement cycle and general-purpose data center upgrade cycle. The advanced capabilities of both products are optimized for supporting accelerating memory demands, particularly from the transition to AI inferencing workloads. Specifically, many of the generative AI use cases developed over the past year are starting to enter scaled usage. This will accordingly increase inference-driven compute capacity demand and, in turn, the memory intensity of data centers. Current forecasts expect inferencing to represent “2x the number of cycles and spend as training by mid-2025”, reinforcing demand for advanced memory capabilities and creating a sustained longer-term tailwind for Micron.

Improving Revenue Visibility with HBM

Micron’s F3Q results are likely to reflect stronger HBM3E contributions, as the product enters its first full quarter of volume shipments. Recall from Micron’s F2Q earnings update that it has already sold out HBM3E volumes for FY 2024, safeguarding hundreds of millions of dollars in revenue contributions for the year ahead.

Specifically, the price premium on Micron’s HBM3E will be a key margin-accretive factor in the near-term. Historically, Micron has demonstrated peak margins at about 60% on a consolidated level. Given that HBM3E bit share is expected to approach DRAM levels by next year, its premium ASP is likely to bring greater margin tailwinds in the near-term, which will be favourable to the cash flows underpinning its valuation.

This is complemented by incremental volumes of the 128GB DDR5 RDIMM that started shipping in early May. Management had previously guided to several hundred million dollars of additional revenue in fiscal 2H24 from the relevant shipments, which will further reinforce its near-term revenue and margin expansion visibility.

Yet capacity and supply availability remain a bottleneck to the anticipated volume and pricing tailwinds for its flagship memory products curated for AI workloads. As mentioned earlier, Micron’s HBM3E volumes for FY 2024 have already sold out, with the majority of FY 2025 supply also accounted for as of its last update in March. This is consistent with robust industry demand for server GPUs coming out of the latest earnings season, with hyperscaler capex allocated to AI infrastructure developments still resilient.

We believe Micron’s constrained capacity remains a key overhang on its fundamental and valuation outlook. Specifically, Micron’s HBM3E is a key component of Nvidia’s next-generation H200 GPUs, which have also recently started volume shipments. While Nvidia’s data center sales are expected to exceed $50 billion over the next six months, Micron has only earmarked “several hundred million dollars of revenue from HBM in fiscal 2024”. HBM3E’s price premium also suggests that unit volumes are likely to be much lower in proportion to Nvidia’s H200 supply availability in the upcoming months. This highlights Micron’s inferior market share relative to rivals like SK Hynix and Samsung (OTCPK:SSNLF) despite its product’s 30% better power efficiency, with limited capacity further restraining near-term share gains. With Nvidia’s upcoming Blackwell GPUs incorporating 33% more HBM3E content than the H200, Micron’s share of AI-driven memory demand risks diminishing further.

Although Micron remains committed to expanding capacity for its HBM product line, supply will likely remain a near-term challenge. HBM3E is more capital-intensive to produce, requiring 3x the wafer supply to yield the same number of bits as DDR5. This trade-ratio inefficiency is expected to be even greater for the next-generation HBM4, which management expects to start sampling in 2025, with volume shipments slated for 2026 to support Nvidia’s upcoming Rubin platform.

Durability of HBM’s Price Advantage

Considering Micron’s constrained HBM supply environment, its ability to drive further ASP expansion will be key to enabling growth and margin outperformance in the near-term. Specifically, a supply-constrained environment is expected to aid ASP expansion and, in turn, profit margin growth for Micron. This is critical for the cash flows underpinning Micron’s valuation prospects, especially given its inherently capital-intensive business model as an integrated device manufacturer (“IDM”). The lower capex yield in producing HBM wafers also highlights the importance of sustaining a price premium in order to offset the diminished cost-returns spread of the relevant investments.

However, increasing industry supply from competitors like SK Hynix and Samsung will likely erode Micron’s pricing advantage over time. In the latest development, Nvidia has confirmed it is in the process of certifying Samsung’s HBM3 and HBM3E components for incorporation into its GPUs. Meanwhile, Samsung already plans to triple its HBM supply this year compared to 2023 levels. Materializing these plans will likely alleviate industry supply constraints further, diminishing Micron’s near-term price advantage for its HBM products and risking the stock’s prospects for further multiple expansion from current levels.

Sensitivity Analysis

In order to gauge the sustainability of Micron’s valuation following its steep upsurge, we have performed a sensitivity analysis estimating the level of growth required over a five-year forecast period. We believe the five-year horizon adequately reflects the ongoing data center upgrade and replacement cycle that underpins AI-driven memory demand. The analysis also applies a cost structure in line with Micron’s current margin expansion outlook, including the elevated accretion from higher-ASP HBM contributions discussed in the foregoing analysis.

Under a discounted cash flow approach, with a 10.6% WACC in line with Micron’s capital structure and risk profile and a 3.9% implied perpetual growth rate, the company would need to grow DRAM and NAND revenue at a 33% CAGR through FY 2028 to sustain the stock’s current price of about $134 apiece.
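To illustrate the mechanics behind this figure, the sketch below implements a basic five-year DCF with a Gordon-growth terminal value, using the 10.6% WACC, 3.9% perpetual growth rate, and 33% CAGR cited above. The base revenue, free-cash-flow margin, and share count are hypothetical placeholders chosen only to demonstrate the calculation, not the actual model inputs:

```python
# Minimal five-year DCF sketch. WACC (10.6%), perpetual growth (3.9%), and
# the 33% revenue CAGR come from the analysis above; base_rev, fcf_margin,
# and shares below are hypothetical placeholders, not the model's inputs.

def dcf_per_share(base_rev, cagr, fcf_margin, wacc, g_term, shares, years=5):
    """Discount `years` of free cash flow plus a Gordon-growth terminal value."""
    pv, rev, fcf = 0.0, base_rev, 0.0
    for t in range(1, years + 1):
        rev *= 1 + cagr                 # grow revenue at the assumed CAGR
        fcf = rev * fcf_margin          # assume a steady FCF margin
        pv += fcf / (1 + wacc) ** t     # discount each year's FCF
    terminal = fcf * (1 + g_term) / (wacc - g_term)  # Gordon-growth terminal value
    pv += terminal / (1 + wacc) ** years
    return pv / shares

# Hypothetical inputs for illustration only:
value = dcf_per_share(base_rev=25e9, cagr=0.33, fcf_margin=0.25,
                      wacc=0.106, g_term=0.039, shares=1.11e9)
print(f"Illustrative per-share value: ${value:,.2f}")
```

Note that the terminal value dominates the result, so the output is highly sensitive to the WACC-minus-growth spread in the denominator – one reason small changes in assumptions move the implied price materially.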

i. Revenue Growth Sensitivity Analysis

ii. Accompanying DCF Analysis

The implied annualized revenue growth estimate of 33% over the forecast period compares to management’s expectations for the long-term CAGR in DRAM and NAND bit demand growth to land in the mid-teens to low-20% range. It also compares to industry expectations for HBM opportunities to expand at a 26.4% five-year CAGR and broader server GPU opportunities to expand at a 34.6% five-year CAGR in support of the ongoing transition to accelerated computing.
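As a quick arithmetic check, the cumulative five-year revenue multiples implied by each of the growth rates cited above can be compared directly:

```python
# Cumulative five-year multiples implied by the CAGRs cited in the text.
for label, cagr in [("Priced-in DRAM/NAND revenue", 0.33),
                    ("HBM industry TAM", 0.264),
                    ("Server GPU industry TAM", 0.346)]:
    multiple = (1 + cagr) ** 5
    print(f"{label}: {cagr:.1%} CAGR -> {multiple:.2f}x over five years")
```

The priced-in 33% CAGR compounds to roughly 4.2x revenue over five years, sitting between the HBM TAM multiple (about 3.2x) and the server GPU TAM multiple (about 4.4x).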

We believe the stock’s current price remains reasonably durable, with the priced-in 33% annualized revenue growth expectation still achievable when considering Micron’s improving HBM capacity, its price premium, and the robust demand environment, particularly in the compute and networking business unit. The outlook’s durability is further reinforced by Micron’s upcoming HBM4 technology, which will likely support Nvidia’s next-generation Rubin AI platform and bolster the company’s capture of memory opportunities stemming from the data center upgrade cycle. Anticipated improvements in revenue visibility will also compensate for Micron’s inherently elevated exposure to cyclical risks, especially in consumer-facing end-markets (e.g. PCs; smartphones). Yet the set-up also leaves little room for error, especially considering the imminent risks of moderating CPU-attached memory demand and tight supply conditions for HBM.

Upside Scenario Sensitivity Analysis

We have also performed a sensitivity analysis on Micron’s revenue outlook to determine the extent of growth needed to warrant another upward re-rating for the stock. Specifically, our analysis shows that an annualized revenue growth rate of 37% over the forecast period through FY 2028, keeping all cost and valuation assumptions unchanged, would be needed to support another 10% gain in the stock to $147 apiece.
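The relationship between the required growth rate and the target price can be sketched with a simple search: given a placeholder DCF model (all inputs below are hypothetical, not the actual model assumptions), bisect for the revenue CAGR that lifts the modelled per-share value by 10%:

```python
# Bisect for the revenue CAGR that produces a 10% higher modelled value.
# WACC (10.6%) and perpetual growth (3.9%) are from the analysis above;
# all other inputs are hypothetical placeholders.

def model_value(cagr, base_rev=25e9, fcf_margin=0.25,
                wacc=0.106, g_term=0.039, shares=1.11e9, years=5):
    pv, rev, fcf = 0.0, base_rev, 0.0
    for t in range(1, years + 1):
        rev *= 1 + cagr
        fcf = rev * fcf_margin
        pv += fcf / (1 + wacc) ** t
    pv += fcf * (1 + g_term) / (wacc - g_term) / (1 + wacc) ** years
    return pv / shares

target = model_value(0.33) * 1.10      # 10% above the base case
lo, hi = 0.33, 0.60
for _ in range(60):                    # bisection: value is increasing in CAGR
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if model_value(mid) < target else (lo, mid)
print(f"CAGR needed for a 10% higher value: {mid:.1%}")
```

Under these placeholder inputs the required uplift lands only a few percentage points above the base-case CAGR, echoing the 33%-to-37% step above: because growth compounds into the terminal value over five years, a modest CAGR increase translates into a disproportionate gain in implied value.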

i. Revenue Growth Sensitivity Analysis

ii. Accompanying DCF Analysis

This further elevates the execution risks ahead for Micron. As mentioned in the earlier section, both ASP and volume expansion will eventually be required to sustain long-term revenue growth and visibility. The upside sensitivity analysis effectively highlights the critical importance of further capacity expansion in improving Micron’s capture of high-growth HBM opportunities and unlocking incremental valuation upside for the stock.

Admittedly, Micron currently trades at a discounted multiple on a relative basis to its semiconductor peers with a comparable growth outlook despite its mission-critical role in enabling the build-out of next-generation AI infrastructure. However, the discount is likely reflective of not only Micron’s capital-intensive IDM business model, but also its elevated execution risks given anticipated growth limitations due to ongoing capacity constraints.

Final Thoughts

Micron’s upsurge continues to highlight investors’ confidence in its strong demand environment, underpinned by the mission-critical role of its HBM3E in facilitating ongoing AI developments. In addition to the favourable pricing structure of Micron’s HBM3E, the ensuing positive accretion to profit margins and reinforcement to its revenue visibility are also upside drivers to the stock.

However, we remain cautious of implications pertaining to Micron’s capacity bottleneck. Despite expectations for a strong earnings report later this month and through the back half of FY 2024, the near-term cap on Micron’s HBM supply could potentially stymie its capture of AI-driven memory TAM expansion.

Admittedly, expectations for a strong HBM ramp through F2H24 will reinforce the durability of the stock’s performance at current levels. However, without additional capacity allocation to HBM – achieved without compromising NAND and DRAM leadership in the process – there is likely limited room for further upside in the stock.

Editor's Note: This article discusses one or more securities that do not trade on a major U.S. exchange. Please be aware of the risks associated with these stocks.

Livy Investment Research

Livy Investment Research is a technology sector research analyst providing long investment ideas by uncovering hidden value ahead of the tech innovation curve. Livy runs the investing group Livy Investment Research, providing deep-dive coverage, interactive financial models, industry primers and community chat. Livy covers companies playing a fundamental role in tackling existing technology hurdles and capitalizing on long-term growth frontiers, including electric and autonomous vehicles, semiconductors, cloud computing, AI/ML, cybersecurity, and analytics – all of which are disrupting legacy norms and contributing towards a more efficient, value-adding economy.

Analyst’s Disclosure: I/we have no stock, option or similar derivative position in any of the companies mentioned, and no plans to initiate any such positions within the next 72 hours. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Seeking Alpha's Disclosure: Past performance is no guarantee of future results. No recommendation or advice is being given as to whether any investment is suitable for a particular investor. Any views or opinions expressed above may not reflect those of Seeking Alpha as a whole. Seeking Alpha is not a licensed securities dealer, broker or US investment adviser or investment bank. Our analysts are third party authors that include both professional investors and individual investors who may not be licensed or certified by any institute or regulatory body.
