Micron Technology (MU) – Riding the HBM Wave in the AI Boom
Micron Technology (MU) trades around $385, and it's shaping up as one of the most compelling AI hardware plays not named Nvidia.
While GPUs get most of the attention, they rely heavily on high-bandwidth memory (HBM) to perform at peak levels. HBM lets AI chips move more data faster while using less power and board space, which is critical for training and running large AI models. The catch is that HBM production is far more resource-intensive than standard DRAM, requiring 3 to 4 times more wafer capacity for the same bit output. That has led to a tight supply environment and rising prices, which puts Micron in a very strong position.
Micron is a major player in both DRAM and NAND memory, with DRAM making up about 80% of its revenue. That exposure is paying off: in its fiscal Q1, the company reported 57% year-over-year revenue growth, with adjusted EPS nearly tripling to $4.78. Gross margin jumped to 56.8% from 39.5% a year earlier. Margin expansion of that magnitude in a tightening supply cycle is exactly what long-term investors should be looking for.
Micron's management expects the HBM market to grow at a 40% annual rate, reaching $100 billion by 2028. To keep pace, the company is ramping up investment, raising its capex from $18 billion to $20 billion this year and kicking off new fab construction in New York and Idaho. With its HBM supply for the year already fully booked, that pricing power looks like it is just getting started.
Add in a net-cash balance sheet and strong free cash flow generation, and Micron is clearly positioned to capitalize on the AI-driven demand surge.
We see this as a strong, overlooked AI pick with room to outperform—even in a market dominated by Nvidia headlines.