Micron Technology Inc
MUQ3 2025Derived20% AI
AI Revenue %
20%
AI Fair Value
$63.1B
AI Revenue (Q)
$3.5B
Total Revenue (Q)
$17.4B
Source: SEC EDGAR 10-K Filing
View 10-K on SEC.govAnalysis
Micron FQ4 FY2025 / full year FY2025 (ended Aug 28, 2025). The 10-K filing text is truncated and does not include the full segment revenue table. However, from the FQ3 FY2025 10-Q, 9-month revenue was $26,063M. FY2025 total revenue can be derived: the Q1 FY2026 10-Q shows the prior year (FQ1 FY2025) CMBU recast at $2,648M. The new segment structure (CMBU = hyperscale cloud + HBM for all DC) separates AI-relevant HBM from general data center. Q4 FY2025 implied revenue: the total FY2025 annual revenue is approximately $35.3B based on industry reports and sequential trends, giving Q4 ~$9.2B. With HBM ramping toward a ~$8B annualized run rate, FQ4 HBM revenue estimated at ~$1,900M. Math: $1,900M / $9,200M = 20.7%, rounded conservatively to 20.0%. Ring 1 AI revenue = ~$1,900M.
Analyzed by claude-opus-4-6
Quoted Figures
Cloud Memory Business Unit (CMBU): Focused on memory solutions for large hyperscale cloud customers, and HBM for all data center customers
10-K filed 2025-08-28, Business Segments
High-Bandwidth Memory (HBM): A 3D stacked DRAM architecture that utilizes through-silicon via (TSV) connections for more efficient communication
10-K filed 2025-08-28, Product Technologies
The majority of our DRAM bit production in 2025 was on our leading-edge 1 (1-beta) node
10-K filed 2025-08-28, Product Technologies
Nine-month FY2025 revenue was $26,063M
10-Q filed 2025-05-29, Consolidated Statements of Operations
AI Products Identified (Ring 1)
HBM3E
AI-Enabled Items (Ring 2 — Not Counted)
These items use AI but are not counted in the AI revenue estimate because they primarily serve non-AI functions.
Data center DDR5 for AI server platformsData center SSDs for AI workloadsLPDDR for on-device AI inference
Confidence Tier
DerivedCalculated from reported segments primarily serving AI