BOISE - Micron Technology, Inc. (NASDAQ:MU), a semiconductor giant with a market capitalization of $227 billion whose shares trade near their 52-week high of $206.34, announced Wednesday that it has begun customer sampling of its new 192GB SOCAMM2 (small outline compression attached memory module) products, built with LPDDR5X memory technology for AI data centers. According to InvestingPro data, the company has demonstrated remarkable momentum with a 189% price return over the past six months.
The new modules deliver 50% more capacity than Micron's previous-generation SOCAMM in the same compact footprint, while improving power efficiency by more than 20% through the company's 1-gamma DRAM process technology. This innovation comes as Micron maintains strong financial health, with InvestingPro analysis showing robust revenue growth of 49% over the last twelve months, to $37.4 billion.
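For readers who want to see how the capacity figures fit together, the short sketch below works through the arithmetic implied by the announcement; only the 192GB module capacity and the 50% claim come from Micron, while the predecessor capacity is inferred from those numbers rather than stated here.

```python
# Back-of-the-envelope arithmetic on the capacity claim in the announcement.
# Only the 192GB module capacity and the "50% more capacity in the same
# footprint" figure come from the coverage above; the predecessor capacity
# is inferred from those numbers, not quoted directly.

socamm2_capacity_gb = 192

# 50% more capacity implies the previous-generation SOCAMM held roughly:
prev_socamm_capacity_gb = socamm2_capacity_gb / 1.5   # ~128 GB

# Same footprint plus 50% more capacity means capacity density (GB per unit
# of board area) rises by the same factor:
density_gain = socamm2_capacity_gb / prev_socamm_capacity_gb  # 1.5x

print(f"Implied previous-generation SOCAMM capacity: ~{prev_socamm_capacity_gb:.0f} GB")
print(f"Capacity per unit of board area: {density_gain:.1f}x the prior generation")
```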
According to the company’s press release, the increased capacity can reduce time to first token, a critical metric for AI system responsiveness, by more than 80% in real-time inference workloads.
The SOCAMM2 modules are designed to address growing power efficiency concerns in AI infrastructure, offering more than two-thirds better power efficiency compared to equivalent RDIMMs while occupying just one-third of the physical space.
"As AI workloads become more complex and demanding, data center servers must achieve increased efficiency, delivering more tokens for every watt of power," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit.
The modules are currently sampling to customers at capacities up to 192GB per module and speeds up to 9.6 Gbps. High-volume production will align with customer launch schedules.
Micron developed the SOCAMM2 technology through a five-year collaboration with NVIDIA, adapting low-power DRAM originally designed for mobile devices into data center-class solutions through specialized design features and enhanced testing protocols.
The company is actively participating in the JEDEC SOCAMM2 specification definition process to establish industry standards for low-power memory adoption in AI data centers. With a strong return on assets of 11.2% and 25 analysts revising earnings estimates upward, Micron continues to strengthen its position in the AI memory market. For detailed analysis and additional insights, investors can access the comprehensive Pro Research Report available on InvestingPro, which covers over 1,400 top US stocks.
In other recent news, Micron Technology Inc. reported its fourth-quarter 2025 earnings, which surpassed analysts’ expectations. The company achieved an earnings per share (EPS) of $3.03, exceeding the forecast of $2.77, and generated revenue of $11.32 billion, surpassing the anticipated $11.11 billion. In terms of analyst activity, UBS raised its price target for Micron to $245 from $225, maintaining a Buy rating due to DRAM supply shortages and a robust demand environment. Additionally, UBS had previously increased its price target to $225 from $195, citing stronger demand projections for high-bandwidth memory (HBM). The firm forecasts HBM industry demand to grow significantly in the coming years. Furthermore, Micron announced that board directors Richard M. Beyer and Mary Pat McCarthy will retire in January 2026 at the company’s annual shareholders meeting. Beyer has been with the board since 2013, while McCarthy joined in 2018.
