BOISE, Idaho - Micron Technology, Inc. (NASDAQ:MU), a prominent player in the semiconductor industry with a market capitalization of $129 billion, announced Thursday that its HBM3E 36GB 12-high memory solution will be integrated into AMD’s upcoming Instinct MI350 Series GPU platforms. According to InvestingPro data, Micron has demonstrated strong momentum with a 9% return over the past week.
The new AMD Instinct MI350 Series, built on AMD’s CDNA 4 architecture, will feature 288GB of Micron’s high-bandwidth memory per GPU, delivering up to 8 TB/s bandwidth. This configuration enables support for AI models with up to 520 billion parameters on a single GPU.
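The 520-billion-parameter figure follows from simple capacity arithmetic: at FP4 precision each parameter occupies half a byte, so the weights of such a model come to roughly 260GB, fitting within the 288GB of HBM3E per GPU. A rough back-of-the-envelope check (using decimal gigabytes, and ignoring the additional memory real deployments need for activations and KV caches):

```python
# Sanity check: do 520 billion FP4 parameters fit in 288 GB of HBM3E?
# Assumes FP4 = 4 bits = 0.5 bytes per parameter, decimal GB (1e9 bytes).
params = 520e9
bytes_per_param = 0.5  # FP4 precision
model_gb = params * bytes_per_param / 1e9
print(f"Model weights: {model_gb:.0f} GB (capacity: 288 GB)")  # 260 GB
```

This leaves roughly 28GB of headroom per GPU for runtime state, which is why the model-size claim is stated as an upper bound.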
In a full platform configuration, the MI350 Series offers up to 2.3TB of HBM3E memory and achieves peak theoretical performance of up to 161 PFLOPS at FP4 precision, according to the company’s press release.
"Micron’s HBM3E industry leadership and technology innovations provide improved TCO benefits to end customers with high performance for demanding AI systems," said Praveen Vaidyanathan, vice president and general manager of Cloud Memory Products at Micron. The company’s strong market position is reflected in its 71% revenue growth over the last twelve months.
Josh Friedrich, corporate vice president of AMD Instinct Product Engineering at AMD, stated that Micron’s memory solution is "instrumental in unlocking the performance and energy efficiency" of the new accelerators, helping customers "train larger AI models, speed inference and tackle complex HPC workloads."
The companies highlighted their collaboration as enabling faster time to market for AI solutions through joint engineering efforts that optimize compatibility between Micron’s memory and AMD’s GPU platforms.
Micron’s HBM3E 36GB 12-high product has been qualified on multiple leading AI platforms, the company reported.
In other recent news, Micron Technology announced a significant investment of approximately $200 billion in U.S. semiconductor manufacturing and research, which is expected to create 90,000 jobs. This investment includes $150 billion for domestic memory manufacturing and $50 billion for research and development, with plans to expand facilities in Idaho, New York, and Virginia. Micron has also secured $275 million in CHIPS Act funding for its Virginia facility and expects up to $6.4 billion in total CHIPS Act funding for its U.S. projects. Additionally, Micron has been selected by Nvidia as the first supplier for a new memory solution called SOCAMM, which is designed for AI servers in data centers. This development positions Micron ahead of competitors like Samsung and SK Hynix in this emerging market. On the analyst front, Citi raised its price target for Micron to $130, citing stronger-than-expected DRAM pricing, while UBS increased its target to $120, noting stable demand for DDR. Mizuho also adjusted its price target to $130, highlighting Micron’s potential growth in the High Bandwidth Memory market.
This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.