Investing.com -- South Korea’s SK Hynix expects the market for high-bandwidth memory (HBM) chips used in artificial intelligence to grow 30% annually until 2030, according to a senior company executive.
In an interview, SK Hynix’s head of HBM business planning, Choi Joon-yong, expressed confidence in the sector despite concerns about rising prices in the memory chip market.
"AI demand from the end user is pretty much, very firm and strong," Choi said.
The company projects the custom HBM market will reach tens of billions of dollars by 2030. HBM, first produced in 2013, stacks memory dies vertically to save space and reduce power consumption while handling the large volumes of data generated by complex AI applications.
Choi noted that the billions of dollars in AI capital spending projected by cloud computing giants like Amazon (NASDAQ:AMZN), Microsoft (NASDAQ:MSFT) and Alphabet (NASDAQ:GOOGL)’s Google will likely be revised upward, which would benefit the HBM market.
"The relationship between AI build-outs and HBM purchases is very straightforward," Choi explained, adding that SK Hynix’s growth projections are conservative and account for constraints such as available energy.
The memory business is undergoing significant strategic changes. Next-generation HBM4 chips being developed by SK Hynix and competitors Samsung Electronics (KS:005930) and Micron Technology (NASDAQ:MU) will include customer-specific logic dies that help manage memory functions.