SK Hynix forecasts that the market for a specialized form of memory chip designed for artificial intelligence will grow 30% a year until 2030, a senior executive said in an interview with Reuters.

The upbeat projection for global growth in high-bandwidth memory (HBM) used in AI brushes off concerns about rising price pressure in a sector whose products have for decades been traded like commodities such as oil or coal.

"AI demand from the end user is pretty much, very firm and strong," said Choi Joon-yong, head of HBM business planning at SK Hynix.

The relationship between AI build-outs and HBM purchases is "very straightforward," Choi said, with a clear correlation between the two. SK Hynix's projections are conservative, he added, and factor in constraints such as available energy.

But the memory business is undergoing a significant strategic shift during this period as well. HBM, a type of dynamic random access memory (DRAM) first produced in 2013, involves stacking chips vertically to save space and reduce power consumption, helping to process the large volumes of data generated by complex AI applications.

SK Hynix expects the market for custom HBM to grow to tens of billions of dollars by 2030, Choi said.

Due to technological changes in the way SK Hynix and rivals such as Micron Technology and Samsung Electronics build next-generation HBM4, their products include a customer-specific logic die, or "base die", that helps manage the memory.

That means one supplier's memory product can no longer be easily swapped out for a rival's nearly identical chip.