
South Korea’s SK Hynix has announced that its entire 2026 production capacity for advanced memory chips has already been sold out, highlighting the explosive and sustained demand for artificial intelligence (AI) infrastructure worldwide. The company’s third-quarter results revealed record-breaking earnings, driven by surging sales of high-bandwidth memory (HBM) used in AI data centers, particularly by its biggest client, Nvidia.
For the September quarter, SK Hynix reported revenue of 24.45 trillion won ($17.13 billion), up nearly 39% year over year, while operating profit surged 62% to 11.38 trillion won. It was the first time the company’s quarterly profit surpassed 10 trillion won, underscoring its dominance in the booming AI memory market.
Quarter-on-quarter, revenue climbed 10%, while operating income jumped 24%. Following the earnings release, SK Hynix shares soared nearly 5% in Seoul trading, pushing its year-to-date gain above 210% — one of the most impressive stock rallies among global semiconductor makers in 2025.
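As a back-of-the-envelope check, the reported growth rates imply rough year-ago and prior-quarter baselines. The sketch below derives those implied figures from the numbers in the article; the baselines are calculated, not company-reported, and are approximate because the growth rates are rounded:

```python
# Implied baselines from SK Hynix's reported Q3 figures (trillions of won).
# Growth rates are as reported: +39% YoY revenue, +62% YoY operating profit,
# +10% QoQ revenue, +24% QoQ operating income.

q3_revenue = 24.45      # September-quarter revenue
q3_op_profit = 11.38    # September-quarter operating profit

# Implied year-ago quarter
rev_year_ago = q3_revenue / 1.39     # ~17.6T won
op_year_ago = q3_op_profit / 1.62    # ~7.0T won

# Implied prior quarter
rev_prev_q = q3_revenue / 1.10       # ~22.2T won
op_prev_q = q3_op_profit / 1.24      # ~9.2T won

print(f"Implied year-ago revenue:      ~{rev_year_ago:.1f}T won")
print(f"Implied year-ago op profit:    ~{op_year_ago:.1f}T won")
print(f"Implied prior-quarter revenue: ~{rev_prev_q:.1f}T won")
print(f"Implied prior-quarter profit:  ~{op_prev_q:.1f}T won")
```

The implied year-ago operating profit of roughly 7 trillion won shows why the 10-trillion-won milestone is notable: profit grew by more than 4 trillion won in a single year.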
SK Hynix produces memory chips used across a range of devices, from smartphones and laptops to cloud servers and supercomputers. But its biggest growth engine has been HBM, which plays a critical role in powering AI models such as ChatGPT and Google Gemini.
“With customer investments in AI infrastructure expanding at an unprecedented pace, our premium product mix has driven another quarter of record-breaking results,” SK Hynix said in its statement.
The company confirmed that its HBM product line has been fully booked through 2026, with demand expected to exceed supply through at least 2027. SK Hynix plans to ramp up production capacity at its M15X and Cheongju fabs while continuing work on its next-generation HBM4 chips, expected to enter mass production later this year.
HBM chips belong to the broader DRAM (Dynamic Random Access Memory) family but stack multiple memory dies vertically, delivering far higher bandwidth than conventional memory. That makes them crucial in AI training and inference workloads, where massive datasets are processed in parallel.
SK Hynix currently holds a commanding 64% share of the global HBM market, according to Counterpoint Research, and a 38% share of the DRAM market by revenue — surpassing rival Samsung Electronics earlier this year. The global HBM market itself grew 178% year-over-year in Q2, reaching record levels as AI chipmakers like Nvidia, AMD, and Intel raced to secure supply.
SK Hynix’s early lead in developing advanced HBM3E and HBM4 products has positioned it as the primary supplier for Nvidia, which integrates the chips into its most powerful AI GPUs such as the H100 and Blackwell B200.
However, competition is heating up. Micron Technology in the U.S. has started delivering its own HBM3E chips to Nvidia, while Samsung Electronics recently cleared Nvidia’s qualification tests for next-generation memory modules.
“With AI transforming every aspect of the tech ecosystem, memory technology is entering a new era,” said Kim Woohyun, Chief Financial Officer of SK Hynix. “We will continue to lead in AI memory by delivering innovative, high-performance products and expanding capacity to meet global demand.”
Analysts expect the HBM market to reach $43 billion by 2027, as AI adoption accelerates in sectors like cloud computing, autonomous vehicles, and data analytics.
“SK Hynix is likely to maintain a strong 60% global HBM share in 2026,” said Ray Wang, semiconductor analyst at Futurum Group, citing the company’s strong customer relationships with Nvidia, Google, and Amazon Web Services.
SK Hynix’s dominance underscores how memory manufacturers have become central to the AI revolution, with chip capacity now a strategic global resource. As companies and governments worldwide invest heavily in AI infrastructure, HBM suppliers like SK Hynix will continue to see unprecedented order backlogs and record-breaking profitability.