
Advanced Micro Devices (AMD) is gearing up for one of the most ambitious growth phases in its history, as soaring global demand for artificial intelligence hardware reshapes the semiconductor industry. Chief Executive Lisa Su told investors on Tuesday that AMD expects its overall revenue to grow around 35% annually over the next three to five years, powered by what she called “insatiable” demand for AI computing chips.
According to Su, the company’s AI data center division will be the main engine of that growth, projected to expand about 80% per year through 2027. That surge could push AI chip sales into the tens of billions of dollars, solidifying AMD’s place as a critical supplier in the global race to build AI infrastructure.
“This is what we see as our potential given the customer traction — both with the announced customers, as well as those working very closely with us,” Su said during AMD’s first Financial Analyst Day since 2022.
Nvidia currently dominates the AI accelerator market with more than 90% share, a dominance reflected in its market capitalization of roughly $4.6 trillion, compared with AMD’s $387 billion. But AMD believes it can capture a double-digit share of the lucrative data center AI market within five years, narrowing the gap in one of the tech world’s most valuable battlegrounds.
AMD’s growth plan rests heavily on high-profile partnerships and new technology launches. In October, the company announced a multi-year deal with OpenAI to supply billions of dollars’ worth of Instinct AI chips, starting in 2026. The initial deployment will total 1 gigawatt of computing capacity, enough to support next-generation large language models such as those behind ChatGPT.
As part of the collaboration, OpenAI may take up to a 10% equity stake in AMD, signaling deep strategic alignment between the two companies. Su also highlighted long-term partnerships with Meta and Oracle, both of which are expanding their AI cloud infrastructure using AMD hardware.
The company’s upcoming Instinct MI400X AI chips, expected to launch next year, will feature a “rack-scale” design allowing up to 72 chips to operate as a single system, a critical capability for training the largest and most complex AI models. If successful, this would mark the first time AMD matches Nvidia’s multi-chip rack architecture, which Nvidia has offered for three product generations.
During the analyst event, AMD said it expects gross margins between 55% and 58% in the coming years — a figure that exceeded Wall Street expectations. Despite a brief 3% dip in after-hours trading, investor sentiment remains optimistic, with AMD shares nearly doubling in 2025 as enthusiasm around AI technology intensifies.
Su revealed that AMD now sees the total addressable market for AI data center components reaching $1 trillion annually by 2030, up from a previous forecast of $500 billion by 2028. This revised projection factors in both graphics processing units (GPUs) and central processing units (CPUs), highlighting AMD’s dual focus on AI accelerators and traditional computing power.
In fiscal 2024, AMD generated $5 billion in AI chip sales, a baseline that underscores how much room the segment has to grow relative to its broader business, which also includes gaming consoles, networking hardware, and custom processors.
While much of AMD’s spotlight remains on artificial intelligence, Su emphasized that the company’s traditional businesses, including its Epyc CPUs, which compete directly with Intel’s server chips and Arm-based processors, continue to grow robustly.
“The other message we want to leave you with today,” Su said, “is that every other part of our business is firing on all cylinders, and that’s actually a very nice place to be.”
With surging demand for AI infrastructure, strengthening partnerships with top tech firms, and a clear roadmap for innovation, AMD is positioning itself as the most credible challenger to Nvidia’s supremacy in the AI chip market — a challenge that could redefine the semiconductor industry over the next decade.
