AMD CEO Lisa Su unveiled an ambitious long-term forecast on Tuesday, projecting that the company’s overall revenue will surge by around 35% annually over the next three to five years, propelled by what she called the “insatiable demand” for artificial intelligence chips. Su emphasized that the lion’s share of this growth will come from AMD’s AI data center business, which is expected to expand at a staggering 80% annual rate, setting the stage for tens of billions of dollars in sales by 2027.
“This is what we see as our potential given the customer traction, both with the announced customers, as well as customers that are currently working very closely with us,” Su explained to analysts during AMD’s first Financial Analyst Day since 2022. The event marked a pivotal moment for the chipmaker, showcasing its rising influence in the global AI infrastructure race.
Su further noted that AMD is on track to capture a double-digit market share in data center AI chips over the next few years—an extraordinary goal considering Nvidia’s current dominance of over 90% of the market. Despite this optimism, AMD’s stock fell by 3% in extended trading following the announcement, though it partially recovered as investors digested the company’s improved gross margin forecast of 55–58%, a figure that beat analyst expectations.
Currently, Nvidia’s dominance—backed by a market capitalization exceeding $4.6 trillion—dwarfs AMD’s $387 billion valuation, but AMD’s strategy suggests it is positioning itself as the only credible alternative to Nvidia in the AI chip ecosystem.
AI Partnerships, Market Strategy, and the Race to Catch Nvidia
At the center of AMD’s momentum lies a series of strategic partnerships with industry giants that are reshaping the competitive landscape. One of the most significant is its multi-billion-dollar agreement with OpenAI, announced in October. Under this deal, AMD will supply billions of dollars’ worth of Instinct AI chips to OpenAI over multiple years, beginning in 2026, when AMD is expected to deliver systems capable of consuming up to one gigawatt of power.
As part of this collaboration, OpenAI may acquire a 10% equity stake in AMD, symbolizing not just a commercial alliance but also a deep technological partnership. OpenAI is reportedly helping AMD optimize its next-generation AI systems, built around the forthcoming Instinct MI400X chips, slated for launch next year. These chips represent AMD’s response to Nvidia’s latest GPU architectures, and Su said they will enable “rack-scale” configurations, where 72 GPUs operate together as one cohesive system—a crucial feature for running massive AI models efficiently.
If AMD succeeds with this architecture, it would finally close the technological gap with Nvidia’s rack-scale systems, which have been in use for three product generations. Such advancements would not only elevate AMD’s competitiveness but also diversify the AI hardware market, giving hyperscalers and enterprises greater choice and negotiating power.
Su underscored the scale of the opportunity ahead: AMD now estimates the AI data center parts and systems market to reach $1 trillion annually by 2030, growing at a compound annual rate of 40%. This figure marks a dramatic revision from AMD’s prior projection of $500 billion by 2028, reflecting how swiftly AI-driven infrastructure spending is accelerating. Importantly, the new forecast incorporates not just GPUs but also central processing units (CPUs)—a category where AMD’s Epyc line remains its strongest revenue contributor.
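To put that 40% compound annual growth rate in perspective, the short Python sketch below simply compounds a starting market size forward year by year. The 2025 base figure used here is an assumption back-solved from AMD’s stated endpoint ($1 trillion by 2030), not a number AMD has disclosed.

```python
# Back-of-envelope sketch of compound annual growth (CAGR).
# The 2025 base value is an assumption back-solved from the public
# targets cited above ($1 trillion by 2030 at ~40% CAGR); it is not
# a figure disclosed by AMD.

BASE_YEAR = 2025
BASE_MARKET_BILLIONS = 186.0   # assumed starting market size, in $B
CAGR = 0.40                    # 40% compound annual growth rate

def project(value: float, rate: float, years: int) -> float:
    """Compound `value` forward by `rate` for `years` years."""
    return value * (1 + rate) ** years

for year in range(BASE_YEAR, 2031):
    size = project(BASE_MARKET_BILLIONS, CAGR, year - BASE_YEAR)
    print(f"{year}: ~${size:,.0f}B")
# By 2030 the projection lands near $1,000B, i.e. the $1 trillion
# market size referenced in AMD's forecast.
```

The same arithmetic shows why the revision from $500 billion by 2028 to $1 trillion by 2030 is consistent with a steady growth-rate assumption rather than a change in trajectory: two additional years of compounding at roughly 40% nearly doubles the endpoint.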
Beyond AI, AMD continues to grow across its traditional product lines. Su highlighted that every major segment of AMD’s business—from Epyc CPUs and gaming console chips to networking components—is performing robustly. “The other message that we want to leave you with today is every other part of our business is firing on all cylinders, and that’s actually a very nice place to be,” Su remarked confidently.
AMD’s Epyc CPUs remain central to the company’s profitability, competing head-to-head with Intel’s Xeon processors and emerging Arm-based server chips. Meanwhile, its semi-custom division continues to power consoles for partners like Sony and Microsoft, reinforcing AMD’s diverse revenue base.
AMD’s shares have nearly doubled in 2025, underscoring investor optimism about the company’s strategic pivot toward AI. As global corporations—ranging from Meta and Oracle to OpenAI—pour hundreds of billions of dollars into AI infrastructure, AMD’s emergence as the primary alternative to Nvidia could redefine the competitive balance of the chip industry.
With its escalating R&D investment, strong partnerships, and deepening penetration in both AI and traditional computing markets, AMD is positioning itself not just as a fast follower but as a co-architect of the next era of intelligent computing—one where power, scalability, and openness determine who leads the trillion-dollar race in AI infrastructure.
This information was collected from NBC News and MSN.