
AMD Launches New Chip As AI Challenge To Nvidia Heats Up

Advanced Micro Devices (AMD) on Wednesday launched its clearest challenge yet to the dominance of Nvidia (NVDA) in the red-hot market for chips that power artificial intelligence applications. AMD stock wavered following the announcement.


At a company event Wednesday, AMD launched its Instinct MI300 Series accelerator family. As previewed earlier this year, the chips are designed to handle the massive workloads in AI applications.

“This pace of innovation is faster than anything I’ve ever seen before,” Chief Executive Lisa Su said at the San Jose, Calif., event. “For us at AMD, we are so well positioned to power that end-to-end infrastructure that defines this new AI era.” Su forecast that the market for AI accelerator chips in data centers will surge 70% annually to reach more than $400 billion in 2027.

The company has projected more than $2 billion in 2024 sales from the new chips.

The rise of AI, including generative AI applications such as OpenAI’s ChatGPT and Google’s Bard, sparked strong demand for chips that offer greater processing power. Nvidia, known for powerful chips used for gaming and Hollywood special effects, took the early lead when AI started to become a big trend a decade ago. Now, AMD is looking to challenge Nvidia, whose H100 AI chip is considered the market leader.

On the stock market today, AMD stock slipped 1.6% to 116.45 in recent action. NVDA stock, meanwhile, was down more than 2% at 456.22.

AMD Stock: Focus On Inference

CEO Su said AMD’s newest chip has industry-leading memory capacity and bandwidth. The chip is equal to existing competition in running training for data models, Su said. But she said AMD’s chip is expected to outperform rivals in inference, or running AI applications after the models have been trained.

“I have to say this is truly the most advanced product we’ve ever built,” Su said. “And it is the most advanced AI accelerator in the industry.”

Su also updated AMD’s projection for the market for AI chips for data centers, which the company had estimated would grow from $30 billion in 2023 to $150 billion by 2027. The company now sees the total market instead reaching more than $400 billion.

“It’s really clear that the demand is just growing much, much faster” than anticipated, Su said.

The AMD event also featured executives from Microsoft (MSFT), Meta (META) and Oracle (ORCL) who shared how their companies are working with the chip giant. Microsoft said AMD’s new chips are available for customers of its cloud business to evaluate starting today.

Challenging Nvidia

Nvidia’s processors run most of the data centers that power generative artificial intelligence products. That has helped drive back-to-back quarters of triple-digit sales growth for Nvidia. Further, NVDA stock’s 215% gain this year is tops in the S&P 500.

But challengers are coming. Along with AMD’s announcement Wednesday, Intel (INTC) next week will host an event called AI Everywhere to highlight new processors for data centers.

AMD stock has gained about 70% this year, helped by its push into AI data center products.

Further, the company’s latest earnings report broke a streak of three quarters with year-over-year earnings declines for AMD. AMD stock gained nearly 10% the day after it published its results on Oct. 31, headlined by the company’s projection of more than $2 billion in 2024 sales for its MI300 AI accelerators.

The 2024 sales projection “supports the view that AMD is well-positioned to participate in the large and growing Gen AI compute market,” wrote Goldman Sachs analyst Toshiya Hari in a Nov. 1 client note.

Further, Raymond James analysts led by Srini Pajjuri wrote in a client note previewing Wednesday’s event that “We see no reason why AMD can’t capture 10%-20% share of the $100 billion+ AI accelerator market longer term, which could mean double-digit revenue growth and margin expansion for the next 2-3 years.”

