
AMD Could Grab 7% of the AI Market With MI300X Accelerators: Analysts

Nvidia's backlog for its H100 and A100 accelerators could leave the door open for AMD to swoop in and snag some frustrated customers.
By Josh Norem
AMD Instinct MI300. Credit: AMD

Perhaps the only downside to Nvidia's current domination of the AI market is that not every company that wants a GPU can actually get one. That situation probably won't change any time soon, as demand for AI currently seems to have no bounds and competition remains somewhat slim. However, AMD is about to bring its H100 competitor, the MI300 family, to market, consisting of the MI300A CPU+GPU product and the MI300X GPU. Now, analysts report that due to the long wait times and sky-high pricing for Nvidia hardware, many large organizations might opt for AMD's solution instead. That could allow AMD to capture up to 7% of the AI market in the near term, which would be a coup for the company in its fight against its much bigger rival.

The report on AMD's prospects in the nascent battle for AI market share comes from MyDrivers, which states that industry insiders expect AMD to begin shipping the MI300X in large quantities later this year, possibly putting a dent in Nvidia's dominance. AMD has reportedly already shipped the MI300X in small quantities for testing and will ramp up production later in the year. The primary buyers are huge companies such as Microsoft, which will use the accelerator in some of its data centers. In addition, Oracle, Meta, and even OpenAI are reportedly interested in AMD's offerings thanks to their competitive pricing and immediate availability.

The MI300X sports nine chiplets, along with 192GB of HBM3 memory. Credit: AMD

It's projected that if AMD secures several large purchases from industry titans, it could reach up to $2 billion in hardware sales in this category for 2024. Though that sounds paltry next to Nvidia, which raked in $18.4 billion in data center revenue in the last quarter alone, it would still be the fastest-selling product in AMD's history.

AMD seems to have a somewhat competitive product, at least on paper. As Wccftech notes, the MI300X offers up to 2.4X the memory capacity and 1.6X the memory bandwidth of the H100. But software obviously plays a huge role as well. Microsoft's CTO also previously predicted that AMD would become a significant player in the AI market in 2024, backing up the report from MyDrivers.
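For context, those multipliers work out as simple ratios of the published memory specs. Here's a minimal sketch, assuming the commonly cited H100 SXM figures of 80GB of HBM3 and roughly 3.35TB/s of bandwidth against the MI300X's 192GB of HBM3 at roughly 5.3TB/s (the H100 and bandwidth figures are assumptions, not taken from this article):

# Rough ratio check, assuming H100 SXM at 80GB / ~3.35TB/s and
# MI300X at 192GB / ~5.3TB/s (the H100 figures are assumptions,
# not given in this article).
h100_gb, h100_tbs = 80, 3.35
mi300x_gb, mi300x_tbs = 192, 5.3
print(f"Memory capacity:  {mi300x_gb / h100_gb:.1f}x")    # ~2.4x
print(f"Memory bandwidth: {mi300x_tbs / h100_tbs:.1f}x")  # ~1.6x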

It will be interesting to see how it all plays out, as Nvidia just tipped its hand with yesterday's announcement of Blackwell as the H100's successor. However, Blackwell isn't a GPU but a platform designed to be sold at scale, which suggests the H100 will continue to be sold alongside it for customers without gigantic budgets. Nvidia didn't mention any pricing for Blackwell, but it noted that a single DGX system contains 600,000 parts and weighs 3,000 pounds, up from the 70 pounds of the original model.
