
OpenAI Mulls Manufacturing Its Own AI Chips

The company behind ChatGPT is exploring alternatives to Nvidia's accelerators, including making its own.
By Josh Norem
Nvidia Hopper. Credit: Nvidia

It's no secret that Nvidia rules the roost when it comes to the hardware required to train large language models for AI applications. However, with demand far exceeding supply, prices have soared, and the queue for its chips rivals the one at your local vehicle registration office. This situation has reportedly caused OpenAI, the company behind ChatGPT, to look for alternatives to Nvidia's hardware, including making its own chips.

A report from Reuters details OpenAI's plans to solve the hardware gridlock the AI industry faces. One option is to forge its own path without Nvidia's hardware, which it would likely do by purchasing one of Nvidia's competitors so it could create its own chips in the future. The report states that OpenAI has gone as far as evaluating a potential acquisition target but has not moved beyond that stage. Aside from making its own chips, the options under consideration include working more closely with other chipmakers, including Nvidia, or diversifying its chip supply beyond Nvidia.

The Precious
What every company in the AI world currently desires: Nvidia H100s as far as the eye can see. Credit: Nvidia

The report states that OpenAI's CEO, Sam Altman, has made acquiring more AI chips the company's top priority. That's easier said than done. According to TSMC, Nvidia won't be able to make enough of its H100 AI chips to satisfy demand for another year and a half due to limits on manufacturing capacity. OpenAI reportedly used 10,000 Nvidia GPUs to train ChatGPT, but as it tries to scale, it's finding that difficult due to tight supply and "eye-watering" prices.

According to Reuters, if OpenAI's query volume grew to one-tenth of Google's over time, it would require roughly $48 billion worth of GPUs to reach that scale, and it would then need to spend about $16 billion on chips annually just to keep up with demand. This is an existential problem for the company and the industry at large. It's also good news for Nvidia, which reportedly earns margins of up to 1,000% on every H100 it sells.
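For a rough sense of what those dollar figures mean in hardware terms, here's a back-of-envelope sketch. The roughly $30,000 per-H100 price below is an assumption chosen purely for illustration; actual pricing varies and is not part of the Reuters report.

# Back-of-envelope sketch of the Reuters figures (per-GPU price is an assumption)
H100_PRICE_USD = 30_000        # assumed street price per H100; real prices vary widely
INITIAL_GPU_SPEND = 48e9       # Reuters: ~$48 billion in GPUs to reach one-tenth of Google's query volume
ANNUAL_CHIP_SPEND = 16e9       # Reuters: ~$16 billion per year thereafter to keep up with demand

initial_gpus = INITIAL_GPU_SPEND / H100_PRICE_USD
annual_gpus = ANNUAL_CHIP_SPEND / H100_PRICE_USD

print(f"Initial buildout: roughly {initial_gpus:,.0f} GPUs")      # ~1,600,000
print(f"Ongoing purchases: roughly {annual_gpus:,.0f} GPUs/year") # ~533,000

Under that assumed price, the buildout alone would amount to well over a million accelerators, which helps explain why OpenAI is weighing alternatives to buying everything from Nvidia.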

The path OpenAI is examining isn't new, either: Amazon, Google, and Meta already use custom chips designed for their specific needs, and OpenAI's biggest partner, Microsoft, is reportedly working on its own custom silicon. However, it would take OpenAI years to get new silicon designed, manufactured, and installed, so it'll still have to buy chips from Nvidia or AMD in the interim.
