OpenAI may produce its own AI chips
OpenAI CEO Sam Altman has made acquiring more artificial intelligence chips a top priority for the company and has complained about short supply in a market where Nvidia holds a share of more than 80 percent. OpenAI's sensitivity on this issue is understandable: training the company's AI models demands ever more powerful hardware, and buying additional AI chips already means incurring eye-watering costs.
Since 2020, OpenAI has developed its generative AI technologies on a massive supercomputer built by Microsoft, one of its biggest backers, using 10,000 Nvidia GPUs. But development is only part of the expense; keeping these models running requires continuously burning money. If ChatGPT queries grew to even a tenth of the scale of Google Search, OpenAI would need roughly $48.1 billion worth of GPUs up front and roughly $16 billion worth of chips per year to keep them running, according to Bernstein analyst Stacy Rasgon.
The custom chip era
Acquiring a chip company could accelerate OpenAI's effort to build its own chip, as Amazon's 2015 acquisition of Annapurna Labs did for Amazon. According to sources close to the company, OpenAI is exploring exactly this route, though the identity of the company it aims to acquire remains unclear. Even if OpenAI moves ahead with a custom chip, whether built in-house or gained through an acquisition, the effort would likely take several years, leaving the company dependent on vendors such as Nvidia and AMD in the meantime.