Nvidia can’t keep up with demand
In particular, the rapid rise of ChatGPT and other generative artificial intelligence models means Nvidia's recent investments in this field are paying off. According to reports, companies from all corners of the tech industry are queuing up to get Nvidia's latest GPUs, including the A100 and H100.
However, that queue is unlikely to shrink any time soon, as multiple reports indicate that Nvidia cannot keep up with demand for its artificial intelligence chips. The Nvidia A100 uses TSMC's 7nm process node, while the H100, which started shipping last May, is built on TSMC's 4N process node (an optimized version of the 5nm node designed for Nvidia). Since launch, the company has seen enormous demand for its H100 GPUs and DGX H100 servers, and while it is working to keep pace, the recent artificial intelligence boom has stretched its supply even further.
Prices took flight
Nvidia is reportedly prioritizing orders from non-Chinese technology partners, which means some orders in China may be delayed until December. The artificial intelligence industry is currently seen as Nvidia's most lucrative market, and the company is expected to focus on it even more. If a good balance is not struck, the supply of consumer-grade graphics cards may shrink, and yet another graphics card shortage could follow.