How the AI revolution made Nvidia the world’s most valuable chip maker
• Nvidia is the world’s favorite maker of AI chips.
• It has revealed its latest chip, reportedly up to nine times faster for AI training than its predecessor.
• Only AMD stands any chance of challenging Nvidia’s AI chip supremacy.
If there is one company that dominates the technology industry at the moment – and possibly for the foreseeable future – it is Nvidia Corp. The tech giant’s products, specifically AI chips, are the industry’s most precious resource. So much so that Nvidia has surged past US$1 trillion in market capitalization to become the world’s most valuable chipmaker, signaling an apparent boom in AI demand.
The company has been the biggest beneficiary of the rise of ChatGPT and other generative artificial intelligence (AI) apps, virtually all of which are powered by its very powerful graphics processors. Before that, Nvidia’s chips were also extensively used to power traditional AI systems, with demand for the chips seeing an uptick during the boom in cryptocurrency, as that industry’s systems also rely on their processing power.
Nvidia has cornered the market on AI with its chips – and it wasn’t an effort that began in the post-ChatGPT world. Nvidia’s founder and CEO, Jensen Huang, was bullish on AI so early that, for years, his certainty mostly felt like lofty rhetoric or a marketing gimmick.
How did Nvidia reach the inflection point on AI chips before anyone else?
Nvidia, led by Huang, has always been at the edge of development – with gaming, then machine learning, followed by cryptocurrency mining, data centers, and now AI. Over the last decade, the chip giant developed a unique portfolio of hardware and software offerings aimed at democratizing AI, positioning the company to benefit from the adoption of AI workloads.
But the real turning point was 2017, when Nvidia started tweaking GPUs to handle specific AI calculations. That same year, Nvidia, which typically sold chips or circuit boards for other companies’ systems, also began selling complete computers to carry out AI tasks more efficiently.
Some of its systems are now the size of supercomputers, which it assembles and operates using proprietary networking technology and thousands of GPUs. Such hardware may run for weeks to train the latest AI models. Rivals found it tough to compete with a company that sold computers, software, cloud services, trained AI models, and processors.
Tech giants like Google, Microsoft, Facebook, and Amazon were already buying more Nvidia chips for their data centers by 2017. Institutions like Massachusetts General Hospital use Nvidia chips to spot anomalies in medical images like CT scans.
By then, Tesla had announced it would install Nvidia GPUs in all its cars to enable autonomous driving. Nvidia chips provide the horsepower underlying virtual reality headsets like those brought to market by Facebook and HTC. Nvidia was gaining a reputation for consistently delivering faster chips every couple of years.
Looking back, it is safe to say that over more than ten years, Nvidia has built a nearly impregnable lead in producing chips that can perform complex AI tasks like image, facial, and speech recognition, as well as generating text for chatbots like OpenAI’s ChatGPT.
The biggest upside? Nvidia achieved its dominance by recognizing the AI trend early, tailoring its chips to those tasks, and then developing critical pieces of software that aid in AI development. As The New York Times (NYT) puts it, Nvidia has gradually turned, to all intents and purposes, into a one-stop shop for AI development.
According to the research firm Omdia, while Google, Amazon, Meta, IBM, and others have also produced AI chips, Nvidia today accounts for more than 70% of the world’s AI chip sales. It holds an even more prominent position in training generative AI models. Microsoft alone spent hundreds of millions of dollars on tens of thousands of Nvidia A100 chips to help build ChatGPT.
“By May this year, the company’s status as the most visible winner of the AI revolution became clear when it projected a 64% leap in quarterly revenue, far more than Wall Street had expected,” NYT’s Don Clark said in his article.
The “iPhone moment” of generative AI: Nvidia’s H100 AI chips
“We are at the iPhone moment for AI,” Huang said at Nvidia’s GTC conference in March this year. He also quickly pointed out Nvidia’s role at the start of this AI wave: he delivered a DGX AI supercomputer to OpenAI in 2016, hardware that was ultimately used to build ChatGPT.
At the same GTC conference, Nvidia unveiled the H100, the successor to its A100 GPUs, which have been the foundation of modern large language model development efforts. Nvidia touted the H100 as up to nine times faster for AI training and 30 times faster for inference than the A100.
While Nvidia doesn’t discuss prices or chip allocation policies, industry executives and analysts said each H100 costs between US$15,000 and more than US$40,000, depending on the packaging and other factors — roughly two to three times more than the predecessor A100 chip.
Yet the H100 continues to face a massive supply crunch amid skyrocketing demand, even as shortages ease across most other chip categories. Reuters reported that analysts believe Nvidia can meet only half the demand, and that the H100 is selling for double its original price of US$20,000. The trend, according to Reuters, could go on for several quarters.
A significant share of the demand surge is coming from China, where companies are stockpiling chips ahead of tightening US chip export curbs. A report by the Financial Times indicated that China’s leading internet companies had placed orders for US$5 billion worth of chips from Nvidia. The supply-demand divide will inevitably lead some buyers to turn to Nvidia’s rival AMD, which is looking to challenge the company’s strongest offering for AI workloads with its MI300X chip.
With Nvidia shares having tripled in value this year – adding more than US$700 billion to the company’s market valuation and making it the first trillion-dollar chip firm – investors expect the chip designer to forecast quarterly revenue above estimates when it reports results on August 23.
Nvidia’s second-quarter earnings will be the AI hype cycle’s biggest test. “What Nvidia reports in its upcoming earnings release is going to be a barometer for the whole AI hype,” Forrester analyst Glenn O’Donnell told Yahoo. “I anticipate that the results will look outstanding because demand is so high, meaning Nvidia can command even higher margins than otherwise.”
In terms of supply, Nvidia won’t be the only provider in town, but what sets the company apart from its competitors is the comfortable lead and patent-protected technology it built up at such an early stage. Some market experts believe that, for now at least, AMD is the ‘only viable alternative’ to Nvidia’s AI chips.