The demand for Nvidia’s chips illustrates the computing power needed for the booming generative AI industry. AI bots like OpenAI’s ChatGPT and Google’s Bard can write emails, haikus, and essays in human-sounding prose, but they require a lot of computing power.

The chips are not cheap, either. Microsoft, for instance, reportedly uses tens of thousands of Nvidia's A100 graphics chips to power ChatGPT, at a cost of several hundred million dollars, according to Bloomberg.

“The computer industry is going through two simultaneous transitions—accelerated computing and generative AI,” said Jensen Huang, founder and CEO of Nvidia.

Both trends are driving Nvidia's results. Revenue from its data center business reached a quarterly record of $4.28 billion, up 18% from the previous quarter and 14% from the same period a year earlier.

Why are Nvidia’s chips so sought after?

Nvidia owns most of the market for graphics processing units, or GPUs. Best known for graphics and video rendering, GPUs are increasingly used for AI because they can process many pieces of data in parallel.
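
For a sense of what "processing many pieces of data in parallel" looks like in practice, here is a minimal, hypothetical Python sketch. It uses NumPy on a CPU purely as a stand-in (the array sizes and names are invented, and this is not Nvidia's or any AI lab's actual code): a single matrix multiplication applies the same operation to an entire batch of inputs at once, which is the kind of batched math GPUs spread across thousands of cores.

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration.
weights = np.random.rand(1024, 1024)  # stand-in for one layer of model weights
inputs = np.random.rand(1024, 512)    # a batch of 512 input vectors

# One matrix multiplication processes every vector in the batch simultaneously.
# On a GPU, frameworks such as PyTorch or CuPy run this same batched math
# across thousands of cores in parallel, which is why GPUs suit AI workloads.
outputs = weights @ inputs
print(outputs.shape)  # (1024, 512)
```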

The global AI chip market is projected to grow from $17 billion in 2022 to $227 billion by 2032, according to Precedence Research, a market research firm.

The future of the world’s data centers

On a May 24 conference call discussing the quarterly results, Huang said the world's data centers are shifting toward accelerated computing, adding that Nvidia has been preparing for this moment for the past 15 years.
