Nvidia CEO Jensen Huang unveiled a 'very big' new chip at his company's 'Woodstock of AI'

Nvidia's closely watched annual conference came as the company has led the way in a Wall Street frenzy for AI stocks to start 2024

Nvidia CEO Jensen Huang during his keynote address at Nvidia’s GTC conference on Monday in San Jose, California.
Photo: Justin Sullivan (Getty Images)

Nvidia CEO Jensen Huang unveiled the AI chipmaker’s highly anticipated new processor on Monday, saying tech giants like Microsoft and Google are already eagerly awaiting its arrival.

Huang made the announcement during the company’s closely watched GPU Technology Conference, or GTC, which has been dubbed the “Woodstock of AI” by employees and analysts alike. The annual conference in San Jose, California, came as Nvidia has led the way in a Wall Street frenzy for AI stocks to start 2024 — blowing past sky-high earnings expectations, becoming the first chipmaker to reach a $2 trillion market cap, and soaring past companies including Amazon to become the third-most valuable company in the world.

Nvidia’s rise has been fueled by its $40,000 H100 chips, graphics processing units (GPUs) that power the large language models behind generative AI chatbots like OpenAI’s ChatGPT.

On Monday, Huang unveiled Nvidia’s next-generation GPU, “Blackwell,” named after mathematician David Blackwell, the first Black scholar inducted into the National Academy of Sciences. The Blackwell chip packs 208 billion transistors and will be able to handle AI models and queries more quickly than its predecessors, Huang said. It succeeds Nvidia’s hugely in-demand H100 chip, whose Hopper architecture was named for computer scientist Grace Hopper and which Huang called “the most advanced GPU in the world in production today.”

“Hopper is fantastic, but we need bigger GPUs,” he said. “So ladies and gentlemen, I’d like to introduce you to a very big GPU.”

Microsoft, Google parent Alphabet, and Oracle are among the tech giants preparing for Blackwell, Huang said. Microsoft and Google are two of Nvidia’s largest customers for its H100 chips.

Nvidia stock was largely flat Monday, but it’s up more than 83% so far this year and more than 241% over the last 12 months.

Read more: Is Nvidia stock in a bubble that will burst? Wall Street can’t make up its mind

During his keynote address at the Nvidia conference Monday, Huang announced new partnerships with engineering software makers Cadence, Ansys, and Synopsys; Cadence, he said, is building a supercomputer with Nvidia’s GPUs. Nvidia’s AI foundry is also working with SAP, ServiceNow, and Snowflake, Huang said.

Huang also gave a shout-out to Dell founder and CEO Michael Dell, who was in the audience. Dell is partnering with Nvidia to expand its AI offerings to customers, including new enterprise data storage built on Nvidia’s AI infrastructure.

“Every company will need to build AI factories,” Huang said. “And it turns out that Michael is here, and he’s happy to take your order.”

Huang also announced that Nvidia is creating a digital model of the Earth to predict weather patterns using its new generative AI model, CorrDiff, which can produce images with 12.5 times higher resolution than current models.

And Huang said that Nvidia’s computing platform Omniverse now streams to Apple’s Vision Pro headset, and that Chinese EV maker BYD is adopting Thor, Nvidia’s next-generation in-vehicle computer.

Huang wrapped up his two-hour keynote accompanied by two robots, the orange and green Star Wars BD droids, which he said are powered by Nvidia’s Jetson computer systems and learned to walk in Nvidia’s Isaac Sim simulator.

Throughout the week at GTC, Nvidia researchers and executives will be joined by power players in the AI industry — including Brad Lightcap, chief operating officer of OpenAI, and Arthur Mensch, chief executive of French OpenAI rival Mistral AI — to deliver sessions on topics ranging from innovation to ethics.

Microsoft and Meta are Nvidia’s largest customers for its H100 chip, with both tech giants spending $9 billion on the chips in 2023. Alphabet, Amazon, and Oracle were also top spenders on the chips last year.

But the frenzy over Nvidia’s H100 chips has prompted worries about shortages, and competitors keen to stay ahead in the AI race have started building their own versions of the chips. Amazon has developed two chips called Inferentia and Trainium, while Google has been working on its Tensor Processing Units.