Nvidia stock climbs as CFO says new chip to ship in 2024


FILE PHOTO: The logo of Nvidia at its corporate headquarters in Santa Clara, California, May 2022. NVIDIA/Handout via REUTERS/File Photo

SAN JOSE, California (Reuters) - Nvidia's stock climbed on Tuesday after the heavyweight chipmaker said its new flagship AI processor is expected to ship later this year and CEO Jensen Huang said he is chasing a data center market potentially greater than $250 billion.

Nvidia's stock rose nearly 2% to $901 after Huang and Chief Financial Officer Colette Kress answered questions from investors at the company's annual developer conference in San Jose, California. The shares had dipped nearly 4% earlier in the day.

"We think we're going to come to market later this year," Kress said, referring to the company's new AI chip, which the company debuted on Monday.

Huang estimated that companies operating data centers will spend more than $250 billion a year to upgrade them with accelerated computing components that Nvidia specializes in developing. He said that market was growing by as much as 25% a year.

Nvidia is shifting from selling single chips to selling total systems, potentially winning a larger chunk of spending within data centers.

"Nvidia doesn't build chips, it builds data centers," Huang said.

Called Blackwell, Nvidia's new processor combines two squares of silicon the size of the company's previous offering. Nvidia also detailed a new set of software tools to help developers sell artificial-intelligence models more easily to firms that use its technology.

Nvidia is working with contract chip manufacturer TSMC to avoid bottlenecks in packaging chips that slowed shipments of its previous flagship AI processor, Huang said.

"The volume ramp in demand happened fairly sharply last time, but this time, we've had plenty of visibility" into demand for Blackwell chips, Huang said.

Some analysts said Wall Street has already factored in the debut of the B200 Blackwell chip, which the company claims is 30 times faster at some tasks than its predecessor.

The Blackwell chip will be priced between $30,000 and $40,000, Huang told CNBC.

Huang later clarified that comment, saying Nvidia will include its new chip in larger computing systems and that prices will vary based on how much value they provide.

"The Blackwell technology shows a significant performance uplift compared to Hopper (the current flagship chip) but it's always hard to live up to the hype," said David Wagner, portfolio manager at Aptus Capital Advisors.

In a discussion about Nvidia's cooperation with South Korean chipmakers, Huang said Nvidia is qualifying Samsung Electronics' high bandwidth memory (HBM) chips.

Reuters reported last week that Samsung's HBM3 series has not yet passed Nvidia's qualification for supply deals.

Samsung's cross-town rival SK Hynix on Tuesday said it has begun mass production of next-generation HBM3E chips, with sources saying initial shipments will go to Nvidia this month.

At the center of Wall Street's AI euphoria, Nvidia's stock has more than tripled over the past 12 months, making it the U.S. stock market's third-most valuable company, behind only Microsoft and Apple.

Even after that meteoric rally, Nvidia trades at about 35 times expected earnings, cheaper than its price-to-earnings multiple of 58 a year ago, according to LSEG data.

That decline in Nvidia's earnings multiple reflects analysts sharply raising their estimates of the chipmaker's future profits; if those forecasts prove too optimistic, the stock risks falling back to earth.

Nvidia expects major customers including Amazon.com, Alphabet's Google, Meta Platforms, Microsoft, OpenAI and Tesla to use its new chip.

Its hardware products will likely remain "best-of-breed" in the AI industry, Morningstar analysts said, lifting their estimates for Nvidia data-center revenue for 2026 and 2028.

"We remain impressed with Nvidia's ability to elbow into additional hardware, software, and networking products and platforms," they said.

The software push shows how Nvidia, whose chips are mostly used to train large language models like Google's Gemini, is trying to make its hardware easier to adapt for companies rushing to integrate generative AI into their businesses.

Many analysts expect Nvidia's market share to drop several percentage points this year as competitors launch new products and the company's largest customers make their own chips, although its dominance is not expected to be seriously challenged.

(Reporting by Medha Singh and Aditya Soni in Bangalore and Stephen Nellis and Max A. Cherney in San Jose, California; Additional reporting by Noel Randewich in Oakland, California, Heekyong Yang in Seoul; Editing by Matthew Lewis and Stephen Coates)
