Meta and Google announce new in-house AI chips, creating “trillion dollar question” for Nvidia


Hardware is emerging as a key AI development area. For big tech companies with the money and talent to do so, developing chips in-house reduces dependence on outside designers like Nvidia and Intel, while allowing firms to tailor their hardware specifically to their own AI models, increasing efficiency and cutting energy costs.

The in-house AI chips that Google and Meta just announced are the first real challenge to Nvidia's dominant position in the AI hardware market. Nvidia controls more than 90% of the AI chip market, and demand for its industry-leading semiconductors is only growing. But if Nvidia's biggest customers start making their own chips instead, its soaring share price, up 87% since the start of the year, could suffer.

“From Meta’s point of view… it gives them a bargaining tool with Nvidia,” Edward Wilford, an analyst at tech consultancy Omdia, told Fortune. “It shows Nvidia that they’re not exclusive, [and] that they have other options” for the AI hardware they’re developing.

Why does AI need new chips?

AI models require massive computing power because of the vast amounts of data used to train the large language models behind them. Traditional computer chips cannot process the trillions of data points on which AI models are built, which has given rise to a market for AI-specific computer chips, often called “cutting-edge” chips because they are the most powerful devices on the market.

Semiconductor giant Nvidia dominates this nascent market: the waiting list for Nvidia's $30,000 flagship AI chip is months long, and demand has driven its share price up nearly 90% in the past six months.

Rival chipmaker Intel is fighting to stay competitive. It has just released its Gaudi 3 AI chip to compete directly with Nvidia. AI developers—from Google and Microsoft to smaller startups—are all competing for scarce AI chips, the supply of which is constrained by production capacity.

Why are tech companies starting to make their own chips?

Both Nvidia and Intel can produce only a limited number of chips, because they and the rest of the industry rely on Taiwanese manufacturer TSMC to actually fabricate their chip designs. With only one manufacturer firmly in the game, the manufacturing lead time for these cutting-edge chips stretches to several months. This is a key factor pushing major players in the AI space, such as Google and Meta, to design their own chips, Alvin Nguyen, a senior analyst at consulting firm Forrester, told Fortune. Chips designed by the likes of Google, Meta, and Amazon won't be as powerful as Nvidia's top-of-the-line offerings—but that could give the companies a speed advantage, since they can be produced on less specialized assembly lines with shorter wait times, he said.

“If you have something that’s 10% less powerful but you can get it now, I’m buying it every day,” Nguyen said.

Even if the in-house AI chips that Meta and Google are developing are less powerful than Nvidia's latest offerings, they may be better suited to each company's specific AI platforms. Chips developed in-house for a company's own AI platform can be more efficient and save costs by eliminating unnecessary functions, Nguyen said.

“It’s like buying a car. Well, you need an automatic transmission. But do you need leather seats, or heated massage seats?” Nguyen said.

“The benefit for us is that we can build a chip that can handle our specific workloads more efficiently,” Melanie Roe, a Meta spokeswoman, wrote in an email to Fortune.

Nvidia’s top-of-the-line chips sell for around $25,000. They are extremely powerful tools, designed to perform well across a wide range of applications, from training AI chatbots to generating images to developing recommendation algorithms like those behind TikTok and Instagram. That means a slightly less powerful but more tailored chip might be a better fit for a company such as Meta, which has invested in AI primarily for its recommendation algorithms rather than for customer-facing chatbots.

Nvidia GPUs excel in AI data centers, but they are general purpose, Brian Colello, equity research lead at investment research firm Morningstar, told Fortune. “There are probably some workloads and some models where a custom chip might be even better.”

The trillion dollar question

Nguyen said more specialized in-house chips could bring added benefits because they integrate more easily into existing data centers. Nvidia's chips consume a lot of power and emit a lot of heat and noise—so much so that tech companies may be forced to redesign or relocate their data centers to add soundproofing and liquid cooling. Less powerful in-house chips, which draw less power and emit less heat, could sidestep that problem.

The AI chips Meta and Google are developing are long-term bets. Nguyen estimated that it took about a year and a half to develop these chips, and it could be months before they are deployed at scale. For the foreseeable future, the AI world will continue to rely heavily on Nvidia (and, to a lesser extent, Intel) for its computing hardware needs. Indeed, Mark Zuckerberg recently announced that Meta is on track to own 350,000 Nvidia chips by the end of this year (the company will have spent about $18 billion on chips by then). But a shift away from outsourced computing power and toward in-house chip design could loosen Nvidia's chokehold on the market.

“The trillion-dollar question for Nvidia’s valuation is the risk of these in-house chips,” Colello said. “If these in-house chips significantly reduce reliance on Nvidia, there’s probably downside to Nvidia’s stock from here. The development of these chips is not surprising, but their execution over the next few years is the key valuation question in our minds.”

