Hardware has emerged as a major growth area for AI. For Big Tech companies with the money and talent to do so, in-house chip development can reduce dependence on external designers such as Nvidia and Intel, while also allowing companies to tailor hardware to their AI models, improving performance and cutting energy costs.
The homegrown AI chips announced by Google and Meta pose one of the first real challenges to Nvidia's dominant position in the AI hardware market. Nvidia controls more than 90% of that market, and demand for its industry-leading semiconductors is only increasing. But if Nvidia's biggest customers start making their own chips instead, the company's stock price, which has risen 87% since the beginning of the year, could take a hit.
"From Meta's perspective, it gives them a bargaining tool with Nvidia," Edward Wilford, an analyst at technology consulting firm Omdia, told Fortune. "It lets Nvidia know that it isn't exclusive, [and] that they have other options. And it's hardware optimized for the AI they're developing."
Why does AI need new chips?
AI models require enormous amounts of computing power, because vast quantities of data are needed to train the large language models behind them. Traditional computer chips simply cannot process the trillions of data points on which AI models are built, which has created a market for AI-specific computer chips. These are often referred to as "cutting-edge" chips because they are among the most powerful devices on the market.
Semiconductor giant Nvidia dominates this nascent market. The waiting list for Nvidia's $30,000 flagship AI chip is months long, and demand has driven the company's stock up nearly 90% in the past six months.
Rival chipmaker Intel, meanwhile, is fighting to stay competitive: it just released its Gaudi 3 AI chip to compete directly with Nvidia. AI developers, from Google and Microsoft down to small startups, are all competing for scarce AI chips, constrained by limited manufacturing capacity.
Why are technology companies starting to manufacture their own chips?
Both Nvidia and Intel rely on Taiwanese manufacturer TSMC for the actual assembly of their chip designs, which limits how many chips they can produce. Because TSMC is effectively the only manufacturer up to the task, lead times for these cutting-edge chips stretch to months. That bottleneck is a key factor pushing major companies in the AI space, such as Google and Meta, to design their own chips, Alvin Nguyen, a senior analyst at consulting firm Forrester, told Fortune. Chips designed by Google, Meta, Amazon, and others aren't as powerful as Nvidia's top-of-the-line offerings, but they could benefit those companies in terms of speed: they can be produced on less specialized assembly lines with shorter wait times, he said.
"If there's something available right now that's 10% less powerful, I'll buy it every day," Nguyen said.
Even if the in-house AI chips being developed by Meta and Google aren't as powerful as Nvidia's cutting-edge offerings, they could be better tailored to each company's particular AI platform. Chips designed for a company's own platform could run more efficiently and reduce costs by eliminating unnecessary functionality, Nguyen said.
“It's like buying a car. Yeah, you need an automatic transmission. But do you want leather seats or heated massage seats?” Nguyen said.
"The benefit for us is that we can build chips that handle our specific workloads more efficiently," Meta spokeswoman Melanie Lo wrote in an email to Fortune.
Nvidia's top-of-the-line chips sell for about $25,000 each. They are extremely powerful tools, designed to suit a wide range of uses, from training AI chatbots to generating images to developing recommendation algorithms for sites like TikTok and Instagram. That means a slightly less capable but more customized chip could be a better fit for a company like Meta, which is investing in AI primarily for recommendation algorithms rather than consumer-facing chatbots.
"Nvidia GPUs are great in AI data centers, but they're general purpose," Brian Colello, head of equity research at Morningstar, told Fortune. "For certain workloads and certain models, custom chips may be even better."
The trillion-dollar problem
More specialized in-house chips could also be easier to integrate into existing data centers, Nguyen said. Nvidia chips consume a great deal of power and give off substantial heat and noise, which can force tech companies to redesign or relocate their data centers to add soundproofing and liquid cooling. Less powerful in-house chips that draw less power and emit less heat could sidestep this problem.
The AI chips being developed by Meta and Google are a long-term bet. Nguyen estimates it will take about a year and a half to develop them, and likely several more months before they are deployed at scale. For the foreseeable future, the entire AI world will continue to rely heavily on Nvidia (and, to a lesser extent, Intel) for its computing hardware needs. Indeed, Mark Zuckerberg recently announced that Meta plans to own 350,000 Nvidia chips by the end of this year (by which point the company will have spent about $18 billion on chips). But a move away from outsourcing computing power and toward in-house chip design could loosen Nvidia's grip on the market.
"The multitrillion-dollar question for Nvidia's valuation is the threat of these homegrown chips," Colello said. "If these in-house chips significantly reduce reliance on Nvidia, there's probably downside to Nvidia's stock from here. This development is not surprising, but whether it happens in the coming years is a key valuation question in our minds."