One area has seen record spending this year even as Meta Platforms has cut costs and headcount: an infrastructure overhaul to keep the social media giant competitive in the artificial intelligence arms race.
On Thursday, Facebook's parent company unveiled a slew of new technologies, including a chip developed in-house to speed up AI training and tools that give programmers suggestions on how to build products. The company is also redesigning its data centers to support the adoption of AI technology.
In an emailed statement, CEO Mark Zuckerberg said, "This initiative is a long-term commitment to making this technology even more advanced and more effective in all of our activities. It reflects our efforts."
Custom accelerator chips help speed up the recommendation algorithms that power what people see on Facebook and Instagram. New data center designs dedicated to AI-optimized hardware are being rolled out. Meta said it has also completed the second phase of building an AI supercomputer to train large language models, the same technology that powers ChatGPT.
Meta's capital spending hit a record $31.4 billion last year, more than 4.5 times what it spent in 2017. The spending continues through what Zuckerberg has called Meta's "year of efficiency," with analysts expecting this year's outlay to match 2022 levels. Much of it will be devoted to improving and expanding the AI infrastructure.
"There is some tension" in the mandate for efficiency, but "it's not a direct competition to invest in AI while also investing in efficiency," said Kim Hazelwood, director of AI research at Meta.
Some of the AI updates clearly drive efficiency within Meta, which has laid off thousands of employees in recent months.
CodeCompose is a new generative AI-based tool for developers that can auto-complete code and suggest changes. The company says 5,200 of its programmers are using it internally so far, accepting 22% of its code-completion suggestions.
The company is increasingly turning to AI to solve its biggest business problems. For advertisers frustrated by Apple's privacy changes, which make it harder to target digital ads, Meta plans to use its AI to make more accurate guesses about users' interests. And to compete with TikTok, Facebook and Instagram have started showing content from people users don't follow, which requires algorithms to guess what they're interested in.
CFRA Research analyst Angelo Zino said in an interview that investors will look for direct evidence of these improvements to justify spending more.
"Obviously it will take some time for some of the initiatives to really roll out," Zino said of Meta's overall increase in capital spending. "There will be a lot of scrutiny to see if some of the gains on the bottom line will accelerate."
When an AI model is queried, producing an answer, a process known as inference, requires a particular kind of processing. Meta decided to develop a new chip, the Meta Training and Inference Accelerator (MTIA), to complement its many Nvidia graphics processing units and handle certain tasks in-house.
Meta hopes the MTIA chip will help it make more accurate and engaging predictions about which organic and advertising content users will watch, encouraging people to spend more time in its apps and click on more ads.
The company also launched its first self-developed application-specific integrated circuit (ASIC), designed for processing video and live streaming. Facebook and Instagram users already share more than 2 billion short videos per day; with the new processor, those videos can be streamed using less data and displayed faster on the devices where they're watched.
Alexis Bjorlin, vice president of hardware engineering, said: "We also know all about the different needs of generative AI workloads and what comes next."
The recommendation engine powering Meta's social media apps represents the current generation of AI technology, but the key to future generative AI work is the company's AI supercomputer, called Research SuperCluster, which the company will use to train large-scale artificial intelligence programs called models.
The company announced Thursday that the second phase of its construction has been completed. In this phase it will train a large language model called LLaMA, a key part of the effort to build the metaverse, the virtual reality vision for which the company renamed itself from Facebook.
Meta has long been committed to making some of its sophisticated technology available to the external community. Much of the hardware in its stack is not open, but some of the work built on it will be open source. LLaMA is shared with researchers, and AI models trained on its supercomputer can solve 10 International Mathematical Olympiad problems. CodeCompose was built on public research shared by Meta's AI team. And the new inference chip will help the company continue to support PyTorch, an open-source AI framework Meta created and later moved to the Linux Foundation for greater independence.
Meta has been working on AI tools for years, but Zuckerberg chose to paint the company's future around a more nebulous vision of virtual reality. Scott Kessler, an analyst at investment research firm Third Bridge, said that pivot has come under intense investor scrutiny, and that a major investment in AI infrastructure could help rebuild confidence in Zuckerberg's overall strategy.
“They don’t want to lose,” Kessler said of the industry-wide race to bring AI to their businesses. “I think there are a lot more people accepting that story to some extent now than there were six months ago, nine months ago.”
