Meta announces AI training and inference chip project

NEW YORK, May 18 (Reuters) – Meta Platforms (META.O) on Thursday shared details of new data center projects to better support its artificial intelligence efforts, including a custom chip “family” it is developing in-house.

In a series of blog posts, the Facebook and Instagram owner said it designed a first-generation chip in 2020 as part of its Meta Training and Inference Accelerator (MTIA) program, with the goal of improving the efficiency of the recommendation models it uses to serve ads and other content in its news feeds.

Reuters had previously reported that the company had no plans to widely deploy its first homegrown AI chip and was already working on a successor. In its blog posts, Meta described the first MTIA chip as a learning opportunity.

The first MTIA chip focused solely on an AI process called inference, in which algorithms trained on vast amounts of data decide whether to show, say, a dance video or a cat meme as the next post in a user’s feed, Meta said.

During a presentation about the new chip, Joel Coburn, a software engineer at Meta, said the company initially used graphics processing units (GPUs) for inference tasks but found they were not well suited to that work.

“Despite significant software optimizations, their efficiency is low on real models, which makes them difficult and costly to deploy in practice,” Coburn said. “This is why we need MTIA.”

A spokeswoman for Meta declined to comment on the timeline for the new chip or details on plans to develop a chip that can train models.

Meta has been working on a large-scale project to upgrade its AI infrastructure over the past year, after executives realized the company lacked the hardware and software to support demand from product teams building AI-powered features.

As a result, the company has scrapped plans to roll out its in-house inference chip at scale and has set about developing a more ambitious chip capable of performing training and inference, Reuters reported.

Meta’s blog posts acknowledged that the company’s first MTIA chip stumbled on high-complexity AI models, but noted that it handled low- and medium-complexity models more efficiently than competitors’ chips.

The MTIA chip also consumes just 25 watts, a fraction of the power consumed by chips from market-leading suppliers such as Nvidia (NVDA.O), and uses an open-source chip architecture called RISC-V, Meta said.

Meta also provided an update on its plans to redesign its data center around more modern AI-driven networking and cooling systems, saying it expects to break ground on the first such facility this year.

The new design will be 31% cheaper than the company’s current data center and can be built twice as fast, an employee said in a video explaining the changes.

Meta also said it has an AI-powered system that helps its engineers create computer code, similar to tools offered by Microsoft (MSFT.O), Amazon.com (AMZN.O) and Alphabet (GOOGL.O).

Reporting by Katie Paul in New York and Stephen Nellis in San Francisco; Editing by Kenneth Li and Chizu Nomiyama

Our standards: Thomson Reuters Trust Principles.

