Elon Musk begins training “the world's most powerful AI”

Elon Musk's AI startup xAI has begun training its large language model (LLM) Grok on what Musk calls the world's most powerful AI training cluster. The system, located in Memphis, Tennessee, is equipped with 100,000 Nvidia H100 AI chips. Musk made the announcement on X (formerly Twitter), another of his companies.

“Thanks to the great work of the @xAI team, @X team, @Nvidia and supporting companies, training for the Memphis Supercluster began at approximately 4:20am local time,” he said in the post.

“With 100,000 liquid-cooled H100s on a single RDMA fabric, this is the most powerful AI training cluster in the world!” Musk added.


In the thread, Musk noted that the cluster would give the company's AI models a “huge advantage.”

“This is a huge advantage in training the world's most powerful AI by any metric by December this year,” he said.


On the same day, he said, “Grok is undergoing training in Memphis.”

Why this matters: The development comes roughly two weeks after reports that xAI and Oracle had ended talks on a potential $10 billion server deal. The report added that Musk was instead building his own data center and buying the AI chips himself, something he later confirmed in a post on X.

“xAI has built the 100,000 H100 system in-house. The time to completion is minimal. We are aiming to commence training later this month. This will be by far the strongest training cluster in the world,” he added.

“The reason we decided to build the 100,000 H100 system and our next major system in-house is that our fundamental competitive advantage depends on being faster than any other AI company. It's the only way we'll keep up,” Musk emphasized.

Musk said xAI will release Grok 2 in August, with Grok 3 expected to be available in December.

“Grok 2 is currently undergoing tweaks and bug fixes and will likely be released next month,” Musk added.
