OpenAI executives can't stop talking about not having enough GPUs



OpenAI's C-suite cannot stop talking about the company's insatiable demand for computing power.

"Every time you get more GPUs, they're used right away," OpenAI CPO Kevin Weil told XPRIZE founder Peter Diamandis in a recent interview on Diamandis's "Moonshot" podcast.

Weil is just the latest OpenAI executive to weigh in on the topic. OpenAI CEO Sam Altman said last month that the company will bring more than 1 million GPUs online by the end of the year. For comparison, Elon Musk's xAI said it used Colossus, a supercluster of more than 200,000 GPUs, to train Grok 4.

"I'm extremely proud of the team, but now they'd better get to work figuring out how to 100x that, lol," Altman wrote on X in July.

Two days later, Musk, a former ally of Altman's, said he hopes xAI will have the equivalent of 50 million of Nvidia's H100 chips within the next five years.

Competition is one reason for the appetite. Jonathan Cohen, a vice president of applied research, recently said that GPUs are like "currency" for AI researchers. Priscilla Chan, wife of Mark Zuckerberg and co-founder of the couple's philanthropy, said the Chan Zuckerberg Initiative uses GPUs as a recruitment tool.

Weil said the reason for the need is simple: the more GPUs OpenAI has, the more AI it can put into everything it does. He compared it to how increasing internet bandwidth enabled the explosion of online video.

"It's like the internet. Every time we lower latency and increase bandwidth, people do more," he said. "Things that were impossible before. Now, video is everywhere, because the network has the capacity to handle it."

OpenAI CFO Sarah Friar recently pointed to Stargate, the company's effort to secure more computing power. A $500 billion project, Stargate is a joint venture between OpenAI, Oracle, and SoftBank. During a January announcement at the White House, Altman said the project would allow the US to reach AGI, artificial general intelligence.

"Right now, it's hungry for GPUs and compute," Friar told CNBC last week. "The biggest thing we're facing is being constantly compute-constrained, which is why we launched Stargate. So we're doing a bigger build."

On the product side alone, Weil said, there are plenty of places where more GPUs could be put to use.

"You can put them on the product side and use them to reduce latency, speed up token generation, launch new products, bring features that are currently only available to Pro users to other users and free users, or run more experiments," he said.

At the same time, OpenAI needs to balance those product demands against its researchers' requests.

"On the research side, there's essentially infinite demand for GPUs, which is why we're doing so much to build capacity," Weil said.




