Google's AI chief says the company is investing more than $100 billion in AI to stay ahead of competitors

The war over AI is all about money, honey. Big tech companies like Google, Microsoft, and OpenAI are all busy training language models at scale, and they're competing fiercely with one another. That race for AI supremacy is costing them dearly. In fact, according to Demis Hassabis, Google's AI chief, the company is spending more than $100 billion on AI development to stay ahead of its competitors.

Hassabis' disclosure came in response to questions about competitors' strategies in the AI race. Rumors have recently swirled that Microsoft and OpenAI are collaborating on a $100 billion supercomputer called “Stargate” to accelerate OpenAI's AI advances. Hassabis, who heads Google's AI research lab DeepMind, addressed the competition at the TED conference in Vancouver and, without giving specific figures, revealed that Google's spending is greater than its rivals'. “I won't talk about specific numbers, but I think we're investing more than that in the long run,” Hassabis said.

While this investment is large, it's not surprising given the surge in AI development across the tech industry, with AI startups raising about $50 billion last year alone. But Hassabis' comments suggest the cost of this AI race could climb significantly higher, especially for companies racing to be first to achieve artificial general intelligence (AGI) capable of human-like reasoning and problem-solving.

But how do Google and other technology companies plan to invest all this money? In the development of large language models (LLMs), a significant portion is likely to be devoted to chip development.

Today, companies like Google and OpenAI rely on third-party chip manufacturers like Nvidia, but they are increasingly shifting toward designing their own chips for greater control and optimization.

Rising costs aren't limited to hardware, either. The cost of training AI models is also climbing. According to Stanford University's annual AI Index report, training OpenAI's GPT-4 used approximately $78 million worth of computing power, a significant increase over the roughly $4.3 million spent training GPT-3 in 2020. By comparison, training Google's Gemini Ultra cost an estimated $191 million.

Notably, back in 2017, the original technology behind today's AI models could be trained for about $900. With the industry now pushing toward AGI, this exponential increase in costs is likely to continue.
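For a rough sense of that trajectory, here is a back-of-the-envelope sketch in Python using the figures quoted above. The anchor years (the ~$900 figure attributed to 2017 and the Gemini Ultra figure to 2023) and the resulting growth rate are illustrative assumptions, not numbers from the report:

    # Back-of-the-envelope estimate of how fast frontier training costs have grown,
    # using the article's figures. The 2017 and 2023 anchor years are assumptions.
    start_cost = 900            # approx. training cost of the original technology, 2017
    end_cost = 191_000_000      # approx. Gemini Ultra training cost, 2023
    years = 2023 - 2017

    annual_multiplier = (end_cost / start_cost) ** (1 / years)
    print(f"Implied cost multiplier per year: ~{annual_multiplier:.1f}x")
    # Prints roughly 7.7x, i.e. on this trajectory training costs grow by
    # close to an order of magnitude every year.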

Meanwhile, OpenAI and Microsoft reportedly plan to build the $100 billion “Stargate” supercomputer to support OpenAI's advanced AI models. The machine would contain millions of specialized server chips, could be launched as early as 2028, and is expected to cost roughly three times what Microsoft invested in 2023. It would be the centerpiece of a five-phase plan to deploy supercomputers over the next six years and could be used to train the world's most powerful AI, which might require up to 5 gigawatts of power to operate.

Author: Divya Bhati
Published: April 18, 2024


