- By Zoe Corbin
- San Francisco
Nvidia was best known for its graphics processing chips
When ChatGPT went public last November, it made waves far beyond the tech industry.
But none of this would be possible without very powerful computer hardware.
And one hardware company in particular sits at the center of the AI boom: California-based Nvidia.
Nvidia was originally known for making the chips that handle graphics, particularly for computer games, but its hardware now powers most AI applications.
“The company is a leader in the technology that enables this new thing called artificial intelligence,” said Alan Priestley, a semiconductor industry analyst at Gartner.
“Nvidia is to AI what Intel is to PCs,” adds TechInsights analyst Dan Hutcheson.
ChatGPT was trained using 10,000 Nvidia graphics processing units (GPUs) clustered on a supercomputer belonging to Microsoft.
“This is one of many supercomputers (some publicly known, some not) built with Nvidia GPUs for a variety of use cases in science as well as AI,” said Ian Buck, general manager and vice-president of accelerated computing at Nvidia.
Widely Used A100 GPU Priced Over $10,000
The company’s AI business earned around $15 billion (£12 billion) in revenue last year, up about 40% year-on-year, according to company figures, making AI its biggest source of revenue, ahead of gaming.
Nvidia shares soared nearly 30% after reporting first-quarter results late Wednesday. The company said it was ramping up chip production to meet a “surge in demand.”
Its AI chips, which are also sold in systems designed for data centers, cost around $10,000 (£8,000) each, but the latest and most powerful versions sell for much more.
So how did Nvidia become a central player in the AI revolution?
In short: a bold bet on its own technology, plus some good timing.
Jensen Huang, Nvidia’s chief executive, co-founded the company in 1993. Back then, Nvidia was focused on improving graphics for games and other applications.
In 1999, it developed the GPU to enhance image display on computers.
GPUs excel at processing many small tasks simultaneously (for example, handling millions of pixels on a screen), a technique known as parallel processing.
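The idea can be sketched in a few lines of Python. This is only an illustration of the programming model, not Nvidia's actual software: `gamma_correct` is a hypothetical per-pixel operation, and a thread pool stands in for the GPU's thousands of cores (Python threads mimic the structure of many independent tasks, not the speed-up).

```python
from concurrent.futures import ThreadPoolExecutor

def gamma_correct(pixel: float) -> float:
    """Adjust one pixel's brightness: one tiny, independent task."""
    return pixel ** 2.2

# An illustrative "frame" of 100,000 pixel brightness values in [0, 1].
pixels = [i / 99_999 for i in range(100_000)]

# A GPU applies the same small operation to millions of pixels at once
# across thousands of cores; mapping the function over the data with a
# pool of workers sketches that same many-tasks-in-parallel structure.
with ThreadPoolExecutor(max_workers=8) as pool:
    corrected = list(pool.map(gamma_correct, pixels))

assert len(corrected) == len(pixels)
```

The key point is that each pixel's result is independent of every other pixel's, which is exactly what lets the work be spread across so many cores at once.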
In 2006, researchers at Stanford University discovered another use for GPUs: they could accelerate mathematical operations in a way that regular processing chips could not.
It was at that moment that Huang made a decision pivotal to the development of AI as we know it.
He invested Nvidia’s resources in creating a tool to make its GPUs programmable, thereby opening up their parallel processing power to uses beyond graphics.
That tool was added to Nvidia’s chips. For computer gamers it was a capability they did not need and probably did not know about, but for researchers it was a new way of doing high-performance computing on consumer hardware.
This capability was the catalyst for some of the early breakthroughs in modern AI.
In 2012, AlexNet, an AI that could classify images, was unveiled. It had been trained using just two of Nvidia’s programmable GPUs.
The training process took only a few days, rather than the months it would have required on a far larger number of regular processing chips.
The discovery that GPUs could massively accelerate neural network processing spread among computer scientists, who started buying GPUs to run this new type of workload.
“AI found us,” Buck says.
Nvidia has pushed its edge by investing in developing new breeds of GPUs that are better suited for AI, as well as developing more software to make the technology easier to use.
Ten years on, we have ChatGPT, a multi-billion-dollar AI that gives eerily human-like answers to your questions.
In 2021, Metaphysic made headlines with its Tom Cruise deepfakes
AI start-up Metaphysic uses AI technology to create photorealistic videos of celebrities and others. Its Tom Cruise deepfakes made headlines in 2021.
Hundreds of Nvidia GPUs are used both to train and to run its models, some purchased outright and others accessed through cloud computing services.
“There is no substitute for Nvidia to do what we do,” says co-founder and CEO Tom Graham. “It’s way ahead of its time.”
Nvidia’s dominance looks solid for now, but its long-term position is harder to predict. “Nvidia is a company with a target that everyone is trying to beat,” said Kevin Krewell, an industry analyst at TIRIAS Research.
There is also some competition from other major semiconductor companies. AMD and Intel are both best known for making central processing units (CPUs), but they also make dedicated GPUs for AI applications (Intel only recently joined the race).
Google has tensor processing units (TPUs) that are used for certain machine learning tasks as well as search results, and Amazon has custom-built chips for training AI models.
In addition, for the first time in decades, computer chip start-ups like Cerebras, SambaNova Systems and Habana (acquired by Intel) are also on the rise. They are keen to start with a clean slate and develop better alternatives to GPUs for AI.
UK-based Graphcore makes general-purpose AI chips called intelligent processing units (IPUs), which it says have more computing power and are cheaper than GPUs.
Founded in 2016, Graphcore has received approximately $700 million (£560 million) in funding.
Its customers include four U.S. Department of Energy national laboratories, and it is pressuring the British government to use its chips in new supercomputer projects.
“[Graphcore] built processors to run the AI that exists today and the AI that will evolve over time,” said Nigel Toon, co-founder and CEO of the company.
He admits that taking on a giant like Nvidia is difficult. Graphcore also has software to make its technology accessible, but when the world is building AI products that run on Nvidia’s GPUs, persuading people to switch is hard.
Over time, though, Toon expects cost-effective computing to become more important as AI moves from cutting-edge experiments to commercial deployment.
Back at Nvidia, Ian Buck isn’t too worried about competition.
“Everybody needs AI now,” he says. “Where we contribute is up to others.”
