LOS ANGELES — It all started in 1993 at a Denny's in San Jose.
Three engineers, Jensen Huang, Chris Malachowsky and Curtis Priem, met at a restaurant in what is now the heart of Silicon Valley to discuss developing a computer chip that would make video game graphics faster and more realistic. That conversation, and one that followed, led to the founding of Nvidia, a technology company that has rocketed up the stock market rankings, briefly surpassing Microsoft this week as the S&P 500's most valuable company.
The company's market capitalization now exceeds $3.2 trillion, and its dominance as a chipmaker has positioned Nvidia as the poster child for the artificial intelligence boom, which CEO Jensen Huang calls "the next industrial revolution."
During a conference call with analysts last month, Huang predicted that companies using Nvidia chips will build a new type of data center he called “AI factories.”
Huang added that training AI models is becoming a more rapid process as they become “multi-modal” (able to understand text, voice, image, video and 3D data) and can also “reason and plan.”
"People talk about AI as if Jensen suddenly figured it out 18 or even 24 months ago," said Daniel Newman, CEO of technology research firm Futurum Group, "but if you actually go back and listen to Jensen talk about accelerated computing, you'll find that he's been sharing that vision for over a decade."
The Santa Clara, California-based technology company invented the graphics processing unit (GPU) in 1999, helping to grow the PC gaming market and redefining computer graphics. Today, Nvidia's specialized chips are a key component powering various forms of artificial intelligence, including the latest generative AI chatbots such as ChatGPT and Google's Gemini.
Newman added that Nvidia's GPUs are a key factor in the company's success in artificial intelligence.
“They took an architecture that was supposedly used to power games and figured out how to network these things,” he said. “GPUs became the most attractive architecture for AI, from gaming, rendering graphics and so on, to actually using them for data. … They basically created a market that didn't exist: GPUs for AI, or GPUs for machine learning.”
AI chips are designed to perform artificial intelligence tasks faster and more efficiently. General-purpose chips such as CPUs can also be used for simpler AI tasks, but are becoming “less and less useful as AI advances,” according to a 2020 report from Georgetown University's Center for Security and Emerging Technology.
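The gap comes down to parallelism in the matrix math that neural networks are built from. The short sketch below is a minimal illustration, not Nvidia's own code: it assumes PyTorch is installed and an Nvidia (CUDA-capable) GPU is available, and it simply times the same large matrix multiplication on a CPU and then on a GPU, where thousands of parallel cores typically finish the work many times faster.

    # Minimal sketch (assumes PyTorch and a CUDA-capable Nvidia GPU).
    # Times one large matrix multiplication, the core operation in
    # neural networks, first on the CPU and then on the GPU.
    import time
    import torch

    n = 4096
    a = torch.rand(n, n)
    b = torch.rand(n, n)

    # CPU: a handful of general-purpose cores work through the multiply.
    start = time.perf_counter()
    torch.matmul(a, b)
    cpu_seconds = time.perf_counter() - start

    # GPU: the same multiply is spread across thousands of parallel cores.
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # wait for the data to reach the GPU
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()  # wait for the GPU computation to finish
    gpu_seconds = time.perf_counter() - start

    print(f"CPU: {cpu_seconds:.3f}s, GPU: {gpu_seconds:.3f}s")

On typical hardware the GPU finishes the multiplication in a fraction of the CPU's time, and that speedup, multiplied across the billions of such operations in training and running AI models, is the advantage Nvidia's chips provide.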
Tech giants are buying up Nvidia chips as they push deeper into AI, the technology behind self-driving cars and generative tools that produce stories, art and music.
“Jensen will basically make AI understandable and then Apple will make it consumable,” Newman said.
The company developed an early lead in the hardware and software needed to adapt its technology for AI applications, in part because Huang bet on the field when it was still in its infancy more than a decade ago.
“NVIDIA has been working on different parts of this problem for over 20 years. They have a deep innovation engine that dates back to the early 2000s,” said Chirag Dekate, vice president analyst at technology research and consulting firm Gartner. “What NVIDIA did 20 years ago was discover that they could take the same processors, the same GPUs, that they were using for graphics, and make them capable of solving highly parallelized tasks, and they identified and nurtured an adjacent market.”
AI was still in its infancy at the time, but Nvidia's realization that GPUs would be central to AI development "was the fundamental breakthrough that was needed," Dekate said.
"Before that, we were kind of in the dark ages of analytics," he said. "The analytics were there, but we just couldn't make the AI element come to life."
Analysts expect Nvidia's revenue to reach $119.9 billion in the fiscal year ending January 2025, roughly double its revenue in fiscal 2024 and more than four times its revenue in the previous fiscal year.
“My hypothesis is that the kind of exponential growth we're seeing at NVIDIA right now could become a pattern that repeats more frequently in the coming decades,” he said. “This is a golden age, so to speak. It's the best time to be an AI engineer.”
