Microsoft’s campaign to let enterprise IT users easily integrate ChatGPT-based generative AI into business, consumer and developer applications moved forward Tuesday as the cloud giant seeks to dominate the AI arms race.
The vendor introduced numerous updates to ChatGPT and Bing, a new plugin platform, and partnerships with major technology companies, including AI hardware/software vendor Nvidia.
The product introductions and other developments took place at Microsoft Build, the vendor’s developer conference held in Seattle and streamed to a virtual audience.
“This is about really creating an opportunity for developers to reach out to all users in all these surface areas,” said Microsoft CEO Satya Nadella, referring to the vendor’s myriad developer and business applications.
ChatGPT and Bing
One way the tech giant is helping developers reach a wider audience is by making Bing Search part of OpenAI’s ChatGPT.
The change not only brings Microsoft’s search engine to ChatGPT users, but also brings ChatGPT up to date: the model was previously trained only on data through 2021. Paid subscribers can immediately ask ChatGPT queries that draw on real-time search results, making up for that gap.
Microsoft said users of the free version will be able to access the Bing integration later through a plugin, but did not set a specific date.
Forrester analyst Rowan Curran said the move to make Bing Search part of ChatGPT is logical, since large language models like ChatGPT become far more useful when they can connect with the world in real time.
With Microsoft introducing many new plugins into its ecosystem, it also makes sense to tie them closely to Bing Search, the longtime competitor to Google Search, the flagship product of Microsoft’s biggest rival, Curran added.
Enhanced plugins
Microsoft revealed that it has adopted the same open plugin standard as its generative AI partner OpenAI.
Developers now have access to more than 50 plugins on the new platform that let users connect to consumer and business applications such as ChatGPT, Bing, Microsoft 365 Copilot and Dynamics 365 Copilot.
Developers can also create, test and deploy their own plugins.
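Under OpenAI’s open plugin standard, a plugin is described by a small manifest file that points the model at the plugin’s API. The sketch below shows the general shape of such a manifest; the names, URLs and descriptions here are hypothetical placeholders, not an actual Microsoft or OpenAI plugin.

```json
{
  "schema_version": "v1",
  "name_for_human": "Example Travel Plugin",
  "name_for_model": "example_travel",
  "description_for_human": "Search and book flights (illustrative example only).",
  "description_for_model": "Use this plugin when the user asks about flight searches or bookings.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

The model reads the `description_for_model` field and the linked OpenAPI spec to decide when and how to call the plugin’s endpoints, which is why the same manifest can, in principle, serve ChatGPT, Bing and the Copilot applications.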
“Plugins are 100% a necessary step in making these large language models useful for deployment,” Curran said. “These plugins basically allow these models to be more than just brains in a box; they can interact with the world.”
Beyond asking an LLM questions, developers and other users can use plugins to have the model perform real-world actions, such as booking travel or shopping.
Cambrian AI founder and analyst Karl Freund said the plugin strategy is also an effective way for the tech giant to make AI available across its portfolio, especially through its Copilot applications.
Microsoft defines Copilots as applications that use AI and LLMs to help users perform cognitive tasks.
By using plugins to connect to the various Copilot applications, “Microsoft is becoming the leading cloud provider for AI applications,” Freund said.
He added that Google will remain a strong competitor, but it lacks the Copilot capability that Microsoft leverages across a wide range of applications.
In addition to building bridges to its Copilot applications through plugins, Microsoft is forging important new strategic partnerships.
Strategic partnerships
One of them is a partnership with Nvidia, one of the most dynamic independent players in the AI space.
Microsoft’s Azure Machine Learning service will be integrated with Nvidia AI Enterprise software. The integration will enable Azure customers to build, deploy and manage customized applications using some 100 Nvidia AI frameworks and tools, supported by Nvidia’s AI platform.
“This is a natural fit for Microsoft Azure,” Freund said.
The partnership will give Microsoft users the benefits of Nvidia AI Enterprise, a platform for building and customizing AI products, he said.
Companies wishing to build a domain-specific version of ChatGPT can do so using NeMo, Nvidia’s cloud-native framework for building LLMs.
While Microsoft and OpenAI have a close relationship, integrating Nvidia AI Enterprise fills some hardware and software gaps that OpenAI, a software vendor and research organization, cannot.
“Companies need to find a way to identify the best opportunities for AI from their own data and build an AI strategy that addresses everything from training to model development to governance,” said Daniel Newman, an analyst at Futurum Research.
“Nvidia provides a more complete set of tools for enterprises to build around and work with ChatGPT and other open source LLMs,” he continued.
Meanwhile, J. Gold Associates analyst Jack Gold said Microsoft wants Nvidia chips for its Azure platform and is deploying them primarily to make sure OpenAI and Copilot workloads all run well.
And while Nvidia works with multiple cloud providers, including Google, it positions itself as an AI “arms provider” that builds the “arms” cloud providers deploy, so conflicts of interest do not arise, Freund said.
Microsoft also unveiled new commitments to responsible AI. The cloud provider’s new Azure AI Content Safety service aims to help developers create safer online environments using models that detect inappropriate content in images and text. Azure AI Content Safety is currently in preview.
Esther Ajao is a news writer covering artificial intelligence software and systems.
