MongoDB on Thursday launched a series of new features designed to help customers develop generative AI models and applications, including the general availability of Atlas Stream Processing and integration with Amazon Bedrock.
In addition, the vendor announced the MongoDB AI Application Program (MAAP), which provides users with strategic advisory services and integration technology to build and deploy generative AI models and applications.
The vendor unveiled the new features at MongoDB.local NYC, its user event in New York City.
According to BARC US analyst Kevin Petrie, the new features will help MongoDB remain competitive with peers such as tech giants Google and Oracle, as well as specialists such as MariaDB and Couchbase, all of which are racing to provide customers with the latest tools for developing generative AI assets.
“This is a pretty comprehensive set of announcements,” Petrie said. “MongoDB helps companies build their GenAI applications, feed them real-time data, and optimize processes such as retrieval-augmented generation to make their GenAI language models more accurate.”
New York City-based MongoDB is a database vendor that uses its NoSQL platform to provide users with an alternative to traditional relational databases that can struggle to handle the scale of modern enterprise data workloads.
Atlas, MongoDB's developer suite, has received significant attention from the vendor over the past year as interest in developing AI models and applications has exploded.
Recent updates include the launch of Atlas Vector Search and Atlas Search Nodes in December 2023 and the introduction of the MongoDB Partner Ecosystem Catalog in November 2023, which gives users access to data and AI products shared by the vendor's many partners.
MAAP for AI success
Generative AI has been a dominant trend in data management and analytics for 18 months, ever since OpenAI released ChatGPT and demonstrated significantly improved large language model (LLM) capabilities.
Applying generative AI to data management and analytics has the potential to let more people use data to inform decisions and to make everyone who works with data more efficient.
LLMs have extensive vocabularies and can interpret intent. Integrating them with data management and analytics platforms therefore lets users interact with those tools through true natural language processing (NLP) rather than the code previously required to manage, query, and analyze data.
As a result, users without technical expertise can work with data through analytics tools, while data professionals become more efficient as time-consuming tasks are reduced.
Given that potential, many vendors have made generative AI a primary focus of product development, offering customers tools such as copilot-style assistants and building environments in which they can develop AI applications.
For example, MicroStrategy and Microsoft are among the many vendors that have added AI assistants, and Databricks and Domo are among the many that have provided users with an AI development environment.
MAAP is MongoDB's environment for developing AI models and applications.
The suite includes integrations with LLMs from generative AI providers such as Anthropic and Cohere; key capabilities such as vector search and retrieval-augmented generation (RAG); a secure development environment; and access to experts who can help organizations get started with generative AI.
Petrie noted that generative AI models are becoming a must-have for businesses. To succeed, however, companies must combine language models with their own analytical and operational capabilities to unlock real business value.
MAAP is designed to help MongoDB customers do exactly that, making it an important addition to the vendor's suite.
“MongoDB's MAAP program helps developers optimize how language models are integrated into enterprise workflows,” said Petrie. “MongoDB helps many innovative companies differentiate themselves with cloud-native, data-driven software, and this new program will help customers take advantage of the wave of GenAI application development.”
But Sanjeev Mohan, founder and principal of SanjMo, says the program has its limitations.
MAAP includes access to LLMs from certain AI vendors, but not all of them, which limits model selection.
“MongoDB provides customers with a curated environment, but at the cost of not being able to use the models and integration products of their choice,” Mohan said. “It's a trade-off. MAAP is good for large companies that want developers to experiment. But if you want freedom, MAAP limits your ecosystem.”
MongoDB partners participating in MAAP and providing consulting services include Anthropic, AWS, Google Cloud, and Microsoft.
More new features
In addition to launching an AI development environment, MongoDB has added new capabilities to Atlas.
Launched in preview in June 2023, Atlas Stream Processing is now generally available. It lets users combine data at rest and data in motion to build applications that respond to changing conditions and enable real-time decision-making.
Streaming data includes information from sources such as IoT devices, customer browsing behavior, and inventory feeds, and is a key means of helping organizations act and respond with agility.
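Atlas Stream Processing pipelines are defined with the same aggregation-style stages MongoDB uses elsewhere: a `$source` stage reads data in motion, intermediate stages transform or filter it, and `$merge` writes results to data at rest. The sketch below expresses such a pipeline as plain Python dictionaries; the connection names, topic, and field names ("kafka_iot", "sensor_readings", "temperature", and so on) are hypothetical placeholders, not anything MongoDB ships.

```python
# Sketch of an Atlas Stream Processing pipeline definition, assuming
# hypothetical connection, topic, and field names for illustration.

def build_iot_pipeline(threshold: float) -> list[dict]:
    """Read IoT readings from a streaming source, keep anomalous ones,
    and continuously merge them into an Atlas collection."""
    return [
        # $source: the streaming source the processor reads data in motion from
        {"$source": {"connectionName": "kafka_iot", "topic": "sensor_readings"}},
        # Filter in flight: only pass readings above the alert threshold
        {"$match": {"temperature": {"$gt": threshold}}},
        # $merge: land results as data at rest in an Atlas collection
        {"$merge": {"into": {"connectionName": "atlas_main",
                             "db": "factory", "coll": "alerts"}}},
    ]

pipeline = build_iot_pipeline(90.0)
```

In practice, a pipeline like this would be registered as a stream processor in Atlas; the point here is simply that streaming and stored data are handled with one aggregation vocabulary.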
In addition to Atlas Stream Processing, MongoDB made Atlas Search Nodes generally available on AWS and Google Cloud; the feature remains in preview on Microsoft Azure.
Atlas Search Nodes work with Atlas Vector Search and Atlas Search to provide the infrastructure for generative AI workloads. Search nodes operate independently of MongoDB's core operational database nodes, letting customers isolate AI workloads for better performance and lower cost.
Finally, MongoDB introduced Atlas Edge Server in public preview. The tool lets users deploy and operate applications at the edge rather than in a centralized database environment, enabling business users to apply AI-based insights within their workflows.
Each new Atlas feature is a useful addition on its own. However, according to Mohan, their real power lies in using them all at once.
“I really like the combination of Atlas Stream Processing, Search Nodes, and GenAI,” he said. “This combination is super powerful.”
In particular, stream processing nodes and search nodes are important for AI applications, he continued.
If streaming data can be ingested, vectorized, and fed into a model in near real time, it can surface relevant information to an agent during a live customer conversation. Conversely, if generative AI workloads run on the same nodes as other database workloads, they can degrade performance system-wide.
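The retrieval half of that flow runs through Atlas Vector Search's `$vectorSearch` aggregation stage. The minimal sketch below builds such a stage for a live message; the embedding function is a stub (a real deployment would call an embedding model), and the index, field, and collection names ("support_index", "embedding") are hypothetical.

```python
# Sketch of the near-real-time retrieval step: embed an incoming message
# and build the $vectorSearch stage Atlas Vector Search expects.
# embed() is a stub, and the index/field names are hypothetical.

def embed(text: str) -> list[float]:
    # Placeholder only: in practice, call an embedding model here.
    return [float(ord(c) % 7) for c in text[:8]]

def vector_search_stage(query: str, k: int = 5) -> dict:
    """Build a $vectorSearch aggregation stage for the given query."""
    return {
        "$vectorSearch": {
            "index": "support_index",      # name of the Atlas vector index
            "path": "embedding",           # document field holding stored vectors
            "queryVector": embed(query),   # vector for the live message
            "numCandidates": 10 * k,       # candidates considered before ranking
            "limit": k,                    # results returned
        }
    }

stage = vector_search_stage("order never arrived")
```

The stage would be the first entry of an aggregation pipeline run against the collection of embedded documents; on a dedicated search node, that query never competes with operational traffic.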
“I really like that real-time streaming piece,” Mohan said. “Also, I really like the whole search node idea. I don't want GenAI to suddenly slow down my basic production workloads.”
Petrie similarly emphasized the importance of search nodes, which enable the low-latency processing needed to inform real-time decisions. And, he said, the combination of the new Atlas capabilities creates the foundation for successfully running generative AI applications.
“Most data-hungry applications, especially GenAI applications, have low-latency requirements,” Petrie said. “These Atlas enhancements are essential for MongoDB customers to succeed with their GenAI applications.”
In addition to new Atlas features, MongoDB has launched Atlas Vector Search integration with Amazon Bedrock.
Bedrock is a managed service from AWS that gives customers API access to foundation models and LLMs from multiple AI vendors. According to Mohan, perhaps the primary significance of the integration is that it gives joint AWS and MongoDB customers a greater choice of models than those available through MAAP.
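Once Atlas Vector Search has retrieved context, a Bedrock model answers the question. The sketch below builds the request body such a RAG step might send, assuming the Anthropic Claude message format that Bedrock accepts; the prompt wiring and token limit are illustrative. With boto3, the resulting body would be passed to the Bedrock runtime's `invoke_model` call along with a model ID.

```python
import json

# Sketch of a Bedrock-bound RAG request, assuming the Anthropic Claude
# message format on Bedrock; prompt layout and max_tokens are illustrative.

def build_bedrock_body(question: str, context_docs: list[str]) -> str:
    """Combine retrieved context with the user question into one request body."""
    prompt = ("Answer using this context:\n"
              + "\n".join(context_docs)
              + "\n\nQuestion: " + question)
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_bedrock_body("What is our refund policy?",
                          ["Refunds are issued within 30 days."])
```

The network call itself is omitted here; the point of the integration is that the retrieval (Atlas) and generation (Bedrock) halves can be composed without custom plumbing.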
Looking ahead
According to Petrie, MongoDB's latest features are collectively significant.
They help customers develop AI applications and feed them real-time data, and they include key capabilities such as RAG that make AI models more accurate. Partnerships, moreover, are central to providing customers with an AI development ecosystem.
“GenAI is reinventing cloud-native software innovation,” said Petrie. “These announcements demonstrate that MongoDB understands the magnitude of change in our industry and intends to capitalize on this change.”
But Mohan says MongoDB can do more to provide customers with all the capabilities to develop, deploy and manage AI models and applications.
In particular, he noted that AI governance is an opportunity for the vendor to add new capabilities. One option could be a developer toolkit; another, an AI agent framework that aligns development with organizational goals.
“I would like to see MongoDB embrace AI governance,” Mohan said. “MongoDB did vector search and RAG very well. The question is how to enable in-context learning and fine-tuning [of models]. I'd like to see them launch a developer toolkit or an AI agent framework to do more end-to-end [management].”
Eric Avidon is a senior news writer for TechTarget Editorial and a journalist with more than 25 years of experience. He covers analytics and data management.
