

Amazon Web Services (AWS) has extended the Amazon Bedrock service to simplify the development and deployment of generative artificial intelligence (AI) applications.
Bedrock is a platform for building AI applications using large language models (LLMs).
The updated service adds several of the platform's most requested features to address customer needs for customization, model selection, and content management. Bedrock gives users access to a variety of LLMs from providers such as AI21 Labs, Amazon, Anthropic, Cohere, Meta, Mistral AI, and Stability AI. These models are offered as managed services, so customers don't have to deal with infrastructure complexity.
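To illustrate what "managed" means in practice, the following is a minimal sketch of calling one of the hosted models through the AWS SDK for Python (boto3) Bedrock runtime; the model ID, region, and request fields shown are illustrative assumptions and may differ by model and account.

```python
import json

import boto3

# Bedrock exposes hosted models behind a single runtime API; there are no
# servers to provision. Assumptions: the Titan Text Express model is enabled
# for this account and the us-east-1 region is used.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

request_body = json.dumps({
    "inputText": "List three benefits of using managed foundation models.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
})

response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed model ID
    body=request_body,
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

Switching providers is largely a matter of changing the model ID and request body; the infrastructure underneath stays the same.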
With the new Bedrock, organizations can import their own customized AI models and gain access to Bedrock's management, security, and deployment tools. A new model evaluation capability helps assess and compare the available models, speeding model selection based on accuracy, performance, and other metrics that matter for the target application. Bedrock also now offers improved content filtering and moderation tools to help users block harmful content, align AI output with corporate standards, and use AI safely and responsibly.
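In Bedrock, these content filtering and moderation controls are surfaced as guardrails. Below is a minimal sketch of creating one with boto3; the filter types, strength values, and messages are illustrative assumptions and would need to be adapted to an organization's own standards.

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")  # assumed region

# Sketch: create a guardrail that filters hateful and violent content on both
# the prompt and the model response. Field names follow the CreateGuardrail
# API; treat the specific values as illustrative assumptions.
guardrail = bedrock.create_guardrail(
    name="corporate-standards-demo",  # hypothetical name
    contentPolicyConfig={
        "filtersConfig": [
            {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
            {"type": "VIOLENCE", "inputStrength": "MEDIUM", "outputStrength": "MEDIUM"},
        ]
    },
    blockedInputMessaging="This request cannot be processed.",
    blockedOutputsMessaging="The response was blocked by content policy.",
)
print(guardrail["guardrailId"], guardrail["version"])
```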
Bedrock has also added several new models, including Amazon Titan Text Embeddings V2, Amazon Titan Image Generator, Meta Llama 3, and Cohere's Command R and Command R+.
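As a small example of one of the new models, the sketch below requests an embedding from Amazon Titan Text Embeddings V2 through the same runtime API; the model ID and the optional dimensions/normalize parameters are assumptions about the current model interface.

```python
import json

import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

# Titan Text Embeddings V2 returns a dense vector for the input text.
# "dimensions" and "normalize" are optional knobs on the V2 model
# (illustrative values here).
response = runtime.invoke_model(
    modelId="amazon.titan-embed-text-v2:0",  # assumed model ID
    body=json.dumps({
        "inputText": "Amazon Bedrock hosts models from multiple providers.",
        "dimensions": 512,
        "normalize": True,
    }),
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding))  # 512 if the requested dimensionality is honored
```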
The focus of these new features is to simplify and accelerate the development of generative AI applications for a variety of industries and use cases. This includes the ability to better customize models, choose the best model for a specific task, and enhance responsible AI practices.
“With today’s announcement, we are further strengthening our commitment to providing our customers with the most comprehensive feature set and industry-leading model choice, and by democratizing generative AI innovation at scale, we are empowering our customers to continue to innovate rapidly,” said Dr. Swami Sivasubramanian, vice president of AI and Data at AWS.
