IBM Advances Open Source, Multi-Model AI Strategy

IBM said on Tuesday that it has open sourced some of its Granite large language models, expanded support for third-party AI models and added to its line of AI assistants on its year-old Watsonx generative AI platform.

The move comes on the eve of the more than 100-year-old tech giant's Think 2024 conference, extending and reaffirming the vendor's commitment to the open source stance it adopted in 2000.

It also comes in the same month that OpenAI and Google splashily introduced their latest multimodal generative AI chatbots designed to talk and interact with people and create text, images, and videos.

Although IBM's latest products are not multimodal, the company does provide access to third-party AI models with multimodal capabilities.

“We see IBM serving as a neutral advisor to companies as they begin their LLM journey,” said R “Ray” Wang, founder of Constellation Research. “As trusted advisors, we see clients viewing Watsonx and the Granite LLMs as insurance in a rapidly evolving and changing LLM world.”

GenAI movement

The new AI assistants are Watsonx Code Assistant for Enterprise Java Applications, generally available in June, and Watsonx Code Assistant for Z, generally available in October, which is designed to accelerate mainframe application modernization. IBM has also expanded Code Assistant for Z with natural language explanations.

The Granite LLMs for code are publicly available on Hugging Face and GitHub under the open source Apache 2.0 license.

Granite models range in size from 3 billion to 34 billion parameters and are intended for large-scale application modernization, code generation, bug fixing, code description and documentation, and code repository maintenance. The code models were trained on 116 programming languages, and IBM claims they are the best-performing open LLMs for code-related tasks.
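Because the open sourced code models are published on Hugging Face, they can be pulled down with standard open source tooling. The snippet below is a minimal sketch assuming the Hugging Face Transformers library and an illustrative ibm-granite repository name, which is not specified in the article; check the ibm-granite organization on Hugging Face for the exact Apache 2.0 checkpoints.

```python
# Minimal sketch: load an open Granite code model from Hugging Face and
# generate a short code completion with the transformers library.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative repository name (assumption); verify against the ibm-granite
# organization on Hugging Face before use.
model_id = "ibm-granite/granite-3b-code-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" spreads the model across available devices (needs accelerate).
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short completion; tune max_new_tokens and sampling for real use.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```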

The Granite models come with user protections. They were trained on internet, academic, code, legal and financial datasets for business use.

The training data was scrubbed of objectionable content and filtered to address governance, risk assessment, privacy protection and bias mitigation. Last year, the vendor released Watsonx.governance, an AI governance toolkit for creating trusted workflows.

AI for business

In a pre-conference media briefing, IBM executives emphasized that the vendor and its large consulting arm are focused on helping companies move generative AI beyond the planning stage and into action.

“Our AI assistant is at the forefront of providing a highly effective way to bring GenAI to your business,” said Kareem Yusuf, senior vice president of product management and growth.

Meanwhile, IBM Consulting Advantage, the vendor's AI services platform released last year, now incorporates the new AI assistants and Granite models, according to Mohamad Ali, senior vice president and chief operating officer of IBM Consulting.

Ali said the consulting division is currently working on 300 generative AI projects with clients.

“Consulting Advantage is a common layer that provides a common security framework, a common PII [personally identifiable information] framework, a common bias framework, a common governance framework and a common cost framework. Under that we have various LLM apps, and on top of that we have a set of assistants,” he said. “In this way, our consultants will be able to take advantage of these types of applications and deliver multimodal technology in a consistent way.”

Partnerships with third parties

As part of its open ecosystem strategy, IBM on Tuesday also announced a series of new and expanded partnerships involving generative AI with other tech giants.

Under a new partnership with AWS, Watsonx.governance is now available to Amazon SageMaker users.

IBM announced that it will collaborate with long-time partner Adobe on hybrid cloud and AI technology, bringing capabilities from Watsonx and Red Hat OpenShift to Adobe Experience by next month.

With Meta, IBM announced that the social media giant's latest open LLM, Meta Llama 3, is now available on Watsonx.

Watsonx is now also available in Microsoft Azure. Additionally, commercial models from generative AI vendor Mistral are expected to be available on Watsonx by June. IBM also expanded its partnership with security vendor Palo Alto Networks on AI security.

Futurum Group analyst Stephen Dickens said IBM's openness to many third-party generative AI technologies, combined with its focus on AI safety and on businesses rather than consumers, means its AI products may not be as flashy as some competitors', but businesses will welcome them.

IBM's traditional customer base, which includes major financial institutions and government agencies, will also appreciate the established tech giant's slightly more modest but reliable approach to AI, he said.

“If you’re looking for an enterprise AI partner, do you really want to get into all that noise?” Dickens said, referring to OpenAI’s and Google’s high-profile generative AI systems. “That traditional core base will be looking for capabilities that are smart, solid, reliable and indemnified.”

“They want to know that their information isn’t being scraped, à la The New York Times,” he continued. “IBM is very focused on being an enterprise AI company.”

Shaun Sutner is a senior news director on the Information Management team at TechTarget Editorial, driving coverage of artificial intelligence, unified communications, analytics, and data management technologies. He is a veteran journalist with over 30 years of reporting experience.


