Google Cloud held an executive forum in June to discuss the latest AI advancements with customers and partners. The company's CEO, Thomas Kurian, discussed how the cloud market is transforming and noted that core to this reinvention is the evolution of AI, especially generative AI.
Generative AI helps companies create content, synthesize and organize information, automate business processes, and build engaging customer experiences. To support these goals, Google offers world-class AI infrastructure, a family of foundation models, deep integration with Google Workspace and Google Cloud Platform, AI-powered collaboration tools, and a broad ecosystem of partners. Kurian said the focus will be on three key priorities.
The Tech Stack Behind Google Cloud’s Generative AI Services
At the heart of Google Cloud's generative AI services is the foundation model: a large AI model trained on vast amounts of unlabeled data using self-supervised learning. These generalized models can perform a variety of tasks, such as recognizing, predicting, and generating text, images, software code, audio, and video.
Google says customers will have access to more than 60 models from Google and its partners. To make AI accessible to a wider range of organizations, Google is making foundation models such as PaLM 2 available in smaller, cost-effective configurations for a variety of applications and use cases.
Companies can customize PaLM 2 through adaptation, transfer learning, and distillation techniques. These models come in different sizes, each with its own performance, latency, cost, and memory requirements, providing flexibility for different scenarios. The latest enhancements in PaLM 2 enable advanced reasoning tasks such as coding and math, classification and question answering, translation and multilingual tasks, and natural language generation.
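The distillation technique mentioned above can be illustrated with a minimal sketch: a small "student" model is trained to match the temperature-softened output distribution of a larger "teacher." The plain-Python code below shows the standard distillation recipe in schematic form; it is an illustration of the general idea, not Google's actual implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher temperature softens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student output distributions.
    Minimizing this trains the small student to mimic the large teacher."""
    p = softmax(teacher_logits, temperature)  # teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Toy example: a student whose logits roughly track the teacher's has lower loss.
teacher = [4.0, 1.0, 0.5]
close_student = [3.5, 1.2, 0.4]
far_student = [0.1, 3.0, 2.0]
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

The payoff is that the student, being much smaller, is cheaper and faster to serve, which is what makes the small, cost-effective configurations described above practical.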
In addition, Google offers Vertex AI, an end-to-end AI platform that streamlines the deployment of AI and generative models, enabling enterprises to customize models for their specific applications. Vertex AI offers three key generative AI capabilities: model discovery and tuning, customization tools, and data and IP control. With Vertex AI, customers can discover and tune foundation models from Google and its partners.
The platform enables seamless model scaling, automation of reinforcement learning feedback loops, generation of synthetic data for testing, and control over deployment locations and costs. Companies can use prompt engineering, fine-tuning, parameter-efficient tuning, and reinforcement learning to refine Google's models for better performance on specific tasks.
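Parameter-efficient tuning, one of the techniques listed above, freezes the pretrained model's weights and trains only a small number of added parameters. The sketch below shows the core idea behind one popular approach (low-rank adapters), using plain Python lists in place of real weight matrices; it is a conceptual illustration, not Vertex AI's implementation.

```python
def matmul(X, Y):
    """Naive matrix multiply over nested lists (stand-in for real tensor ops)."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)] for row in X]

d, r = 8, 2  # model dimension and adapter rank (r is much smaller than d)
W = [[0.0] * d for _ in range(d)]   # frozen pretrained weight matrix (d x d)
A = [[0.01] * r for _ in range(d)]  # trainable down-projection (d x r)
B = [[0.01] * d for _ in range(r)]  # trainable up-projection (r x d)

delta = matmul(A, B)  # low-rank update to the frozen weights
W_eff = [[w + dw for w, dw in zip(w_row, d_row)] for w_row, d_row in zip(W, delta)]

frozen = d * d             # 64 parameters stay untouched
trainable = d * r + r * d  # only 32 adapter parameters are updated during tuning
assert trainable < frozen
```

Because only the small adapter matrices are trained, customization requires a fraction of the compute and memory of full fine-tuning, which is why the technique suits the cost-sensitive scenarios described above.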
One of the key concerns for organizations is the protection of corporate data when using third-party large-scale foundation models. Google said its enterprise services ensure complete control and segregation of data, intellectual property protection, and compliance with regulatory requirements when customizing models. Customer data remains confidential and is not shared with other customers or used to train Google's foundation models.
How are real-world businesses leveraging AI today?
Companies want to leverage Google's expertise and lessons learned in designing and scaling AI models. But you might be wondering what real-world companies are doing with these solutions. Google showcased several examples of how customers and partners are using Vertex AI.
For example, aerospace company GA Telesis has developed a data extraction solution that automates customer calls by interpreting and correlating customer orders. Data science company Snorkel AI uses Vertex AI to generate high-quality training data to train models and customize patient treatments. Organizations like YouCite and Behavior Box are leveraging generative AI models on the Vertex AI platform to enhance employee interactions and identify insider threats.
Google featured speakers from the Mayo Foundation, Priceline, Wayfair and Wendy’s at the executive forum. These Google Cloud customers shared their insights on AI challenges, opportunities, and practical applications. Organizations see great opportunities in AI and have been working with Google on AI solutions for many years.
The combination of search and chat is a hot topic in the world of customer experience. The panel discussed how companies are exploring ways to make search and chat seamlessly work together using generative AI. Not surprisingly, travel companies are looking to create a more personalized customer experience. Priceline shared that generative AI not only helps create a more connected travel experience that incorporates all of its products, but it can also show additional relevant information to customers as they travel.
Like many healthcare organizations, the Mayo Foundation is looking for ways to minimize employee burnout. It is using generative AI and search to surface complex clinical and medical information to clinicians and administrators. Ideally, Mayo wants to automate some of the routine tasks of clinicians and administrators, giving employees more time to focus on what matters.
Mayo spoke about the importance of human involvement in the early stages of new AI development. In multiple discussions with various companies, Lopez Research has found that AI models often don't behave as expected, so it's important to have people review the intent and accuracy of a model's output and adjust the AI accordingly.
Wayfair spoke about the challenges of the home category and the need to combine AI with search functionality. For example, finding the perfect sofa for your living room is very different from looking up the score of the latest NBA game. In retail, there isn't one right answer; it's a journey. Wayfair has long employed machine learning to understand customer intent and match customers with the right products.
Wayfair has a framework for generative AI and a small cross-functional team that reviews which AI use cases deliver value, but humans must remain involved to audit output and minimize enterprise risk. Wayfair is fast-tracking use cases for customer service, sales, and code-generation assistants. Moving forward, Wayfair hopes to focus on more intentional and differentiated features for customers, such as delivering generated images.
Wendy's wanted to improve the speed, accuracy, and consistency of its pick-up window process. Wendy's has more than 30 customizable menu items, and customers may order items differently from how they appear on the menu board. Wendy's uses AI technology, including Dialogflow, to help automate the order-taking process.
In June, Wendy's began its first pilot with new generative AI products such as Vertex AI to understand bespoke requests and conduct conversations with customers, including generating answers to frequently asked questions. The goal is to take the complexity out of the ordering process by leveraging generative AI. Wendy's uses Google's foundation LLMs with data from Wendy's menus, logic for established business rules, and conversation guardrails. The AI system is integrated with the restaurant's hardware and POS system.
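To make the order-automation idea concrete, here is a deliberately simplified sketch of the underlying task — mapping a free-form utterance to menu items. The menu entries and synonym phrases below are invented for illustration; a production system like the Dialogflow- and LLM-based one described above learns these mappings from data rather than relying on hand-written keywords.

```python
# Hypothetical menu synonyms: a real system would learn these from training data.
MENU = {
    "cheeseburger": {"cheeseburger", "burger with cheese"},
    "fries": {"fries", "french fries", "chips"},
    "soda": {"soda", "soft drink", "cola"},
}

def parse_order(utterance: str) -> list[str]:
    """Return the menu items mentioned in a free-form spoken order."""
    text = utterance.lower()
    return [item for item, phrases in MENU.items()
            if any(phrase in text for phrase in phrases)]

order = parse_order("Can I get a burger with cheese and some french fries?")
assert order == ["cheeseburger", "fries"]
```

The gap between this toy keyword matcher and real customer speech ("my usual", "no pickles this time") is exactly why Wendy's is layering generative models and conversation guardrails on top of intent matching.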
Rather than solving a single problem, Wendy's said it wants to go beyond the drive-thru and build a platform approach to solving problems across its various channels, including mobile applications. Wendy's says it has seen early success using language models to take orders. The company also pointed out that order-taking is one of several jobs restaurant staff perform, so employees aren't worried about being displaced. Wendy's saw the benefits of engaging staff and franchisees early in the technical design phase by providing proofs of concept and gathering feedback.
Priceline shared that developers need to change their mindset to develop experiences rather than features. Developers should design experiences that optimize the inputs and outputs of large language models, which requires a very different skill set than proficiency in writing JavaScript. For example, prompt engineering requires developers to decide how prompts are weighted in terms of price or value to the customer.
Prompts are very important in communicating with and directing the behavior of large language model (LLM) AI. A prompt is the input or query a user submits to get an answer from the model. Priceline also shared that companies underestimate the need to invest in real-time data infrastructure to enable generative AI, and that a company should be able to measure and monitor its AI models within its own ecosystem.
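The role of a prompt described above can be sketched as a simple template that wraps user input with instructions and context before it is sent to the model. The function name and template text here are illustrative assumptions, not any vendor's API; they show the basic structure a prompt engineer works with.

```python
def build_prompt(instruction: str, context: str, question: str) -> str:
    """Assemble a structured LLM prompt from reusable parts (hypothetical template)."""
    return (
        f"Instruction: {instruction}\n"
        f"Context: {context}\n"
        f"Question: {question}\n"
        f"Answer:"
    )

prompt = build_prompt(
    instruction="Answer concisely using only the provided context.",
    context="Flight ABC123 departs at 09:15 from gate B4.",
    question="What gate does flight ABC123 leave from?",
)
assert "gate B4" in prompt and prompt.endswith("Answer:")
```

Grounding the question in supplied context, as in this template, is a common way to steer a model toward answers drawn from a company's own real-time data rather than from the model's training corpus.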
Wayfair pointed out that even digital-native companies have legacy digital solutions. Wayfair shared how Google helped it move from a monolithic code base and database to a cloud-native architecture. The company also uses Vertex AI and Gen App Builder so data scientists and machine learning engineers can quickly onboard to the platform, allowing teams to focus on building and experimenting without worrying about designing ML infrastructure. Wayfair uses BERT to understand customer intent in search.
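Intent understanding with an encoder model like BERT typically works by embedding the customer's query and each candidate intent as vectors, then picking the closest match. The minimal sketch below uses tiny hand-made vectors in place of real BERT embeddings, and the intent names are invented for illustration; it shows the comparison step, not Wayfair's actual pipeline.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hand-made 3-d vectors stand in for real BERT sentence embeddings.
intents = {
    "find_sofa": [0.9, 0.1, 0.0],
    "track_order": [0.0, 0.2, 0.9],
}
query_embedding = [0.8, 0.2, 0.1]  # pretend embedding of "show me living room couches"

best = max(intents, key=lambda name: cosine(query_embedding, intents[name]))
assert best == "find_sofa"
```

In a real system, both the query and intent vectors would come from a trained encoder, so semantically similar phrasings ("couch", "sofa", "settee") land near each other even with no shared keywords.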
Mayo said it has been using large language models for some time and sees generative AI as the latest addition to its AI toolkit. The healthcare industry has always struggled with searching for and finding information. Mayo said the average patient has 7,000 to 8,000 data points, and an average doctor sees 10 to 15 patients a day. Like Wayfair, the Mayo Foundation has been working with Google for some time.
Mayo used Google's natural language processing (NLP) capabilities before generative AI to more easily collect and synthesize unstructured data, and generative AI now accelerates these efforts. Mayo Clinic Chief Information Officer Chris Ross said in a press release about the partnership: "Google Cloud tools have the potential to unlock sources of information that are unsearchable, difficult to access and interpret using traditional methods, from complex patient histories to imaging to genomics to labs."
Clearly, the Mayo Foundation also cares about the security of patient data. Mayo discussed the benefits of secure enclaves with customer-held private keys to ensure data privacy, control over where data is stored with an auditable view, and encryption at rest. Mayo also wants to provide innovators with a secure sandbox for R&D testing.
The Mayo Foundation shared that no model is 100% perfect; what matters more is how the model's accuracy is measured, which provides confidence in the model's expertise. Mayo believes AI doesn't need to address patient diagnosis directly to be successful, because there are many quick wins in reducing paperwork and administrative burden. For example, giving clinicians a summary of a patient's records before a visit can transform the patient experience.
Wayfair pointed out that one of the biggest differences in AI today is that all companies have access to basic capabilities as a service to help jumpstart their AI efforts. In the past, companies had to hire highly skilled people, build AI infrastructure, and deploy AI software tools. The availability of AI-specific services from multiple providers is a game changer for organizations looking to adopt AI.
Adaire Fox-Martin, President of Google Cloud GTM and Head of Google Ireland, concluded the panel discussion with apt words summarizing the opportunity that lies ahead: now is the time to shape generative AI across every industry.
