Google has already acknowledged that video platforms like TikTok and Instagram are eating into its core search product, especially among younger Gen Z users. Now, with the help of Gemini AI, it aims to make video a bigger part of Google Search. At the Google I/O 2024 developer conference on Tuesday, the company announced that users will be able to combine an uploaded video with a text query to get an AI Overview of the answers they need.
The feature will initially be available as an experiment in English through Search Labs for users in the United States.
This multimodal capability builds on existing search functionality that lets users add text to a visual search. First introduced in 2021, the ability to search with a photo and text combined has helped Google in an area where it typically struggles: queries whose visual elements are difficult to describe, or that could be described in several different ways. For example, you could use Google Search to find a photo of a shirt you like, then use Google Lens to find the same pattern on socks, the company suggested at the time.
With the addition of search via video, the company is responding to how users, especially younger ones, interact with the world through their smartphones. They often shoot videos rather than photos and use video to express themselves creatively, so it makes sense that they might want to search with a video in some cases.
The feature lets users upload a video, ask a question about it, and turn that into a search query. In its demo, Google showed a video of a broken record player whose tonearm wouldn't stay on the record. The query included the video along with the question, "Why will this not stay in place?" (referring to the arm). Google's Gemini AI then analyzed the video frame by frame to understand what it was showing and returned an AI Overview with possible tips on how to fix it.
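The Search feature itself isn't exposed programmatically, but the underlying capability resembles the video understanding already available through the public Gemini API. Here's a minimal sketch using the google-generativeai Python SDK, assuming you have an API key and a local clip; the model name, file name, and prompt are illustrative, not details Google disclosed about the Search feature.

```python
# Illustration of video question-answering with the public Gemini API,
# not Google Search's own implementation. Assumes a placeholder API key
# and a local video file named "turntable.mp4".
import time

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")

# Upload the clip via the Files API, which accepts common video formats.
video = genai.upload_file(path="turntable.mp4")

# Video files are processed asynchronously; poll until the file is ready.
while video.state.name == "PROCESSING":
    time.sleep(5)
    video = genai.get_file(video.name)

# Ask a question about the video, similar to the record-player demo.
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    [video, "Why won't the tonearm stay in place on the record?"]
)
print(response.text)
```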
That AI Overview also pointed to resources for learning more, including a link to a discussion forum and a video on how to rebalance a turntable's tonearm.
Google demonstrated the feature with video captured in conjunction with a Google Search query, but it has implications in other areas as well, such as understanding videos stored on your phone, uploaded to private cloud storage like Google Photos, or shared publicly on YouTube.
The company did not say how long the new feature will be tested in Search Labs in the U.S., or when it might roll out to other markets.