
Ask questions in new ways with AI in Search

[Illustration: two phones showing Google Lens video and voice input, and AI Overviews, as someone searches for information about plants in different ways on Search]

Over the years we’ve continually reimagined Google Search, so you can ask your question in any way you want — whether you type a query, search with your camera or simply hum a tune.

The advanced capabilities of AI have been integral to expanding what Search can do, and our Gemini model customized for Search has improved our ability to help people discover more of the web — and the world around them. In fact, people who use AI Overviews use Search more and are more satisfied with their results. And Google Lens is now used for nearly 20 billion visual searches every month, helping people search what they see with their camera or on their screen.

Today, we're taking another big leap forward with some of our most significant Search updates to date, using AI to dramatically expand how Google can help you, from finding just the right information you need to exploring any curiosity that comes to mind.

New ways to search what you see (and hear)

Since pioneering visual search years ago with Lens, we’ve continued to evolve the experience using our latest advancements in AI. Earlier this year, we incorporated generative AI into Lens — so you can point your camera, ask a question and get an AI Overview with the information you need, along with links to learn more. Already, people are finding this experience incredibly helpful. This feature drove increased overall usage of Lens, as people came back to get help with new and more complex questions.

Lens queries are now one of the fastest-growing query types on Search, and younger users (ages 18-24) are engaging most with Lens. Now, we’re introducing more capabilities to Lens to make it even easier to search the world around you.

Video understanding in Lens

We previewed our video understanding capabilities at I/O, and now you can use Lens to search by taking a video and asking questions about the moving objects you see. Say you’re at the aquarium and want to learn more about some interesting fish at one of the exhibits. Open Lens in the Google app and hold down the shutter button to record while asking your question out loud, like, “why are they swimming together?” Our systems will make sense of the video and your question together to produce an AI Overview, along with helpful resources from across the web.

This capability is available globally in the Google app (Android and iOS) for Search Labs users enrolled in the “AI Overviews and more” experiment, with support for English queries.

Voice questions in Lens

The option to ask a question with your voice is also now available any time you take a photo with Lens. Just point your camera, hold the shutter button and ask whatever’s on your mind — the same way you’d point at something and ask your friend about it. Voice input for Lens is now available globally in the Google app for Android and iOS, for English queries.

Lens updates to help you shop what you see

We’re also making it easier to shop the world around you with Lens. For years, you’ve been able to use Lens to find visually similar products from retailers across the web. Starting this week, you’ll see a dramatically more helpful results page that shows key information about the product you’re looking for, including reviews, price info across retailers and where to buy.

For example, say you spot a backpack at the airport and want to buy one for yourself. Just take a photo and Lens will bring together our advanced AI models and Google’s Shopping Graph — which has information on more than 45 billion products — to identify the exact item. So you can learn more about whatever catches your eye, and start shopping right in the moment.

A new way to identify the songs you hear

With our latest Circle to Search update, you can instantly search the songs you hear without switching apps, whether they’re in a video you’re scrolling past on social media, a movie you’re streaming or a website you’re visiting. And we’re bringing Circle to Search to more users with our latest Android expansion: it’s now available on more than 150 million Android devices.

Your search results page, organized with AI

Earlier this year, we previewed how AI can help you explore and discover a wider range of results from the web, for those questions that may be open-ended or have no single right answer — like if you’re searching for a vegetarian appetizer to make for a dinner party.

This week, we’re rolling out search results pages organized with AI in the U.S., beginning with recipes and meal inspiration on mobile. You’ll now see a full-page experience, with relevant results organized just for you. You can easily explore content and perspectives from across the web, including articles, videos, forums and more, all in one place.

In our testing, people have found AI-organized search results pages more helpful. These pages also bring people more diverse content formats and sites, creating even more opportunities for content to be discovered.

More connections to the best of the web

We know that people want to go directly to the source for many of their questions. With AI in Search, we’re focused on helping people discover content and perspectives from a wide range of sources across the web.

We’ve been testing a new design for AI Overviews that adds prominent links to supporting webpages directly within the text of an AI Overview. In our tests, we’ve seen that this improved experience has driven an increase in traffic to supporting websites compared to the previous design, and people are finding it easier to visit sites that interest them. Based on this positive reception, starting today, we’re rolling out this experience globally to all countries where AI Overviews are available.

In addition, as we shared at Google Marketing Live, we’ve been carefully testing ads in AI Overviews for relevant queries. We’ve seen that people are finding ads directly within AI Overviews helpful because they can quickly connect with relevant businesses, products and services to take the next step. Following positive feedback, we’re starting to bring ads in AI Overviews to the U.S. for relevant queries, so we can continue to connect people with the products and brands that are helpful to their searches. You can read more in this post.

Whether it’s searching with text, audio, voice or images, we’ve always worked to expand the type of questions you can ask on Google. With AI, we’re continuing to reimagine how Search can help get you the information you need — fast — and we’re looking forward to bringing these experiences to more people around the world.