Google not only announced at Google I/O that AI Overviews are rolling out in the US this week, but also previewed a number of new AI-based search features coming later to Google Search and Google Search Labs. These features include the ability to adjust AI Overviews, multi-step reasoning, planning, AI-organized search results and Lens search with video.
Let's go through each one (I also covered this at Search Engine Land):
Adjust AI Overview
You will be able to adjust your AI Overview in Google Search through Google Search Labs with options to either "simplify" the language or "break it down" in more detail. You would use "simplify" to make the language easier for a novice to follow and "break it down" to dive deeper into the topic.
Here is an image of the "original" mode:
Here is an image of the "simpler" mode:
Here is an image of the "break it down" mode:
Multi-Step Reasoning
With this feature, rather than breaking your question into multiple searches, you can ask your most complex question, with all its nuances and caveats, and Google will give you everything at once - without the need to do multiple searches or click on "break it down."
The example Google provided was searching for a new yoga or pilates studio that's popular with locals, conveniently located for your commute, and also offers a discount for new members.
Here is how those results look:
This too will be available later in Google Search Labs for US English users.
Planning
Google really wants to help you plan things, not just trips but anything, and it can do that with Gemini and Search. Google Search can help you create plans for whatever you need; Google is starting with meals and vacations, with parties, date nights and workouts coming later (also in US English Search Labs). Search for something like “create a 3 day meal plan for a group that’s easy to prepare,” and you will get step-by-step plans. You can swap out parts of the plan and export them to Gmail or Google Docs.
Here is how it works (big GIF):
AI-Organized Search Results Page
Google is using AI to organize the search results page, where the results are "categorized under unique, AI-generated headlines, featuring a wide range of perspectives and content types," Google wrote. Funny thing is, we saw Google doing this on some level back in 2007 - yes, 2007.
Here is how it looks:
This will launch in Google Search (not Labs) later this year for US English results. Google will start with dining and recipes, followed by movies, music, books, hotels, shopping and more.
Lens Search with Video
Google is also testing "Lens search with video" in Google Search Labs. You can record a video, ask questions about it while you are recording, and Google will respond. In short, you can now search with video and ask a question as you record it.
Here is how that looks:
This will also launch as a Search Labs feature later this year.
Here is a video of this:
Forum discussion at X.