Search is Google's biggest product, and at I/O the company is pushing the service to new levels. Google previously introduced Multisearch, which lets you, for example, take a photo of an object and build a search query around it. Now the company has announced that Multisearch's new “near me” variable and a Scene Exploration feature will roll out later this year.
Multisearch’s “near me” variable lets you search with a photo of a plate of food, find out what the dish is called, and find out where to eat it nearby. It’s like Shazam, but for Search queries.
Essentially, you can search with a photo and a question at the same time. The feature works across all kinds of images: snap a photo of a dish, add “near me,” and Search will surface restaurants that serve it, matching your photo against photos uploaded by Maps contributors. The feature will roll out later this year in English, with support for other languages over time.
Another Search feature on the way is Scene Exploration, which lets you pan your camera across a wider scene and instantly glean insights about multiple objects at once. Scan an entire shelf with your camera, and helpful information is overlaid on the objects it contains.
The feature uses computer vision to connect the multiple frames that make up the scene and identify all the objects within it, while the Knowledge Graph surfaces the most helpful results. As an example, Google cited scanning a shelf of chocolates to see which bars are nut-free. The result is an AR-style overlay across the entire scene in front of you.