Google’s new ‘multisearch’ features point to the future of augmented reality glasses – TechCrunch

In April, Google introduced a new “multisearch” feature that lets people search the web using both text and images at once. Today at the Google I/O developer conference, the company announced an extension to the feature, called “multisearch near me.” The addition, coming later in 2022, will let Google app users combine an image or screenshot with the text “near me” to be directed to local retailers or restaurants that carry the clothing, household goods or food they’re looking for. Google also announced an upcoming development of multisearch that appears to be built with augmented reality glasses in mind: the ability to visually search multiple objects in a scene based on what you’re currently “seeing” through a smartphone camera’s viewfinder.

With the new “near me” multisearch query, you’ll be able to find local options related to your combined visual and text search. For example, if you’re working on a DIY project and come across a part you need to replace, you can take a picture of the part with your phone’s camera to identify it, then find a local hardware store that has a replacement in stock, as sketched below.
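To make that flow concrete, here is a minimal plain-Python sketch of the identify-then-check-local-stock pattern described above. Every name in it (identify_product, LocalListing, STORE_INVENTORY) is a hypothetical stand-in for illustration; Google has not published an API for this.

```python
# Hypothetical sketch of the "multisearch near me" flow; not Google's API.
from dataclasses import dataclass

@dataclass
class LocalListing:
    store: str
    distance_km: float
    in_stock: bool

# Toy stand-in for a local-inventory index.
STORE_INVENTORY = {
    "faucet cartridge": [
        LocalListing("Hardware Depot", 1.2, True),
        LocalListing("Corner Tools", 3.5, False),
    ],
}

def identify_product(image_bytes: bytes) -> str:
    """Stand-in for the image-recognition step (e.g. Lens identifying a part)."""
    return "faucet cartridge"  # pretend recognition result

def multisearch_near_me(image_bytes: bytes) -> list[LocalListing]:
    product = identify_product(image_bytes)
    listings = STORE_INVENTORY.get(product, [])
    # Keep only nearby stores that actually have the item in stock,
    # closest first.
    return sorted(
        (l for l in listings if l.in_stock),
        key=lambda l: l.distance_km,
    )

if __name__ == "__main__":
    for listing in multisearch_near_me(b"<photo of the broken part>"):
        print(f"{listing.store} ({listing.distance_km} km)")
```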

Google says this isn’t much different from how multisearch already works – it just adds the local component.

Image Credits: Google

Originally, the idea behind multisearch was to let users ask questions about an object in front of them and narrow those results by color, brand or other visual attributes. The feature works best with shopping searches today, because it lets users narrow down product searches in ways that can be difficult with standard text-based input. For example, a user could take a picture of a pair of sneakers, then add text asking to see them in blue, so only shoes of the selected color appear. They could then click through to the website and buy the sneakers right away. The extension to include a “near me” option now simply limits the results to point users to a local retailer where the selected product is available.
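The underlying pattern, as described, is a visual match refined by a text attribute. Here is a hedged plain-Python sketch of that idea; the catalog, similarity scores and multisearch function are all invented for illustration and do not reflect Google’s actual pipeline.

```python
# Invented illustration of image-plus-text refinement; not Google's pipeline.
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    color: str
    visual_similarity: float  # pretend output of an image-matching model

CATALOG = [
    Product("Runner X sneaker", "blue", 0.91),
    Product("Runner X sneaker", "red", 0.90),
    Product("Trail Y sneaker", "blue", 0.55),
]

def multisearch(text_refinement: str, min_similarity: float = 0.8) -> list[Product]:
    # 1) keep strong visual matches for the photographed sneaker,
    # 2) then apply the user's text constraint ("blue") on top.
    return [
        p for p in CATALOG
        if p.visual_similarity >= min_similarity and p.color == text_refinement
    ]

print(multisearch("blue"))  # -> only the blue Runner X survives both filters
```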

The feature works the same way when it comes to helping users find local restaurants. In that case, a user could search based on a photo they found on a food blog or elsewhere on the web to learn what the dish is and which local restaurants might have it on their menu for dine-in, pickup or delivery. Here, Google Search combines the image with your intent of finding a nearby restaurant and scans millions of photos, reviews and community contributions on Google Maps to find the local spot.

Google said the new “near me” feature will be available globally in English and will roll out to more languages over time.

The most interesting addition to multisearch is the ability to search within a scene. Going forward, Google says, users will be able to pan their camera around to learn about multiple objects within that broader scene.

Google suggests the feature could be used to scan the shelves of a bookstore and then see several helpful pieces of information overlaid in front of you. A rough sketch of the idea follows below.
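As an illustration of that scene-exploration idea, the sketch below pairs several objects detected in one camera frame with a looked-up snippet of information, the way an overlay might. The detector and knowledge lookup are stubbed, and every name here (Detection, detect_objects, annotate_scene) is hypothetical rather than anything Google has shipped.

```python
# Hypothetical sketch of scene exploration: detect many objects in one frame,
# then attach info to each for an overlay. All names are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    box: tuple[int, int, int, int]  # x, y, width, height in the frame

KNOWLEDGE = {  # stand-in for a knowledge-graph lookup
    "The Overstory": "Novel, Pulitzer Prize for Fiction 2019",
    "Dune": "Sci-fi classic by Frank Herbert",
}

def detect_objects(frame: bytes) -> list[Detection]:
    """Stand-in for an on-device detector finding book spines on a shelf."""
    return [
        Detection("The Overstory", (10, 40, 30, 200)),
        Detection("Dune", (50, 40, 30, 200)),
    ]

def annotate_scene(frame: bytes) -> list[tuple[Detection, str]]:
    # Pair each detected object with looked-up info for overlay rendering.
    return [(d, KNOWLEDGE.get(d.label, "No info")) for d in detect_objects(frame)]

for det, info in annotate_scene(b"<camera frame>"):
    print(f"{det.label} @ {det.box}: {info}")
```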

Image Credits: Google

“To make this possible, we combine not only computer vision and natural language understanding, but also knowledge of the web and on-device technology,” said Nick Bell, a senior director of Search at Google. He noted that “the possibilities and capabilities of this will be huge and important.”

The company – which entered the augmented reality market early on with Google Glass – hasn’t confirmed that it has some kind of new AR glasses-like device in the works, but it has hinted at the possibility.

“With AI systems now, what is possible today — and will be in the next few years — kind of opens up a lot of opportunities,” Bell said. He noted that in addition to voice, desktop and mobile search, the company believes visual search will also be a bigger part of the future.

“There are 8 billion visual searches on Google using Lens every month now and that number is three times greater than it was just one year ago,” Bell continued. “What we are definitely seeing from people is that the appetite and the desire to search visually is there. What we are trying to do now is look at the use cases and identify the places that are most beneficial,” he said. “I think when we think about the future of search, visual search is definitely an essential part of that.”

The company is, of course, reported to be working on a secret project, codenamed Project Iris, to build a new AR headset with a 2024 release target. It’s easy to imagine how this scene-scanning ability could work on such a device, and how any sort of image search combined with text (or voice!) input could be used on an AR headset. Imagine looking down at a pair of sneakers you liked, for example, then asking a device to navigate you to the nearest store where you could make the purchase.

“Looking even further out, this technology could be used beyond everyday needs to help address societal challenges, such as helping conservationists identify plant species that need protection, or helping humanitarian workers in disaster situations quickly sort donations when needed,” suggested Prabhakar Raghavan, senior vice president of Google Search, speaking onstage at the Google I/O conference.

Unfortunately, Google didn’t provide a timeline for when it expects the scene-scanning capability to reach users’ hands, as the feature is still “in development.”
