Google has announced new features coming to the Lens app. Thanks to its multisearch feature, Google Lens will even be able to show you the nearest place where you can find a food you have photographed.
Today, Google Lens can already translate text in photos, scan documents, and find relevant results through visual search. Now the app, which handles 8 billion searches every month, is gaining even more useful features.
According to the statement shared by Google, Lens will soon be able to overlay translated text directly onto the photo, blending it into the image instead of covering it with a solid background. Beyond that, the app will be able to detect products in a photo you have taken and show you where to find them or similar items. So how will this feature work?
With multisearch, it will be possible to find the same product or similar ones:
As in the example image above, users will be able to take a photo of a product and see options for buying it. By adding keywords that describe the product in the photo, they will also be able to list similar products.
Lens will even list nearby restaurants where you can find a food you have scanned:
With multisearch, users will be able to learn the name of a food by taking a photo of it, and by adding the keyword 'near me' to the search, they will be able to discover the nearest restaurants serving that food. They will even be able to apply filters such as vegan or vegetarian when browsing restaurants.