Google’s much-anticipated Pixel 7 and 7 Pro are scheduled to debut at the October 6 Pixel launch event, but that event was preceded by the annual Search On conference dedicated to Google’s core business: Search. The company discussed several interesting improvements, including changes that make Google Lens more intuitive to use and Search upgrades that help when you’re craving a specific dish.

Google knows all too well that translation is one of the primary use cases for Lens. The company now uses Generative Adversarial Networks (GANs) to realistically overlay translated text on the same background as the original foreign-language text. This makes the translation feel seamless and immersive, so a poster in a foreign language won’t lose its context in translation.

Google says Lens answers 8 billion questions every month, and the company believes users are ready for the next big step — combining image searches on Lens with text inputs using a feature called multisearch. The feature was first introduced at Google I/O and subsequently beta tested in the US. At Search On, Google announced that multisearch is now available in 70 new languages.

Multisearch is also getting better at delivering localized results with what Google is calling multisearch near me. Starting this fall in the US, the feature will let you point Lens at an object and find similar items at retailers near you. The company says this will work with food and plants too: have Lens identify the item in question, then use multisearch to find a restaurant that serves the same dish or a gardening store that sells saplings of that plant.

Speaking of food, Search can help you satisfy your craving for a specific dish more easily than before. Instead of searching for Chinese restaurants near you, you can directly search “noodles near me,” and Search will surface nearby restaurants that serve noodles. Google says you can narrow your search down further, even to specifics like a dish’s spice level, citing “soup dumplings” as an example. To help you find the best place for a dish, Google says that in the coming months, Search will use machine learning to identify each restaurant’s specialty dishes, complete with filters for vegan and vegetarian preparations. Google could add more filters in due course.

Besides these improvements to Lens and Search, Google also introduced changes to Maps at the conference, like aerial views of landmarks and a new “vibe check” feature for neighborhoods you plan to visit. The Search results display is also changing, so you can find all the pertinent information in an easier-to-consume layout.