Google Lens has got to be one of the most underrated tools on Android devices, capable of turning your phone's camera into a powerful investigative tool. Multisearch for Google Lens is arguably the next big leap forward, combining image-based search with text prompts to help users find exactly what they're looking for. Following a geographically limited rollout last year, this tool is now deploying worldwide.

Google announced multisearch for Lens back in April of last year, finally giving users the power to steer Lens results in a more helpful direction without necessarily switching to text-based vanilla Search, or starting over with a new image search. For example, you could run an image search for a yellow dress in front of you, and then enter a text prompt to see if it comes in green, too.

Google subsequently added a feature called multisearch near me, which focuses results on local businesses. It lets you search for nearby restaurants serving Chinese cuisine, for instance, and then refine the results to show just the places serving vegetarian food.

Multisearch got started in the US in October before expanding to India in December. At its event today, Google announced that multisearch for Lens now works around the world. As long as Lens is available in your region, multisearch should work as well. However, if you were hoping to use multisearch near me, you may need to wait a few months — a broader rollout sounds like it's in the cards, but a timeline isn't set in stone yet. Multisearch capability on the web is also a few months away, at least.

With granular and convenient Search utilities like this, it hardly surprised us when Google said Lens is used over 10 billion times every month. New features are also in the works, like one that lets you look up any content on your screen, and we eagerly look forward to trying them.