Google showed off some pretty incredible changes for Lens at I/O, and one of the cooler upcoming features was real-time detection, which identifies and adds interactive elements to objects you might want information about. By all appearances, that feature is rolling out now, alongside an updated white pull-up interface.
Old Lens UI (left), new Lens UI (right).
The visual changes are fairly obvious. A new rounded card at the bottom of the screen replaces the old gradient-to-black overlay and its big bubble buttons for actions. The new white UI can be pulled up to reveal examples and instructions on how to use Lens.
Pull up and you'll see these examples.
If you happened to use the old "Remember this" or "Import to Keep" shortcuts in Lens, you may need to change your workflow, as that functionality doesn't appear to have been carried over to the new interface.
Lens now identifies objects you might want information about with dots in real-time (left) and provides info if you tap them (right).
The new real-time view means you no longer have to stop and tap something in the viewfinder before Lens recognizes it as an object it can provide information about. Now it proactively shows a colored dot on items as soon as they're noticed. As before, once you tap, the screen freezes on the image in the viewfinder and the information pops up from below.
Lens can recognize more than one object at a time.
The new version can also recognize more than one object at a time, so you can immediately tell when Lens has identified multiple items in the scene it can provide information for.
One other feature also appears to have been removed. Previously, the Assistant would show those frozen-screen images as a sort of search history when you backed out of the Lens dialog. That no longer appears to be the case. The history was of mostly vestigial use, but if you depend on it as part of a workflow, you'll need to find a replacement.
Google's GIF from I/O of real-time Lens in action, which is a bit different from reality.
Since Lens has now rolled out to a much wider, non-Pixel audience, these new features should be hitting most phones in the coming days. Only one of us here at AP has the new interface, so it may be a while before it reaches everyone. We've tried a few of the latest versions of the Google app, including the recently released beta updates, and none of them seems to trigger it. The change may be server-side and activated remotely, with the new interface and features already present in existing versions of the app.