Google Goggles has been basically dead since 2014. It went three years without an update, and what little utility it offered was quickly replaced by other services or folded into existing apps. Well, now Google has unveiled a worthy successor to the idea: Google Lens. Just announced at I/O, the new system provides contextual information about things visually, like flowers you take pictures of or text you point your phone at. This is huge.
I loved Google Goggles back when it first came out, but I was sad to see it waste away without updates or added features. The new tool is essentially a camera for Google Assistant that pulls up whatever information is relevant to the thing you point it at. For instance, Google showed off joining a Wi-Fi network just by pointing the camera at the label on the bottom of a router; Lens reads the network name and password and enters them automatically. That's going to save every visitor to Grandma's a ton of time.
You can also pull up local business info via the Knowledge Graph, grab contact details from business cards, translate text from other languages, and save event information; the applications are basically endless. Most of this stuff already existed, but now it's all in one place and hooked into other Google services for easy data entry.
This will all be part of the Assistant in the future, so you won't need anything extra to take advantage of it when it arrives. I can't wait to give it a try.