We've known for a few months now that Google planned to add a Lens shortcut to the search bar in the Pixel Launcher, but it was unclear when or even if the icon would make its way to all users. It seems like it might be timed with the upcoming release of Android 12, as new sightings of the logo within the widget are starting to pour in.
Despite the popularity of competitors like Instagram, Snapchat's still an insanely popular messaging app. Hell, it even managed to hit one billion installs on Android before Google Messages. That doesn't mean Snap Inc. has stayed confined to developing its smartphone app. Experiments like AR glasses and original programming have pushed Snapchat far beyond its original concept. With a newly reworked iteration of Scan, the company's looking to bring AR search results to its massive fanbase.
Google Lens is one of the most underrated tools out there, and if you're not using it already, you should give it a shot — it was one of our top 10 favorite Android features of 2020. If you're a frequent user, though, you're about to see an overhaul of the UI that puts more emphasis on the images you already have on your phone.
Google Lens is one of those features you either can't live without or have never used. Leveraging the power of search through your phone's camera, Lens lets anyone look up information about nearly any real-world object online just by snapping a photo. The feature seems to be more popular than ever, as Google has revealed some interesting stats about the platform, along with an all-new way to explore new areas when traveling this summer.
Google's long been the king of machine translation, but Apple appears to have designs on its throne. The company revealed that its next release of iOS would feature "Live Text," a feature that's more or less a straight copy of the image processing Google has been doing in Lens for years. As part of that, it's also adding rapid language translation pretty much everywhere in the operating system, including for selectable text.
We're here at WWDC — well, not "here," it's remote, and I'm sitting in my office streaming, blogging, and drinking coffee — and Apple is showing off all the new software features we can look forward to for the coming year for its products. But one particular announcement just caught our eye, a new "Live Text" feature that promises to let you pull text and contact details from photos. For Android users, this sounds pretty goddamn familiar. Ever hear of Google Lens, Apple?
During I/O, Google announced open cart reminders for shopping built right into Chrome. The feature lets you track your open carts on shopping websites inside the browser, without having to leave those sites open indefinitely. The company also introduced a few other shopping features worth highlighting, including an option to link your loyalty programs to your Google account so it can show you the best prices right away.
Google Lens is an incredibly powerful tool that probably gets used way too little — you need to know how to access it in the first place, and even though Google isn't shy about adding it almost anywhere you could think of, it might still not be as discoverable as the company would like it to be. That's probably why it's experimenting with adding it to the homescreen search bar on Pixel phones.
Google Translate is a pretty slick tool, and the way it's integrated with Google Lens for quickly applying it to text in photos is inspired. The latest tweak to the system on Android brings that functionality front and center: users on Pixel phones are seeing an extra "Translate" action item in the pop-up screenshot menu, letting them get to Google Lens even faster.