Android is doing its darndest to become a better OS update by update, even beyond the actual OS upgrades. New pushes to Google Play Services and apps will improve how people reach emergency services, help get them to bed on time, and make the world clearer and closer for those with vision loss.
I can't imagine having to navigate today's world while visually impaired. From streets to people to objects, even trivial things like preparing a sandwich or telling which restroom sign is which would be infinitely more difficult without sight, and I have a lot of admiration for those who handle these situations every day. Smartphones can make some of this easier, especially with AI at the helm. If Google Lens can identify a dog's breed from a photo, there's nothing stopping the same tech from helping visually impaired people, and that's where Lookout comes in.
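To make that concrete, here's a minimal sketch of what this kind of on-device recognition looks like to an Android developer, using ML Kit's image-labeling API. This is not Lookout's actual code; the `describeFrame` helper and the confidence threshold are illustrative assumptions.

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Hypothetical helper: label a camera frame on-device and hand the top
// results to a speech callback (e.g. Android's TextToSpeech).
fun describeFrame(bitmap: Bitmap, speak: (String) -> Unit) {
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)

    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Keep only confident labels and read them out, e.g. "Dog, Grass".
            val description = labels
                .filter { it.confidence > 0.7f }
                .joinToString { it.text }
            if (description.isNotEmpty()) speak(description)
        }
        .addOnFailureListener { /* recognition failed; stay silent */ }
}
```

A real assistive app would run this continuously on the camera feed and prioritize what to announce, but the core loop is just this: recognize, filter, speak.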
Today is World Sight Day, an observance led by the International Agency for the Prevention of Blindness and a time to think about how we can prevent avoidable vision diseases and reduce the everyday barriers faced by people living with visual impairment or blindness. Navigating unfamiliar places remains one of the biggest challenges for these users, so the Google Maps team has taken the opportunity to roll out detailed voice guidance for walking directions.
Google's machine learning wizardry is capable of more than just AR emoji. As demonstrated by features like Live Relay, computational recognition of sound and images can lead to incredible quality-of-life improvements for people with hearing or vision impairments. Google's newest trick? Chrome will soon add captions to every image on the web.
Google has been working on lots of awesome things that didn’t get a mention during its I/O keynote on Tuesday. One of those things is an app called Lookout that helps blind and visually impaired people discover the world around them. Here’s how it will work when it lands on Android later this year.
I often forget how easy things are for me because I can see the world around me, recognize objects, and read words. But for those who are blind or partially sighted, simple tasks like checking the expiration date on a milk carton can be very complicated.
That's where the Be My Eyes service comes into play. Already available on iOS, the app has now landed on Android, connecting sighted volunteers with visually impaired users who need help with something. It uses a live audio-video call to share what the visually impaired person's camera sees, letting them talk to the volunteer and ask questions.
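As a rough illustration of that pairing model (Be My Eyes hasn't published its implementation, so every name here is hypothetical), the core idea is simply matching an incoming help request against the pool of available volunteers, then handing the pair to a live call:

```kotlin
// Toy model of the matching idea: pair a help request with the first
// available sighted volunteer. Not Be My Eyes' actual code.

data class Volunteer(val name: String, var busy: Boolean = false)
data class HelpRequest(val userName: String)

class Matcher(private val volunteers: List<Volunteer>) {
    fun match(request: HelpRequest): Volunteer? =
        volunteers.firstOrNull { !it.busy }?.also { volunteer ->
            volunteer.busy = true
            // In the real service, a two-way audio-video session would
            // start here between request.userName and volunteer.name.
        }
}

fun main() {
    val matcher = Matcher(listOf(Volunteer("Alice"), Volunteer("Bob")))
    val helper = matcher.match(HelpRequest("Charlie"))
    println(helper?.name)  // Alice
}
```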
Google makes cool stuff. There are self-driving cars, that funky Street View camera, and those experimental glasses anyone will be able to buy for one day only tomorrow, April 15th. Yet for every product that comes out, there's another in the pipeline that may or may not ever see the light of day. Last month we learned of a patent application for a pair of smart contact lenses that would process blinks as input for wearable devices. Now Patent Bolt has reported on a separate application for a micro camera component built into those lenses.
The first thing that comes to mind here is the ability to take photos using just your eyes.
Move over, SwiftKey. A challenger has appeared, and it's aiming to bring even better predictions than we've seen before. This one, named Fleksy, touts predictions so accurate that you can type without looking at the screen. In fact, the company says that even if you get every single letter wrong, it can still tell what you meant to type. That's pretty impressive. Of course, that means the developers need to take it one step further...
In the video above, the company shows a blind user walking down the street, typing away on his smartphone (the demo begins around 1:28).
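To see why that "every letter wrong" claim is plausible, here's a toy sketch of geometry-based prediction (my own illustration, not Fleksy's actual algorithm): score each dictionary word by how close the user's taps fall to that word's letters on the keyboard, so the intended word can still win even when every individual tap lands on a neighboring key.

```kotlin
import kotlin.math.sqrt

data class Point(val x: Double, val y: Double)

// Approximate key centers on a unit-spaced QWERTY grid (rough layout).
val keyCenters: Map<Char, Point> = buildMap {
    "qwertyuiop".forEachIndexed { i, c -> put(c, Point(i + 0.0, 0.0)) }
    "asdfghjkl".forEachIndexed { i, c -> put(c, Point(i + 0.25, 1.0)) }
    "zxcvbnm".forEachIndexed { i, c -> put(c, Point(i + 0.75, 2.0)) }
}

fun dist(a: Point, b: Point): Double =
    sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y))

// Lower score = the taps sit closer to the word's letters.
fun score(taps: List<Point>, word: String): Double {
    if (word.length != taps.size) return Double.MAX_VALUE
    return taps.zip(word.toList())
        .sumOf { (tap, ch) -> dist(tap, keyCenters.getValue(ch)) }
}

fun predict(taps: List<Point>, dictionary: List<String>): String? =
    dictionary.minByOrNull { score(taps, it) }

fun main() {
    // Every tap lands one key away from the intended h-e-l-l-o...
    val sloppyTaps = "jrkki".map { keyCenters.getValue(it) }
    // ...yet the geometry still recovers the intended word.
    println(predict(sloppyTaps, listOf("hello", "world", "hands")))  // hello
}
```

A production keyboard would add language modeling and handle insertions and deletions, but even this bare distance score shows how a tap sequence can identify a word without a single correct keypress.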