To many of us, a simple command such as "Alexa, what am I holding?" might seem superfluous, but it could be indispensable for blind or visually impaired users. To that end, Amazon has introduced a new object recognition feature for its Echo Show devices, something Google should take note of now that it also has a camera-equipped smart display in the Nest Hub Max.
I can't imagine having to navigate today's world while visually impaired. Streets, people, everyday objects, even trivial tasks like preparing a sandwich or finding the right restroom sign, would all be infinitely more difficult without sight, and I have a lot of admiration for those who handle these situations every day. Smartphones can make some of this easier, especially with AI at the helm. If Google Lens can identify a dog's breed from a photo, there's nothing stopping it from using the same tech to help visually impaired people, and that's where Lookout comes in.
Smartphones are very powerful computers that we often take for granted, using them simply to check Twitter or Reddit, play some games, or watch videos of puppies. But for the visually impaired, smartphones can be very helpful tools that support them throughout the day and give them a greater sense of independence. We've covered several apps in this vein before, like Be My Eyes and Lookout, and now there's a new kid on the Android block: Envision AI.
Google has been working on lots of awesome things that didn’t get a mention during its I/O keynote on Tuesday. One of those things is an app called Lookout that helps blind and visually impaired people discover the world around them. Here’s how it will work when it lands on Android later this year.
I often forget how easy things are for me because I can see the world around me, recognize objects, and read words. But for those who are blind or visually impaired, simple tasks like checking the expiry date on a milk carton can be very complicated.
That's where the Be My Eyes service comes into play. Already available on iOS, the app has now landed on Android, connecting sighted volunteers with visually impaired users who need help with something. It uses an audio-video call to share what the visually impaired person is seeing, letting them talk to the volunteer and ask questions.
It was announced earlier in the year at MWC (Mobile World Congress) in Barcelona, and now Samsung's impressive Relúmĭno app has been officially launched. The C-Lab project is designed to help people with visual impairments enjoy the same activities as others, and it uses Samsung's Gear VR headset in conjunction with one of the company's newer Galaxy phones, such as the S7, S7 Edge, S8, or S8+.
Move over, SwiftKey. A challenger has appeared, and it's aiming to bring even better predictions than we've seen before. This one, named Fleksy, touts predictions so accurate that you can type without looking at the screen. In fact, the company says that even if you get every single letter wrong, it can still tell what you meant to type. That's pretty impressive. Of course, that means the developers need to take it one step further...
In the video above, the company shows a blind user walking down the street, typing away on his smartphone (it begins around 1:28).