Google has been working on lots of awesome things that didn’t get a mention during its I/O keynote on Tuesday. One of those things is an app called Lookout that helps blind and visually impaired people discover the world around them. Here’s how it will work when it lands on Android later this year.

There are 253 million blind or visually impaired people in the world, Google says. The company feels it has a responsibility to use its technologies to help those people lead more fulfilling, independent lives. Lookout helps them do that by providing auditory cues as they encounter objects, text, and people while navigating the world around them.

Using your smartphone’s rear-facing camera, Lookout identifies important items in your environment and reports the information it believes is relevant. This might include things like exit signs, the location of a bathroom, nearby people or objects, and even text in a book. Lookout’s spoken notifications are designed to require minimal interaction, so they don’t distract you or get in the way.

Lookout will initially offer four modes. Work and Play mode is ideal for navigating environments like offices, shopping malls, and restaurants, while Home mode can help you carry out chores, cook a meal, and more. Scan mode captures a snapshot of any text and reads it aloud, and Experimental mode lets you try upcoming features that are still in beta.

“When you select a specific mode, Lookout will deliver information that’s relevant to the selected activity,” Google explains. When using Home, for instance, Lookout might help you locate the dishwasher, the dining table, or the couch. But when using Work and Play, Lookout might tell you when you pass an elevator or stairwell.

Much of this processing happens on your device, so Lookout can be used without an internet connection. You’ll be able to try it out later this year when Lookout makes its debut on Android in the U.S.