When you make a phone call, there's a reason your ear doesn't press all the on-screen buttons. The optical sensor near the earpiece is that reason. Thing is, the sensor needs an opening to see if anything is close. It's a small blemish, sure, but it's one manufacturers would love to do away with as they come up with increasingly sexy hardware (soon TVs won't be able to show these devices in commercials without covering them with black bars).

Elliptic Labs has developed a solution. Or rather, it has found a way to pitch its existing technology that stands a chance of getting some companies excited. Rather than sticking an optical sensor at the top of the phone, Elliptic Labs replicates the functionality using ultrasound. The company's software-based approach utilizes a device's microphone and earpiece, two pieces of hardware that, at least for the foreseeable future, remain necessary ingredients in making what most of us would consider a phone. That's why Elliptic Labs has named the project BEAUTY.
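Elliptic Labs hasn't published the details of its algorithm, but the underlying principle is basically active sonar: the earpiece emits a near-ultrasonic chirp, the microphone records the reflection, and software works out how close an obstruction (like your ear) is from the echo's round-trip delay. The toy Python sketch below illustrates that idea with a simulated echo; the numbers and function names are our own assumptions, not anything from the company.

```python
# A toy illustration of ultrasonic proximity sensing, not Elliptic Labs' code:
# emit a short near-ultrasonic chirp, record the reflection, and estimate
# distance from the echo's round-trip delay.
import numpy as np

FS = 48_000             # sample rate (Hz), typical of phone audio hardware
SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def make_chirp(duration=0.002, f0=18_000, f1=22_000):
    """Generate a short chirp in the near-ultrasonic band most adults can't hear."""
    t = np.arange(int(duration * FS)) / FS
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t ** 2))

def estimate_distance(chirp, recording):
    """Cross-correlate the recording with the chirp to find the echo delay."""
    corr = np.correlate(recording, chirp, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip_s = delay_samples / FS
    return round_trip_s * SPEED_OF_SOUND / 2  # one-way distance in metres

if __name__ == "__main__":
    chirp = make_chirp()
    # Simulate an echo from an object about 3 cm away (an ear near the screen).
    true_distance = 0.03
    delay = int(2 * true_distance / SPEED_OF_SOUND * FS)
    recording = np.zeros(delay + len(chirp))
    recording[delay:] += 0.4 * chirp                       # attenuated reflection
    recording += 0.01 * np.random.randn(len(recording))    # microphone noise
    print(f"estimated distance: {estimate_distance(chirp, recording):.3f} m")
```

In a real handset this would run continuously during a call, toggling the screen off whenever the estimated distance drops below a threshold, which is the same behavior the optical proximity sensor provides today.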

Dimming your screen while talking is hardly the only use case for this code, which is also capable of reading hand gestures. This is an evolution of the capabilities the company previously showed off on Windows laptops. Three years ago the developers demoed similar functionality on Android, but this time Elliptic Labs has reduced the hardware needed to make the magic happen rather than adding to it. We just have to wait and see if a manufacturer takes the company up on its offer.


Source: Elliptic Labs flyer

Via: Engadget