Google's Soli Concepts Have Us Thirsty For Virtual Controls

Google's ATAP team arguably stole the show at I/O 2015 last week, and of all its gadgets it's Project Soli that has us most intrigued. The tiny radar sensor could bring ridiculously accurate virtual controls to wearables and more, opening mobile interfaces up to a wealth of actions and gestures that compact touchscreens simply can't accommodate, and that work around even the most minimalist of designs.


"The hand can both embody a virtual tool, and it can also be acting on that virtual tool at the same time," Carsten Schwesig, design lead on Project Soli explains.

After ten months of work, Soli is just 5x5 mm of silicon, with no moving parts to break or stick. Nonetheless, it's capable of tracking submillimeter motions of overlapping fingers in 3D space.

The possibilities are pretty much endless. In one Project Soli concept, the ATAP team shows how a sensor embedded in the casing of a tablet could allow for granular art tool adjustment simply by pinching and swiping alongside it.

Since the sensor's radar can work through other materials, control gestures could still be recognized even while your phone is in your pocket: adjusting music playback, perhaps, as in the following ATAP concept:


There's also support for stacking controls in 3D space. To set the time on a smartwatch, for instance, holding your fingers close to the device could adjust the hour, while making the same twiddling gesture further away might change the minutes, without having to tap or otherwise indicate which you mean to change.
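
To make that distance-based layering concrete, here's a minimal sketch in Python. Soli's API hasn't been published yet, so the event structure and field names below are our own assumptions rather than anything from ATAP: the same "twiddle" gesture gets routed to hours or minutes purely by how far the fingers are from the sensor.

```python
from dataclasses import dataclass

# Hypothetical event structure -- Soli's real API isn't public,
# so these field names are illustrative only.
@dataclass
class TwiddleEvent:
    distance_mm: float   # how far the fingers are from the sensor
    delta: int           # positive or negative "virtual dial" ticks

def apply_twiddle(event: TwiddleEvent, time_state: dict) -> None:
    """Route the same twiddling gesture to hours or minutes based on range."""
    if event.distance_mm < 50:          # close to the watch: adjust the hour
        time_state["hour"] = (time_state["hour"] + event.delta) % 24
    else:                               # further away: adjust the minutes
        time_state["minute"] = (time_state["minute"] + event.delta) % 60

# Two identical gestures, distinguished only by distance.
state = {"hour": 9, "minute": 30}
apply_twiddle(TwiddleEvent(distance_mm=30, delta=1), state)    # bumps the hour
apply_twiddle(TwiddleEvent(distance_mm=120, delta=-5), state)  # trims the minutes
```

The point of the sketch is just that a single gesture vocabulary can address multiple controls without any extra tap to switch modes; distance from the sensor does the selecting.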

In the radio concept above, meanwhile, three different gestures all completed in the same area can trigger different commands: a rotating movement for station browsing, a tap for play/pause, and a sliding motion to adjust volume.
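
The same idea, sketched as a simple dispatch table that maps gesture types to radio commands. Again, the gesture labels and handler names here are placeholders of our own invention, not anything from ATAP's eventual API.

```python
from enum import Enum, auto
from typing import Callable, Dict

# Hypothetical gesture labels, standing in for whatever Soli ends up reporting.
class Gesture(Enum):
    ROTATE = auto()   # dial-like rotation of two fingers
    TAP = auto()      # quick finger tap
    SLIDE = auto()    # thumb sliding along the index finger

def browse_stations(amount: float) -> None:
    print(f"tuning by {amount:+.1f} MHz")

def toggle_playback(_: float) -> None:
    print("play/pause")

def change_volume(amount: float) -> None:
    print(f"volume {amount:+.0f}%")

# One physical spot, three commands: the gesture type alone picks the action.
RADIO_COMMANDS: Dict[Gesture, Callable[[float], None]] = {
    Gesture.ROTATE: browse_stations,
    Gesture.TAP: toggle_playback,
    Gesture.SLIDE: change_volume,
}

def handle_gesture(gesture: Gesture, magnitude: float = 0.0) -> None:
    RADIO_COMMANDS[gesture](magnitude)

handle_gesture(Gesture.ROTATE, 0.2)   # nudge the station dial
handle_gesture(Gesture.TAP)           # play/pause
handle_gesture(Gesture.SLIDE, -10)    # turn it down
```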

Best of all, we might not have long to wait. The ATAP team expects to deliver a prototype board and software API to developers later this year.

SOURCE Google ATAP
