Imagine having the ability to use finger motions to control objects; this is Google’s vision for the virtual world.
Ivan Poupyrev, a Technical Program Lead at Google ATAP, is on a journey to find new possibilities for the human hand and apply them to the virtual world.
His team, Project Soli, is using radar to track the twitches of human hands, turning those precise “micromotions” into interactions with wearables and other devices.
After 10 months of work, the team has already shrunk Soli’s technology down to a fingernail-sized chip:
The idea is to integrate the chip into electronic devices such as mobile phones, computers, and wearables.
The result looks something like this:
Here you can see finger motions increasing the volume of a music player without actually touching the physical control panel.
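Conceptually, that kind of touchless control boils down to translating a small measured finger motion into a change in a setting. Here is a minimal sketch in Python of what such a gesture-to-volume mapping could look like; the function name, the rotational-gesture input, and the sensitivity value are illustrative assumptions, not details of Soli's actual implementation:

```python
def update_volume(volume, angular_delta_deg, sensitivity=0.5):
    """Map a rotational finger micromotion (in degrees) to a volume change.

    A clockwise twitch (positive delta) raises the volume; a
    counter-clockwise twitch lowers it. Output is clamped to 0-100.
    """
    volume += angular_delta_deg * sensitivity
    return max(0.0, min(100.0, volume))

# Example: three small clockwise twitches detected by the radar
vol = 50.0
for delta in [10, 10, 10]:
    vol = update_volume(vol, delta)
# vol is now 65.0
```

The clamping step matters in practice: raw micromotion readings are noisy and unbounded, so the control value must be kept within the device's valid range.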
As wearable technology becomes more popular and screens become smaller, tapping physical controls becomes difficult and time-consuming. Soli aims to change that.