Last Thursday, Google hinted at big things to come from its crack team of so-called “pirates” at the Advanced Technology and Projects (ATAP) division. On Friday, it delivered with two ambitious technologies that move touch controls off your screen and onto things you already have with you every day: your clothes and your hands.
Ivan Poupyrev, the Technical Program Lead for the project, took the stage at Google I/O on Friday to show off what his team was able to accomplish in less than a year since its creation.
Project Soli: Hands on with your hands
Project Soli uses radar to track your hand gestures in the air with extreme precision – it can detect movement of less than a millimeter, and do it through obstructions, like one finger in front of another. That opens a library of possible gestures that extend far beyond the swipes and pinches you may already know from the likes of the Leap Motion controller, which uses IR cameras that cannot see “through” your hands.
On stage, Poupyrev demonstrated the possibilities by rubbing his thumb against his index finger like a mime turning a dial, which turned a corresponding virtual dial on screen. He could also swipe along the finger and scroll exactly as you would on a touchscreen … without the screen. Since Soli can detect both the location of your hands and the gestures they make, different locations can also correspond to different effects. For instance, Poupyrev was able to “set the time” on his watch by placing his hand close to the sensor to adjust the hours with a virtual dial, and farther away to set the minutes.
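To make that interaction model concrete, here’s a rough sketch of how a gesture event might drive a watch face. None of these names come from an actual Soli API – the GestureEvent fields and the distance threshold are invented for illustration – but it captures the idea of one gesture steering different controls depending on how far your hand sits from the sensor.

```python
from dataclasses import dataclass

@dataclass
class GestureEvent:
    kind: str           # e.g. "dial_turn": thumb rubbed against index finger
    delta: float        # signed magnitude of the motion
    distance_mm: float  # hand's distance from the radar sensor

class WatchFace:
    def __init__(self):
        self.hours = 0
        self.minutes = 0

    def handle(self, event: GestureEvent) -> None:
        if event.kind != "dial_turn":
            return
        # Distance picks the control: close to the sensor adjusts hours,
        # farther away adjusts minutes. The 100 mm threshold is made up.
        if event.distance_mm < 100:
            self.hours = int(self.hours + event.delta) % 12
        else:
            self.minutes = int(self.minutes + event.delta) % 60

face = WatchFace()
face.handle(GestureEvent("dial_turn", delta=3, distance_mm=60))    # hours
face.handle(GestureEvent("dial_turn", delta=15, distance_mm=200))  # minutes
print(face.hours, face.minutes)  # -> 3 15
```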
The possibilities aren’t difficult to imagine on wearables, which exist on the fringe of screen sizes humans find usable. Apple dealt with this size issue on the Apple Watch by adding a physical dial, the Digital Crown, which handles motions like scrolling and zooming in Apple Maps. Google may address the same issue on future watches with a dial that’s entirely virtual.
Google’s hands-on demo stations made it difficult to ascertain how precise Soli can really get – probably because the machine learning necessary to translate raw data from its new toy into usable gestures isn’t really done yet. Demos merely turned waves and swipes of your hand into different abstract shapes on screen, or showed the raw data output from the radar on different graphs.
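Google hasn’t said how that machine learning works, but the general shape of such a pipeline is well understood: collapse each burst of raw radar frames into a small feature vector, then match it against examples of known gestures. The toy classifier below – with made-up features and simulated data – is only a sketch of that shape, not Soli’s actual method.

```python
import numpy as np

def featurize(frames: np.ndarray) -> np.ndarray:
    """Collapse a (time, range_bins) block of radar returns into two toy
    features: overall reflected energy and how much it varies over time."""
    per_frame = frames.mean(axis=1)  # energy of each radar frame
    return np.array([per_frame.mean(), per_frame.std()])

class NearestCentroidGestures:
    """Toy classifier: average the features of the training examples for
    each gesture, then label new input by the closest centroid."""
    def __init__(self):
        self.centroids = {}

    def fit(self, examples):
        for label, blocks in examples.items():
            feats = np.stack([featurize(b) for b in blocks])
            self.centroids[label] = feats.mean(axis=0)

    def predict(self, frames):
        f = featurize(frames)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(f - self.centroids[label]))

# Simulated data: a "dial turn" reflects more energy than a "swipe" here.
rng = np.random.default_rng(0)
clf = NearestCentroidGestures()
clf.fit({
    "dial_turn": [rng.normal(1.0, 0.1, (32, 16)) for _ in range(5)],
    "swipe":     [rng.normal(0.2, 0.1, (32, 16)) for _ in range(5)],
})
print(clf.predict(rng.normal(1.0, 0.1, (32, 16))))  # -> dial_turn
```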
Perhaps not surprisingly, the first board designs are circular and would fit nicely in, say, a watch. Google plans to release the hardware to developers later this year.
Project Jacquard: Real wearables
Like motion sensing, wearable textiles are nothing new, but Google has moved them past the realm of prototypes and toward commercial viability with Project Jacquard, a partnership with Levi’s to weave touch controls into real garments.
The new technology allows touch panels to be woven into conventional fabrics, using conventional textile manufacturing processes. As Poupyrev joked, “We cannot expect the international garment industry to change just for us, even though we’re Google.”
Getting there required creating a custom yarn that can withstand the pulling and heating of the weaving process, then developing a way to weave it into small patches of fabric, rather than having conductive thread blanket the entire roll. Google’s yarn has a conductive metal core encased in a sheath of more conventional fibers, which can be dyed any color.
The touch panels, which are formed from a grid of conductive threads, almost resemble a patch of rip-stop nylon – the type you might find on backpacks or military uniforms. The fabric around them remains soft and normal-feeling, even though the touch panels are literally woven in on a loom, not stitched on later. To demonstrate how subtle the panels can be, Google took the fabric to a tailor and had a functional suit jacket custom-made for Poupyrev, with touch panels in the sleeves.
Like the touchpad on a laptop, the panels track the location of your fingers and can even interpret multiple fingers at once, but they’re not designed to swipe pointers around on a screen. Rather, they read broad strokes, like swipes and taps. One demonstration station, for instance, let us turn on LED light bulbs with a tap, raise or lower their brightness with vertical swipes, and change their color with horizontal swipes. The earliest demonstration versions of Jacquard lacked the sensitivity we associate with modern touch devices – sometimes it took a few swipes to get a response, and the effect wasn’t always instant.
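The control logic from that demo station is simple enough to sketch. Everything below is hypothetical – Google hasn’t published a Jacquard API, so the gesture names and the Light class are stand-ins – but it mirrors the mapping we saw: tap to toggle, vertical swipes for brightness, horizontal swipes for color.

```python
class Light:
    COLORS = ["warm", "white", "blue", "red"]

    def __init__(self):
        self.on = False
        self.brightness = 50   # percent
        self.color_index = 0

    def handle_gesture(self, gesture: str) -> None:
        if gesture == "tap":
            self.on = not self.on
        elif gesture == "swipe_up":
            self.brightness = min(100, self.brightness + 10)
        elif gesture == "swipe_down":
            self.brightness = max(0, self.brightness - 10)
        elif gesture in ("swipe_left", "swipe_right"):
            step = 1 if gesture == "swipe_right" else -1
            self.color_index = (self.color_index + step) % len(self.COLORS)

light = Light()
for g in ["tap", "swipe_up", "swipe_up", "swipe_right"]:
    light.handle_gesture(g)
print(light.on, light.brightness, light.COLORS[light.color_index])
# -> True 70 white
```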
That Levi’s partnership is meant to deliver on the promise of integration into real clothes from mainstream manufacturers. Neither company has announced when touch-sensitive denim will show up at retailers near you, but considering ATAP’s promise to move fast – and the fact that it developed the technology in only 10 months – we wouldn’t be surprised to see it soon.