Google & the race for 'finger-free' smartphones

We look at Google’s Project Soli and what it may mean for the future of tactile tech

Remember every near-future science fiction film you’ve ever seen? Films like ‘Minority Report,’ ‘Her,’ and every Marvel movie in which Tony Stark invents something, all feature the same technology: a holographic screen that can be manipulated via hand waves and finger twirls in mid-air.

Google revealed its version of that future in May, and it has been making steady progress on this technology since. Here’s a look at how fiction is becoming reality, and whether you’ll be able to emulate Tony Stark any time soon.

Google’s Project Soli hopes to lead the finger-free charge

‘Project Soli’ tracks users’ finger motions with radar waves. The finger’s micro-motions are then translated into simple actions. Users might raise and lower a volume control by twisting an invisible dial, or set a watch face with a finger flick.
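To make the idea concrete, here is a minimal sketch of how recognized micro-gestures might be translated into those simple actions. The gesture names, handlers, and state format are illustrative assumptions, not Google’s actual API:

```python
# Hypothetical sketch: mapping Soli-style micro-gestures to UI actions.
# Gesture names and the state dictionary are assumptions for illustration.

def apply_gesture(state, gesture):
    """Translate one recognized micro-gesture into a simple UI change."""
    if gesture == "dial_twist_cw":        # twisting an invisible dial clockwise
        state["volume"] = min(10, state["volume"] + 1)
    elif gesture == "dial_twist_ccw":     # counter-clockwise lowers the volume
        state["volume"] = max(0, state["volume"] - 1)
    elif gesture == "finger_flick":       # a quick flick cycles the watch face
        state["watch_face"] = (state["watch_face"] + 1) % 3
    return state

state = {"volume": 5, "watch_face": 0}
for g in ["dial_twist_cw", "dial_twist_cw", "finger_flick"]:
    state = apply_gesture(state, g)
print(state)  # {'volume': 7, 'watch_face': 1}
```

The hard part, of course, is the recognition step itself: turning raw radar reflections into labels like “dial twist” is where the radar signal processing lives.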

Ivan Poupyrev, the founder of Google’s most futuristic project yet, demonstrated the prototype technology at Google’s I/O 2015 conference. There, he was able to kick a virtual soccer ball with a quick flip of a finger.

Following ten months of development at Google’s Advanced Technology and Projects lab, the project created a prototype last May that’s the size of a fingernail. After the May presentation, Project Soli’s first alpha kits were sent out to select developers in August, with a larger beta wave planned for 2016.

Almost more exciting than the technology itself though, is the news that it may be incorporated into smartphones or smartwatches. Given the small, unwieldy screens on these devices, finger-free technology has a vast appeal. Google’s work might herald a shift towards finger-free tech and away from simple finger-or-stylus touchscreen displays.

Finger-free tech could be the natural next step in interface history

“This is really just part of the evolution in computer input—how we communicate what we want to a computer,” says Canadian futurist Richard Worzel, who holds that finger-free tech is a logical step towards the future, though one that must be implemented carefully to succeed. He tracks a history of input methods, from pegboards to punch cards to Xerox PARC’s invention of the mouse, cursor, and “desktop-and-file-folder metaphor as input,” the latter of which were all popularized by Apple Computer in the 1980s.

“The touch screen liberated us from the limitations of the keyboard and mouse, although it clearly didn't entirely replace them,” Worzel says. “And now we're looking for new ways of doing things that are even closer to our natural, non-cyberspace ways of communicating.”

Gestures in the air—circling a finger to indicate someone is crazy, forming a finger-gun to shoot an idea down—are a natural form of human communication, like speaking and writing. But unlike speaking and writing, which correlate to voice recognition and text inputs, tech innovation hasn’t yet captured the power of the human gesture.

“Gesture interpretation,” Worzel says, “is aimed at making communicating with computers more natural, and closer to how we communicate with each other. And if it's done well, this kind of touch-free communication will be a welcome addition to the other means we use—including text and voice—for that purpose.”

Finger-free tech could solve the ‘tactile’ touchscreen challenge

Current touchscreen technology has already been criticized as a “Novocaine drip to the wrist” by designer and futurist Bret Victor, who has pointed out the complete lack of tactile interaction in what is essentially fingers sliding over glass.

Google’s Project Soli removes the slick, disaffected glass and replaces it with a version of the sophisticated tactile experience Victor calls for. Users might rub their thumb and forefinger to scroll, or tap their fingers together to select something. In these cases, the sensation of their own fingers provides the organic connection that no smartphone today offers.

Interestingly, Apple’s only moderately popular smartwatch might provide another stepping stone towards finger-free technology. Its wrist-based gesture-reading technology is also a main element in Android Wear, Google’s smartwatch platform. And if finger-free tech becomes more common, smartwatches might start appealing to the massive user base that’s currently too busy with its smartphones to notice a gimmicky watch.

Itchy neck and your phone shuts down? Consumers could be baffled by commands

There is one core challenge that could hinder the rise of finger-free tech, though: the lack of obvious, instinctive commands. Apple already struggles to pair clean design with useful responses to intuitive finger flicks across a mobile or tablet screen.

“There is no way to discover what operations are possible just by looking at the screen,” Don Norman and Bruce Tognazzini wrote in a detailed take-down of Apple’s design chops. “Do you swipe left or right, up or down, with one finger, two, or even as many as five?”

When the screen is removed, users can’t even swipe. This makes the correct move even less intuitive. Users relying on Project Soli must be able to quickly navigate an immensely sensitive system without shutting down their phone every time they crack a knuckle or scratch their neck.
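One plausible safeguard against that sensitivity is to act only on gestures the system is both confident about and sees sustained over several consecutive readings, so a one-off twitch is discarded. The sketch below is an illustration of that idea; the event format, confidence scores, and thresholds are assumptions, not details of Project Soli:

```python
# Illustrative sketch: ignoring incidental motion (a knuckle crack, a neck
# scratch) by requiring a confidence threshold plus a sustained streak of
# identical readings before a gesture fires. All values are assumptions.

def filter_gestures(readings, min_conf=0.8, min_repeats=2):
    """Accept a gesture only after `min_repeats` consecutive confident reads."""
    accepted = []
    last, streak = None, 0
    for gesture, conf in readings:
        if conf < min_conf:
            last, streak = None, 0       # a low-confidence read resets the streak
            continue
        streak = streak + 1 if gesture == last else 1
        last = gesture
        if streak == min_repeats:
            accepted.append(gesture)     # fire once per sustained gesture
    return accepted

readings = [("scratch", 0.3), ("flick", 0.9), ("flick", 0.95), ("twist", 0.85)]
print(filter_gestures(readings))  # ['flick']
```

Here the stray “scratch” is dropped for low confidence, and the lone “twist” never builds a streak, so only the deliberate flick gets through.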

Competition comes in the form of haptic tech, aerial holograms and tangible user interfaces

These issues are far from crippling, though. Instead, they represent the same sorts of challenges that developers have addressed and conquered in every past wave of tech innovation since the invention of the wheel. But if designers can’t refine finger-free technology like Project Soli fast enough, competing tech might beat it out.

Haptic technology aims to recreate a tactile experience with waves or lasers. Japanese researchers are developing a femtosecond laser that can create touchable aerial holograms, while others are working on tangible user interfaces, which might also contribute to the future of interactive tactile tech. Perhaps even farther down the futuristic road, Google could incorporate holograms that interact with and respond to a user’s entire body rather than just their hands.

Yet poorly developed forms of this technology might have a negative impact on innovation as a whole. As Worzel cautions: “If it's done badly, it may set the whole field back—as happened when Apple's poor handwriting recognition software sunk the Apple Newton, and delayed a wider acceptance of hand-held (and touch screen) computers back in the 1990s.”

Whenever Project Soli arrives, it’s clear that all those science fiction films have successfully predicted one element of the future. We’ll all eventually be addressing our daily tasks with a simple wave of our hand.


Related reading:
Minority Report UI creator’s first collaborative system
Look, no hands! Will Canada’s Myo armband replace the mouse?