UI startup aims to beat Apple to the car cockpit

“We have had a number of complaints that customers are frustrated with the interface [in the wearables market]. In fact we have seen Google go back and make an upgrade to the Android Wear interface. It’s a hot area of conversation right now. And what we said was: we can do that. We can put this interface onto a watch and make it much faster and easier for consumers to get to their apps,” says Jonathan Josephson, founder and CTO of Quantum Interface.

Josephson and Executive VP Kevin Fiur are talking to me via Google Hangouts from Austin, Texas, in the US, about their predictive interface, which uses motion control to improve the user experience on any device or app.

The interface works like this: as soon as a user begins moving towards a selection in a menu, the system predicts which option the user is heading for. Once that intent is inferred, the option moves towards the user, making the whole process much faster than the usual “point and click” technology we use at the moment. You can see how it works here.
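Quantum Interface has not published its algorithm, but the general idea of direction-based target prediction can be sketched in a few lines: compare the pointer's current motion vector against the vector towards each menu item, and nominate the best-aligned one. The function and menu names below are hypothetical, purely for illustration.

```python
import math

def predict_target(samples, targets):
    """Given recent pointer samples [(x, y), ...] and candidate menu-item
    positions, return the target whose direction best matches the
    pointer's current motion (highest cosine similarity)."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    speed = math.hypot(dx, dy)
    if speed == 0:
        return None  # no motion yet, nothing to predict
    best, best_score = None, -1.0
    for name, (tx, ty) in targets.items():
        vx, vy = tx - x1, ty - y1
        dist = math.hypot(vx, vy)
        if dist == 0:
            return name  # pointer is already on this target
        # cosine of the angle between the motion vector and the
        # vector from the pointer to the target
        score = (dx * vx + dy * vy) / (speed * dist)
        if score > best_score:
            best, best_score = name, score
    return best

menu = {"Home": (0, 100), "Search": (100, 0), "Settings": (100, 100)}
print(predict_target([(0, 0), (5, 1)], menu))  # motion is mostly rightward -> Search
```

A real implementation would smooth the motion over many samples and update the prediction continuously as the user moves, but the core inference step looks roughly like this.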

Josephson demonstrates by using his hands to navigate through various menu options on Netflix and later in an automobile setting. The motions come across as seamless but how easy are the gestures to learn?

“Well if it was gestures where you had to learn a sign language it would be very difficult,” Josephson tells me.

“We are not using gestures, we’re using natural motion. The better the sensors get, the easier it will be to draw a picture on the screen and move things where you want to,” adds Fiur.

I am told that learning to use “natural motion” in this context only takes about “20 seconds”. They tested how long it took with a small group of people from a variety of backgrounds.

“Every single one of them except for one got it in less than six seconds. The other one took about 15 seconds. And all of them said: this makes my new iPhone feel old,” says Fiur.

Naturally, comparisons will be made with other gesture-based technologies on the market. At Oblong Industries, I witnessed the technology used in the movie Minority Report in action. But there are some key differences between the two interfaces. Oblong uses a wand to drag and drop assets between screens, but it lacks the predictive-motion capability. It is also clear the two companies are targeting different markets: Oblong Industries sees its technology being useful in the workplace for presentations, meetings and real-time decision-making, while Quantum Interface is targeting the wearables, VR, AR and automotive industries.

“Obviously they are doing some really great things over there, but this is additive and in some ways more advanced than what they are doing. They would be a potential licensing partner for us and would likely be interested in licensing our technology. The innovations we started even pre-date them,” says Fiur.

I’m not sure this is entirely accurate. Josephson says he started working on this technology in the 2000s, which was roughly when John Underkoffler, CEO of Oblong Industries, was working on his.

But without getting into the race argument, Fiur says it’s only in recent years that “the sensors have improved enough” to get the technology to market in a cost-effective way.

“The company officially launched at CES in 2014; Jonathan spent much of the decade before that creating all this while still working his day job. Now it’s been revised, upgraded and fine-tuned to specific market needs, where we see this technology solving some ongoing problems in the marketplace,” says Fiur.

Josephson and Fiur are also excited about the applications of this technology in the automobile industry.

“When you add in the eye-tracker and you can use a touch-pad for confirmation or even your voice - we really have the ultimate killer user interface for the car,” says Josephson as he shows me the interface being used in the car context.

“In an automobile environment the goal is to keep your eyes on the road. So we give a quick glance in the direction of something, which is faster than having to look at it, and then we put our eyes back on the road and confirm it with [touching the touch pad] that is on the steering wheel. Somebody in the backseat doesn’t have to touch anything and can still control what is on in the front. This is what one of the automobile manufacturers wants. They want the passengers in the rear to be able to control the infotainment without touching it,” says Josephson.
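The glance-then-confirm flow Josephson describes separates nomination (a quick look) from activation (a tap on the wheel), so the driver's eyes stay on the road. A toy model of that two-step interaction might look like this; the class and control names are hypothetical, not Quantum Interface's API.

```python
class GlanceConfirmUI:
    """Toy model of a glance-then-confirm flow: a brief glance (as
    reported by an eye tracker) nominates a control as the candidate,
    and a tap on a steering-wheel touch pad activates it."""

    def __init__(self, controls):
        self.controls = set(controls)  # e.g. {"radio", "nav", "climate"}
        self.candidate = None          # last control glanced at

    def glance(self, control):
        # Eye tracker reports a brief gaze toward a known control;
        # unknown targets are ignored rather than clearing the candidate.
        if control in self.controls:
            self.candidate = control

    def confirm(self):
        # Touch-pad tap activates the candidate and resets the state,
        # so a stray second tap doesn't re-trigger the same control.
        selected, self.candidate = self.candidate, None
        return selected

ui = GlanceConfirmUI({"radio", "nav", "climate"})
ui.glance("radio")   # driver glances at the radio widget
print(ui.confirm())  # tap on the wheel activates it -> radio
```

Splitting selection across two cheap actions like this is a common safety pattern in automotive HMI design: neither a glance nor a tap alone can trigger anything by accident.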

I ask about Apple and Google becoming increasingly involved in the automobile space. Are car manufacturers worried?

“That’s everybody’s fear, right? I worry about that when I’m drinking coffee in the morning!” Josephson laughs.

“They know that Google and Apple are coming for the cockpit of the car. And they are trying to take the user experience away from the car manufacturers. The smart ones are saying before we give away that really valuable piece of real estate to technology companies, maybe we should try to do something ourselves.”

“So they are turning to us and saying if we can get one of these in our automobiles and build this up in our infotainment offerings, customers won’t need to plug in their phones as a substitute. At the same time they don’t want to make those guys mad so it’s in the balance right now. If they can get our stuff in first they can continue to own that experience which is really important to them,” Josephson concludes.


Ayesha Salim is Staff Writer at IDG Connect
