We Live In The Future: Better Entertainment Through Gesture Controls

Welcome back to “We Live in the Future,” the column where we tackle the new and crazy ways science is changing how we think about transportation. These posts skew toward longreads, so consider yourself warned: here there be science.

[Image: Tom Cruise operating the gesture-controlled computer in Minority Report]

In the summer of 2002, our world changed. For so long, we had been tethered to mice and desktops whenever we wanted to use a computer, and all of a sudden, here came Tom Cruise, using a computer by waving his hands around and wearing awesome gloves. It’s only taken 11 years, but we might finally be close to getting some of that sweet, sweet gesture-control computing.

False Starts

Arcade games have had an element of motion control for a long, long time. I’m not just talking about playing Silent Scope at your local arcade establishment; I’m talking Duck Hunt, the B-side to Super Mario Bros., and the famous NES Zapper light gun. Nintendo was arguably the first company in the current wave to bring motion control to the fore when it introduced the Wii, relying on a controller that was more remote control than the average A/B/X/Y gamepad.

Then came the Kinect, and we thought, “Finally. Motion control has reached its usability peak.” But we were mistaken. The Kinect, like the Wii before it, has a lot of trouble understanding what you’re actually doing (witness the number of times I screamed at my Wii, “NO, YOU IDIOT, I’M SCRATCHING MY ARM, NOT TRYING TO SERVE THE TENNIS BALL! GET IT TOGETHER, WII SPORTS.” It was a spectacle).

Enter the Iron Man Movies

Even slightly pre-Kinect, we were watching Tony Stark up the gesture-controlled computing game. 2008’s Iron Man features Marvel’s eccentric billionaire (a personality type that is apparently another running theme of this column…) pulling up entire 3D schematics, moving them around with his bare hands like hard-light sculptures, and throwing them into a literal/digital trash can.

[Image: Tony Stark manipulating holographic schematics in Iron Man]

Nerds everywhere salivated. This was the future. We could create objects out of light in a free-standing, three-dimensional space, and manipulate them to do our bidding. This is indeed what the human race was born to do.

So where is it?

Building the Better Touchscreen

A company called Ubi Interactive recently announced that it would be selling an app for Windows 8 and the Kinect for Windows (I’m apparently crazy behind in even knowing there were two separate Kinects?) that enables touchscreen computing on any surface. The app is, perhaps unsurprisingly, called Ubi. Basically, you point a projector and a Kinect at the surface you want to use as a touchscreen and hook both up to your Windows 8 PC. Then you start the Ubi software, run its calibration, and you’re good to go.
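(For the tech-curious: Ubi hasn’t published its internals, but the basic depth-camera trick behind any projector-plus-Kinect touch surface is easy to sketch. Calibrate the empty surface first, then flag anything hovering a fingertip’s width above it as a touch. The Python below is a toy illustration under those assumptions; every name and threshold in it is made up, not Ubi’s actual code.)

```python
# A rough sketch of how a Kinect-plus-projector "touchscreen" might detect
# a touch. This is NOT Ubi's actual code -- just the general depth-camera
# idea: calibrate the empty surface, then flag any pixel that sits about a
# finger's width above it.

import numpy as np

TOUCH_MIN_MM = 5    # finger must be at least this far off the surface...
TOUCH_MAX_MM = 25   # ...but no farther, or it's a hover, not a touch

def calibrate(background_frames):
    """Average several depth frames of the empty surface (in millimeters)."""
    return np.mean(np.stack(background_frames), axis=0)

def find_touches(depth_frame, surface):
    """Return (row, col) pixels where something hovers just above the surface."""
    height = surface - depth_frame                   # distance above the surface
    mask = (height > TOUCH_MIN_MM) & (height < TOUCH_MAX_MM)
    return np.argwhere(mask)

# Toy demo with a fake 4x4 depth map: surface at 1000 mm, one "fingertip"
surface = calibrate([np.full((4, 4), 1000.0)])
frame = np.full((4, 4), 1000.0)
frame[2, 1] = 985.0                                  # fingertip 15 mm above the pad
print(find_touches(frame, surface))                  # -> [[2 1]]
```

Mapping those touched pixels back to on-screen coordinates is the calibration step’s job, which is presumably what Ubi is doing when you point and tap during setup.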

I haven’t had a chance to play around with the interface, the most obvious reason being that it costs $150 and up, with a pricing scheme based on how many touchpoints you want to use. For the baseline $150, you get one touchpoint to move around the screen, click on things, etc. For two touchpoints and the ability to pinch-and-zoom (really the hallmark of a touchscreen interface, as opposed to a glorified mouse), the price suddenly more than quintuples, to $799. The big boss dog package (technically the “Enterprise” package, which I assume means “small businesses”) is $1,499 for 20 touchpoints, which seems like too many people huddled around one screen for my comfort, but hey. When in Rome, I suppose.

And This Has… What? To Do with Transportation, Exactly?

Okay, so I’m a little out of the loop in terms of car stereo systems and whatnot, since my car was made way back in 2008 and all it has is an AUX in, a six-disc changer, and the radio. This must be what it’s like to live in the third world, am I right? (I am not right.) But in more modern vehicles with infotainment systems like MyFord Touch, automakers have been struggling to strike a balance between functionality and making things as hands-free and safe as possible.

<Personal aside> I should put in a disclaimer that I don’t personally believe hands-free stuff is safer, though there are studies to that effect. I get annoyed easily when Siri-type voice-command programs don’t understand my voice, presumably because of my obscenely crisp diction. Either way, I then get mad that I can’t get my music to play, or a map to come up, or what have you, and I drive more recklessly. It doesn’t help anyone. </Personal aside>

But what if they could project your infotainment display onto the console? Or onto a black pad behind your gear shift?

A lot has to change in the organization of the average vehicle’s dashboard displays to make them truly non-distracting, but let me paint a mental picture here.

Nick Paints a Mental Picture

So you’ve got a center display in your vehicle, like a MyFord Touch screen or any of the various LCD screens in higher-end cars, but (game changer) you’ve also got the same screen projected into a heads-up display (HUD) on your windshield, right above your dashboard. You’ve got a black pad with haptic-feedback buttons beneath the rubber of the pad itself, located between your gear shift and your center console, which puts it in a comfortable spot to rest your hand while using the system. And you’ve got a Kinect mounted in the ceiling of the car directly above the pad, possibly built into the underside of the rearview mirror.

Now you can use Ubi’s touchpoint system to control your infotainment. You can click through to your music, or the navigation tab, all without having to hunt for a button and without taking your eyes off the road. The first big issue here is making the touchpad textured enough that you can tell where your finger is at any given time without putting a cursor icon in the HUD; otherwise you might accidentally click into all kinds of things before you land on what you intended. The other is the HUD itself, which I think is only a matter of time: some vehicles already display MPH and other related info on the windshield, but there are a lot of hoops companies would have to jump through to prove a full-screen version is safe and non-distracting.
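To make the pad idea concrete, here’s one toy way the software side could work: carve the pad into a handful of big, texture-delineated zones, so a tap is unambiguous even without a cursor in the HUD. This is a hypothetical sketch; the zone names, pad dimensions, and grid layout are all invented for illustration, not any automaker’s real API.

```python
# Hypothetical sketch of the pad-to-HUD idea above: divide the haptic pad
# into a coarse grid of zones, so a tap lands on "music" or "nav" without
# the driver ever needing a cursor. Nothing here is a real automotive API.

PAD_WIDTH_MM, PAD_HEIGHT_MM = 120, 80

# 2 rows x 3 columns of chunky, thumb-sized zones
ZONES = [
    ["music", "nav", "phone"],
    ["volume_down", "home", "volume_up"],
]

def zone_for_touch(x_mm, y_mm):
    """Map a raw touch coordinate on the pad to a named zone."""
    col = min(int(x_mm / (PAD_WIDTH_MM / 3)), 2)
    row = min(int(y_mm / (PAD_HEIGHT_MM / 2)), 1)
    return ZONES[row][col]

# A tap near the top-left corner selects the music tab on the HUD
print(zone_for_touch(10, 10))   # -> "music"
print(zone_for_touch(115, 70))  # -> "volume_up"
```

The point of the coarse grid is exactly the texture problem above: if each zone is big enough to find by feel, the driver never needs visual confirmation of where their finger is.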

So What’s Next?

We’re still a ways from actual gesture control for our computers. We still have to click a physical space, even if that physical space is a projected one that only exists as light.

Honestly, I’m really not going to be satisfied until they can project a steering wheel as a hologram that I can control while driving and then make it disappear when I turn the vehicle off so I don’t keep barking my knees on the darn thing.

So fingers crossed.


About Nick Philpott

Nick Philpott is the Chief Storyteller at Lebanon Ford. He believes that every vehicle and driver has a unique story to share. You can contact him directly at (513) 932-1010 or nphilpott@lebanon-ford.com.