Published 7 years ago · 4 mins read
You probably already know Kinect, the gesture controller developed by Microsoft for its Xbox 360 gaming platform. If you've ever tried it, you were likely impressed by how "different" it felt: choosing a film by waving your hands, dancing, or even playing sports seems much cooler and more fun, especially with somebody else in the room. However, you probably also noticed how imprecise the controller is. It can't recognize finger movements, for example, and body tracking is sometimes not really "sharp". It has nevertheless been praised, with good reason, as a big innovation in the field of Human-Computer Interaction; still, we're far away from the gesture-controlled dashboard seen in the sci-fi movie Minority Report.
At least for the moment …
Enter Leap Motion
A few weeks ago I discovered Leap Motion, a gesture-based controller that promises to revolutionize the way we interact with digital devices. I was astonished by the video demo published on their website, where the controller is claimed to have an accuracy of 1/100th of a millimeter. Yes, you read that right: 0.01 mm! My first reaction was skepticism: it's easy to claim your product is hyper-fast and super-precise in a completely controlled demo, but how would it work in real life? So I looked around the web for opinions, and every review I found, from Wired to The Verge and Engadget, was almost uniformly enthusiastic.
Leap Motion demo video
Everyone who used the controller started out as doubtful as I was, but as soon as they tried it they were mesmerized by the impressive precision and, above all, the imperceptible lag. According to the makers, the latency between the movement of your hands and what appears on screen is effectively zero. That's huge compared to the Microsoft Kinect (150 ms on average) or even touch devices (around 80-100 ms), where there's a noticeable delay when you wave your hands, or move your fingers, too fast.
Leap Motion is a small USB device that connects to any PC or Mac; it's currently in the pre-order phase at a cost of $69.99. The plans of the company, which just raised $30 million, are to integrate the controller directly into computers, tablets, smartphones, and any other digital device you can imagine. It seems Asus will be the first to build one into its PCs.
Stunning… Then what?
Like multi-touch before it, a gesture controller needs to prove itself in a few well-defined areas before it can become ubiquitous. Only then can the innovation find its way into our everyday lives, as multi-touch did. Starting with the iPhone and a few high-end smartphones, multi-touch proved reliable enough for simple applications. Then, after a few years of use, many realized it could also be employed in more traditional domains, such as photo editing or writing software. Now multi-touch displays are everywhere, from your notebook to your car's control panel.
So, what are the most interesting early-adopter use cases for our shiny new gadget? One of the first that comes to mind is any situation where you need to use a device but can't physically interact with its input hardware, whether a touch screen or a mouse: maybe your hands are dirty or wet, or you simply can't reach the controls. Think of a smart shower panel operated entirely through gestures, letting you change the water temperature, control the flow, play music, or turn on the TV (yes, shower TVs already exist!). Direct interaction through multi-touch is impossible there, and waterproof physical controls may be far less usable.
And what about medical devices that a surgeon could operate with gestures during surgery, without touching anything? Complex interaction with, and manipulation of, 3D models on a screen is another interesting case where a NUI could improve the overall experience.
And, of course, games! The Leap Motion is precise and fast enough to allow a direct and immersive gaming experience. First-person shooters and arcade games like Fruit Ninja would be even more fun with such a controller. Kinect already brought NUIs to gaming, and the results, despite the not-so-stunning performance of its sensor, are really encouraging, particularly for multiplayer games.
Of course, there are still some doubts about this interaction paradigm. The lack of physical feedback is one of the biggest drawbacks of gestural interfaces: it's one thing to grab a gun in a game and feel it in your hands, quite another to wave your hands in the air pretending that what you see on screen is "real".
Above all, just as happened with touch, a common interaction paradigm should emerge so that gestures are coherent and learnable. Much of the help here will have to come from the application itself, which should provide continuous feedback to tell users what they can, and cannot, do.
Leap Motion, thanks to its impressive features and really competitive price, is generating a lot of buzz in the tech world. As soon as designers and developers are able to deliver compelling applications for it, it will jump beyond the early-adopter circle to a mainstream audience. In the coming months, if you see somebody waving their hands in front of a computer, don't assume they're attempting some kind of magic trick to make it work.
Most probably, they're using the technology that will make multi-touch obsolete.