Natural User Interfaces are generating serious buzz these days – and not without reason. Natural User Interfaces (NUIs) literally change the way people interact with computers. They create new patterns, new means of communication, and new business opportunities. We can now play football without a controller. We can create personalized 3D models of the human body. We can accurately track finger movements. Computers can even understand our voice and what we mean. A few days ago, Microsoft announced HoloLens: an innovative way of viewing and interacting with holograms.
Guess what: if I were writing this article 5 years ago, all of that would have been pure Science Fiction. Today, all of these possibilities are absolutely feasible using inexpensive 3D sensors and smart software. Many universities already include NUI classes in their curricula, and more and more Computer Science students are familiarizing themselves with the technology of tomorrow.
If you have been following this blog for a while, you already know that I’ve been using Kinect, Leap Motion, and RealSense in client projects. During the past few weeks (including the recent Christmas holidays), I have been playing with a variety of 3D sensors. Throughout this article, I’ll present the dominant NUI sensors and how you can utilize them in your company or startup!
(Yep, that’s a picture of my office during the recent holidays…)
How 3D sensors work
A “sensor” is a hardware device that perceives the physical world and translates it into bits and bytes. For example, a common RGB camera is a sensor that interprets physical light into red, green, and blue color values. Computer software can then read those values and display the corresponding image. A 3D sensor uses infrared beams and depth processors to measure the distance of physical objects. Almost every modern NUI sensor uses one infrared camera or a combination of multiple cameras.
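To make this concrete, here is a minimal sketch of how software might read depth data. The frame layout (16-bit millimeter values, 0 meaning “no reading”) is my own illustrative assumption, not any particular sensor’s format:

```python
# Illustrative sketch: reading a depth frame and finding the nearest object.
# Assumption: each pixel is a distance in millimeters; 0 means "no reading".

def nearest_object_mm(depth_frame):
    """Return the distance (in mm) of the closest valid pixel, or None."""
    valid = [d for d in depth_frame if d > 0]  # skip pixels with no depth data
    return min(valid) if valid else None

# A tiny 3x3 "frame", flattened row by row (values in millimeters):
frame = [0, 1200, 1150,
         980, 0, 1020,
         1500, 1480, 0]

print(nearest_object_mm(frame))  # 980
```

A real depth stream is just this idea at scale: hundreds of thousands of such distance values, 30 or more times per second.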
Of course, one of the most significant innovations of these sensors is their affordable cost. You can acquire a capable device for just 100–200 bucks.
Let’s have a closer look at the most popular sensors.
Manufacturer: Microsoft
Kinect is currently the dominant solution for real-time body tracking. It combines an HD color camera, a depth processor, and an infrared sensor to effectively track up to 6 human bodies. Moreover, it can recognize a few thousand face points. The developer API is mature enough for business opportunities. Kinect originally debuted as a gaming accessory for the XBOX console. Soon after its initial release, hackers from all around the world started building Kinect apps on PCs and Macs. Four years after the first version, Microsoft announced its successor, Kinect version 2, with higher-resolution cameras and improved accuracy. Kinect version 2 is the trusted solution for NUI business apps and games. You can submit your apps to the Windows Store.
Pros:
- Complete human body tracking – supports 25 joints
- Accurate face tracking – access to thousands of facial points
- Detection of facial expressions
- Reliable voice recognition
Cons:
- Requires a powerful Windows PC
- Device setup (wires, AC adapter, hub) should be simplified
Getting started:
- Purchase a Kinect for Windows sensor (or a Kinect for XBOX sensor with an adapter)
- Download the SDK
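As a taste of what tracked joints let you compute, here is a self-contained sketch that measures the angle at a joint (say, the elbow) from three 3D positions. The coordinates are made up for illustration; a real app would read joint positions from the Kinect SDK’s body-tracking API:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by the segments b->a and b->c."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

# Hypothetical joint positions in meters (camera space), for illustration only:
shoulder = (0.0, 0.5, 2.0)
elbow    = (0.0, 0.2, 2.0)
wrist    = (0.3, 0.2, 2.0)
print(round(joint_angle(shoulder, elbow, wrist)))  # 90
```

Simple measurements like this are the building blocks of physiotherapy and fitness apps built on body tracking.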
Manufacturer: Leap Motion
Leap debuted in 2012, offering the most accurate hand-tracking solution at the most affordable price. Leap uses two monochromatic infrared cameras and three LEDs, providing up to 300 frames per second. The sensor can recognize hand states, fingers, and joints. The SDK also has native, out-of-the-box support for basic gestures. Using Leap, you can control your PC without touching the keyboard. The gaming and medical fields will probably benefit from Leap in the future. If you are already building a Leap Motion application, consider uploading it to Leap’s online app store.
(Orthosense project, by Kinetisense)
Pros:
- Complete finger and wrist tracking – supports 15 joints
- Supports all major operating systems (Windows, Mac, Linux)
Cons:
- The sensor normally loses accuracy when you rotate your hands 180 degrees
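To illustrate the kind of logic finger tracking enables, here is a small, self-contained sketch of a pinch detector. The 30 mm threshold and the fingertip coordinates are my own illustrative assumptions, not values or calls from the Leap SDK:

```python
import math

PINCH_THRESHOLD_MM = 30  # illustrative threshold, not an official Leap value

def is_pinching(thumb_tip, index_tip):
    """True when the thumb and index fingertips are close enough to count as a pinch."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_MM

# Fingertip positions in millimeters (Leap reports positions in mm):
print(is_pinching((10, 200, 5), (22, 210, 8)))   # tips ~16 mm apart -> True
print(is_pinching((10, 200, 5), (80, 250, 40)))  # tips far apart -> False
```

A real application would feed fingertip positions from each tracked frame into a check like this to trigger clicks or grabs.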
Manufacturer: Occipital
Occipital grasped the need for mobile 3D cameras early on. While the rest of the world is waiting for Project Tango, Occipital released the Structure sensor: the first 3D camera that needs no power supply and easily connects to your iPad Air or iPad mini tablet. The sensor gives access to depth and infrared data, and also provides a method to map the depth data to the RGB data of the iPad’s camera. Occipital also maintains OpenNI, a sensor-agnostic framework that enables human body tracking. OpenNI has not been fully ported to iOS yet, but we are eager to see the end result.
Pros:
- Lightweight, mobile sensor
- No need for a constant power supply
- OpenNI (when ready)
- “Hacker” cable to connect to a PC or Mac computer
Cons:
- No actual body tracking on the iPad so far
- No face tracking
Manufacturer: Intel
Intel could not stay out of the NUI game. RealSense, although still in beta, is an all-in-one solution for face tracking, hand tracking, and voice recognition. RealSense is the best option if you need both face and finger tracking. It is not as accurate as Kinect or Leap; however, it does both decently, and at a lower price. Moreover, RealSense is probably the prettiest and most convenient sensor: you can place it comfortably on top of your screen or on your desk.
Pros:
- Hand, finger, and face tracking
- Detection of facial expressions
- Reliable voice recognition
- The awesome commercials, starring Sheldon Cooper
Cons:
- Accuracy – not suitable when you need high precision
Getting started:
- Orders will be available soon
- Download the SDK
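To show how voice recognition results might drive an app, here is a tiny, self-contained command dispatcher. The command words, and the idea of receiving the recognized phrase as plain text, are illustrative assumptions on my part, not the RealSense SDK’s actual API:

```python
# Hypothetical command grammar; a real app would receive the recognized
# phrase from its speech-recognition callback.
COMMANDS = {
    "open": lambda: "opening menu",
    "close": lambda: "closing menu",
    "photo": lambda: "taking photo",
}

def handle_phrase(phrase):
    """Run the first known command word found in the recognized phrase."""
    for word in phrase.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]()
    return None  # no command word recognized

print(handle_phrase("Please take a photo"))  # taking photo
```

Keeping the grammar to a handful of distinct words is also a practical way to make recognition more reliable.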
Manufacturer: Thalmic Labs
Myo is quite different from the aforementioned sensors. Why? Because it’s wearable. Myo is an armband that communicates with your computer or smartphone via Bluetooth (or USB). The device integrates three sensors: a gyroscope, a magnetometer, and an accelerometer. By measuring the electrical activity of your muscles, it can understand how you move and rotate your arm or fist! Myo also has its own app store, called the Myo Market.
Pros:
- Accurate muscle motion tracking
- Connects with your PC, Mac, iOS, or Android device
Cons:
- Uncomfortable for everyday use
- Most use-case scenarios require additional equipment (such as drones or smart-home devices)
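Armbands like Myo typically stream their orientation as a quaternion. Here is a self-contained sketch of the standard quaternion-to-Euler conversion you would apply to such data; the quaternion values below are illustrative, and a real app would receive them from the device’s SDK:

```python
import math

def roll_pitch_yaw(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians."""
    roll  = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp to [-1, 1] to guard asin against floating-point noise:
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw   = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw

# The identity quaternion means no rotation at all:
print(roll_pitch_yaw(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```

From these angles, an app can decide, say, that the wearer has rotated their fist past some threshold and trigger an action.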
Which one should I use?
Everything depends on what you need to build. If you need accurate body or face tracking, Kinect is the way to go. If mobility is your primary concern, choose Structure. I strongly believe that if Structure ever manages to resurrect OpenNI, it will open a new road for mobile gaming. Just imagine an iPad (the most popular tablet) performing body tracking…
If your project needs reliable finger tracking and finger range-of-motion, Leap is a decent solution. RealSense, which is currently in beta, could be used as a cheap alternative to Kinect or Leap. The current version provides decent face- and hand-tracking mechanisms, but do not expect it to be better than Kinect or Leap.
Finally, it seems that Myo is creating a whole new market in the wearable computing industry. It still lacks real-world scenarios, but its capabilities seem quite impressive so far.
The show must go on!
So, what do you think? Which sensor are you using and which one are you planning to use for your project? Let me know in the comments below.
PS: Developer information
If you are planning to develop for any of the above sensors, these are the languages, frameworks and platforms each SDK supports:
|Sensor|Operating systems|
|---|---|
|Kinect v2|Windows 8+ (x64), Windows Embedded 8|
|Leap Motion|Windows 7+, MacOS X 10.7+|
|Structure|iOS (any OpenNI-supported language)|
|RealSense|Windows 8.1+ (x64)|
|Myo|Windows 7+, MacOS X 10.8+, iOS, Android|
PS 2: Extreme Reality
I recently discovered Extreme Reality, an amazing software product that converts any plain camera into a body-tracking sensor! Using their cross-platform SDK, you can track human body joints using your PC’s webcam or your iOS or Android device’s camera. Extreme Reality may not (yet!) be as accurate as Kinect; however, there seems to be really huge potential here. Download their trial SDK and start developing your programs without purchasing any additional hardware!
A few days ago, I was invited to talk about NUIs and the way they affect our lives at the National Game Developers Conference. I also gave a lecture to fourth-year students at the Athens University of Economics and Business. Here are the slides of my presentations, for your reference:
* I would like to thank our partners at Kinetisense for providing me with a RealSense device. I wish them good luck with their new product, Orthosense.
Hi, nice collection of info! You have covered some topics I had never heard of. Thanks for the nice info on NUIs – it really is a war!
Thank you. More NUI Wars posts coming soon!
LeapMotion has a Java API.
Thank you very much for your comment. I have updated the post 🙂
How about Softkinetic’s DepthSense (ToF) devices?
Hi Zofia. I do not own a Softkinetic DepthSense device. What I present in this article is a comparison of sensors I have hands-on experience with. Would be glad to write an article if I get one. Thank you for your comment.