
A few days ago, I had the honor of speaking at the IT Pro | Dev Connections 2020 conference. As a software engineer, I love natural user interfaces, body tracking, and motion analysis, so it’s no surprise that my presentation focuses on the Azure Kinect. We explore how the interaction between humans and machines evolved until we reached the era of Kinect and natural user interaction. You’ll have the chance to explore the anatomy of the sensor and investigate its discrete components. Amazing things happen when great hardware meets Artificial Intelligence: our coding session includes body tracking, angle calculations, and 3D point clouds! Last but not least, my friend Georgia Makoudi connects Kinect to the Cloud and enhances it with object-tracking capabilities!

You can watch our 45-minute presentation here. I’m also attaching my presentation and notes below.

We are starting our journey with one of the very first computers – ENIAC. Programming the ENIAC was not trivial. Honestly, the original programmers were truly heroes.

ENIAC Women Programmers

Computers evolved, and so did the way humans interact with them. Computer Scientists invented revolutionary means of interaction, such as the keyboard, mouse, and touch screen. In the meantime, however, Science Fiction was always a step ahead. Iconic movies & series such as Star Wars, Star Trek, and Minority Report brought us to a world where people interacted with computers via holograms, gestures, and voice commands. What a marvelous world! For many decades, humanity believed that was only a fairy tale.

That’s until Microsoft revealed the Kinect. Unlike what you might think, Kinect is not a single device; instead, it’s a bundle of multiple components grouped together. These components include:

  • 1 video camera
  • 1 depth sensor
  • 7 microphones
  • 1 accelerometer & 1 gyroscope
Azure Kinect - Sensor Anatomy

You can configure the video camera to output resolutions from 720p up to 4K. Internally, the output is simply an RGBA array of bytes.

Azure Kinect Color Frame Pixels
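To make the “array of bytes” idea concrete, here is a minimal sketch of how you could read a single pixel out of a raw color frame, assuming a flat, row-major buffer with 4 bytes per pixel (the exact channel order depends on the image format you request from the SDK; the demo frame below is made up for illustration):

```python
def get_pixel(frame: bytes, width: int, x: int, y: int):
    """Return the 4 channel bytes of pixel (x, y) from a flat buffer."""
    offset = (y * width + x) * 4  # row-major layout, 4 bytes per pixel
    return tuple(frame[offset:offset + 4])

# Tiny 2x2 demo frame: four pixels with distinct channel values.
width, height = 2, 2
frame = bytes([
    10, 20, 30, 255,    # pixel (0, 0)
    40, 50, 60, 255,    # pixel (1, 0)
    70, 80, 90, 255,    # pixel (0, 1)
    100, 110, 120, 255  # pixel (1, 1)
])

print(get_pixel(frame, width, 1, 1))  # → (100, 110, 120, 255)
```

The same offset arithmetic works for any packed 4-channel format; only the meaning of the four bytes changes.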

The truly revolutionary module is the depth sensor: a time-of-flight infrared camera that perceives the world in 3D. It supports both a Wide and a Narrow field of view (WFOV and NFOV).

Azure Kinect Depth - NFOV and WFOV
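Under the hood, a depth frame is just another buffer: each pixel is a 16-bit unsigned integer holding the distance from the camera in millimeters (with 0 meaning “no data”). A minimal sketch of decoding one value, assuming that little-endian layout:

```python
import struct

def depth_at(buffer: bytes, width: int, x: int, y: int) -> int:
    """Return the depth at pixel (x, y) in millimeters (little-endian uint16)."""
    offset = (y * width + x) * 2  # 2 bytes per pixel
    return struct.unpack_from("<H", buffer, offset)[0]

# 1x2 demo buffer: one pixel at 1500 mm, one invalid pixel (0).
buffer = struct.pack("<HH", 1500, 0)
mm = depth_at(buffer, 2, 0, 0)
print(mm / 1000.0, "meters")  # 1.5 meters
```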

The combination of depth data and cutting-edge Artificial Intelligence produces magic. Here’s how the Kinect SDK estimates the coordinates of the human body joints in three-dimensional space…

Azure Kinect Body Tracking 3D

… allowing us to see the human body from multiple angles!

Azure Kinect Body Tracking Unity3D – 3D Skeletons
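Once you have the 3D joint coordinates, the “angle calculations” from our coding session boil down to vector math. Here is a sketch of measuring the angle at a middle joint (say, the elbow) formed by its two neighbors; the joint positions below are made up for illustration:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees), given 3D positions a, b, c."""
    ba = [a[i] - b[i] for i in range(3)]  # vector from b to a
    bc = [c[i] - b[i] for i in range(3)]  # vector from b to c
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

shoulder = (0.0, 1.0, 0.0)
elbow    = (0.0, 0.0, 0.0)
wrist    = (1.0, 0.0, 0.0)
print(joint_angle(shoulder, elbow, wrist))  # 90.0
```

The same function works for knees, hips, or any other triplet of tracked joints.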

Things can get even more exciting. What would happen if you combined the color, depth, and body-tracking data all at once? Well, here is a 3D point cloud that will blow your mind:

Azure Kinect Point Cloud
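Conceptually, a point cloud is built by back-projecting every depth pixel into 3D space using the camera intrinsics. Here is a simplified pinhole-camera sketch (the real SDK also corrects for lens distortion, and the fx, fy, cx, cy values below are made up for illustration):

```python
def unproject(x, y, depth_mm, fx, fy, cx, cy):
    """Map a depth pixel (x, y) to a 3D point in meters (pinhole model)."""
    z = depth_mm / 1000.0
    return ((x - cx) * z / fx, (y - cy) * z / fy, z)

# A pixel at the principal point maps straight ahead of the camera.
point = unproject(x=320, y=288, depth_mm=2000,
                  fx=500.0, fy=500.0, cx=320.0, cy=288.0)
print(point)  # (0.0, 0.0, 2.0)
```

Coloring each 3D point with the matching color pixel is what produces the point cloud shown above.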

But wait, there’s more! Azure Kinect is named “Azure” for a reason. Azure is Microsoft’s powerful cloud platform, and part of its functionality is a set of APIs that provide remote AI Computer Vision services. Connect Kinect to Azure, and you’ll have a ton of additional functionality, such as real-time object detection!

Azure Cognitive Services - Computer Vision Object Detection (Living Room)
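The cloud side returns its detections as JSON. Here is a sketch of filtering such a result by confidence; the sample payload below is illustrative only (check the Computer Vision API reference for the exact schema of the real response):

```python
import json

# Illustrative payload shaped like an object-detection response.
sample_response = json.dumps({
    "objects": [
        {"rectangle": {"x": 25, "y": 43, "w": 120, "h": 200},
         "object": "person", "confidence": 0.91},
        {"rectangle": {"x": 300, "y": 180, "w": 90, "h": 60},
         "object": "laptop", "confidence": 0.78},
    ]
})

def confident_objects(payload: str, threshold: float = 0.8):
    """Return (label, bounding box) pairs at or above the confidence threshold."""
    data = json.loads(payload)
    return [(o["object"], o["rectangle"])
            for o in data.get("objects", [])
            if o["confidence"] >= threshold]

for label, box in confident_objects(sample_response):
    print(label, box)  # person {'x': 25, 'y': 43, 'w': 120, 'h': 200}
```

You could overlay these rectangles onto the Kinect color frame to annotate what the camera sees in real time.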

One more thing…

After leaving my job at Microsoft, I have been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to take your business to the next level, get in touch with me.

Sharing is caring!

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep coding!

Vangos Pterneas


Vangos Pterneas is a professional software engineer and an award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He’s obsessed with analyzing and modeling every aspect of human motion using Computer Vision and Mathematics. Kinect programming started as a hobby and quickly evolved into a full-time business. Vangos shares his passion by regularly publishing articles and open-source projects that help fellow developers understand the fascinating Kinect technology.
