When Microsoft killed the original Kinect for Xbox peripheral, I was deeply disappointed as a Kinect developer. Upon introducing HoloLens 2 in 2019, though, Microsoft had a huge surprise for its fans: a new, powerful Kinect device named Azure Kinect.

It was a genuine “shut up and take my money” moment. The Azure Kinect is a marvelous AI-powered device that allows developers to stream color and depth data, detect human body joints, recognize speech, and much more.

I was one of the lucky few to receive an Azure Kinect sensor early on. Today, I am going to show you the anatomy of the Azure Kinect device and explore its hardware components.

If you want to start writing code for the Azure Kinect, refer to my Masterclass articles.

Azure Kinect Anatomy

What if I told you that the Azure Kinect is not a single camera?

Indeed, Kinect is a bundle of separate hardware components assembled into one unit, each offering its own discrete functionality. In the photos below, I am highlighting every single module. More specifically, Kinect consists of:

  • 1 RGB Video Camera
  • 1 Depth Sensor
  • 7 Microphones
  • 1 Accelerometer + 1 Gyroscope (IMU)
  • 2 External Sync Pins
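
Before diving into each module, here is a quick taste of how you talk to the device through the Sensor SDK’s C API (k4a.h): the sketch below enumerates the connected devices, opens the default one, and prints its serial number. It assumes the Azure Kinect Sensor SDK is installed and a device is attached, and it keeps error handling to a minimum.

```c
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    // How many Azure Kinect devices are attached to this computer?
    uint32_t count = k4a_device_get_installed_count();
    printf("Found %u Azure Kinect device(s)\n", count);
    if (count == 0) return 1;

    // Open the first (default) device.
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device)))
    {
        printf("Failed to open the device\n");
        return 1;
    }

    // Read the serial number into a local buffer.
    char serial[256];
    size_t serial_size = sizeof(serial);
    if (k4a_device_get_serialnum(device, serial, &serial_size) == K4A_BUFFER_RESULT_SUCCEEDED)
    {
        printf("Serial number: %s\n", serial);
    }

    k4a_device_close(device);
    return 0;
}
```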

Kinect Front View

Looking at the device from the front, you’ll notice the RGB video camera and the infrared depth sensor.

Azure Kinect Sensor – Front
  1. RGB Video Camera
  2. Depth Sensor

Kinect Back View

Kinect connects to the computer via a dedicated USB-C port. It also needs a separate power source to provide its full depth-sensing capabilities.

Azure Kinect Sensor – Back
  1. External synchronization pin (out)
  2. External synchronization pin (in)
  3. Power cable port
  4. USB-C cable port

Kinect Top View

Yep, I disassembled the sensor so I could take that photo!

Azure Kinect Sensor – Top
  1. Microphone
  2. Microphone
  3. Microphone
  4. Microphone
  5. Microphone
  6. Microphone
  7. Microphone
  8. Inertial Measurement Unit

The RGB Video Camera

The RGB video camera is a traditional webcam, like the one on your laptop, only MUCH more powerful! It supports streaming from 720p all the way up to 4K video, and the aspect ratio can be configured programmatically to either 16:9 or 4:3.

Read more about the Azure Kinect Color streaming.
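
To make that configuration concrete, here is a rough sketch using the Sensor SDK’s C API: it requests the 4K (3072p, 4:3) color resolution and grabs a single BGRA frame. Swap the resolution constant for a 16:9 mode such as K4A_COLOR_RESOLUTION_1080P if you prefer; error handling is trimmed for brevity.

```c
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device))) return 1;

    // Enable only the color camera for this example.
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.color_format     = K4A_IMAGE_FORMAT_COLOR_BGRA32; // uncompressed pixels
    config.color_resolution = K4A_COLOR_RESOLUTION_3072P;    // 4096x3072, 4:3 aspect ratio
    config.camera_fps       = K4A_FRAMES_PER_SECOND_15;      // 3072p caps at 15 fps

    if (K4A_FAILED(k4a_device_start_cameras(device, &config))) return 1;

    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4a_image_t color = k4a_capture_get_color_image(capture);
        if (color != NULL)
        {
            printf("Color frame: %d x %d\n",
                   k4a_image_get_width_pixels(color),
                   k4a_image_get_height_pixels(color));
            k4a_image_release(color);
        }
        k4a_capture_release(capture);
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```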

The Depth Sensor

The Depth sensor is a Time-of-Flight camera that allows Kinect to “see” the world in 3D. It uses an infrared beam to measure the distance between physical points and the device. The Depth sensor supports Wide and Narrow fields of view.
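
Here is a minimal sketch of how the depth stream is typically read with the C API: it selects the narrow, unbinned depth mode (use K4A_DEPTH_MODE_WFOV_2X2BINNED for the wide field of view instead) and prints the distance, in millimeters, at the center of the frame.

```c
#include <stdio.h>
#include <stdint.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device))) return 1;

    // Enable only the depth camera for this example.
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED; // narrow field of view, 640x576
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;

    if (K4A_FAILED(k4a_device_start_cameras(device, &config))) return 1;

    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4a_image_t depth = k4a_capture_get_depth_image(capture);
        if (depth != NULL)
        {
            int width  = k4a_image_get_width_pixels(depth);
            int height = k4a_image_get_height_pixels(depth);

            // Each pixel is a 16-bit distance in millimeters.
            uint16_t *pixels = (uint16_t *)k4a_image_get_buffer(depth);
            uint16_t center  = pixels[(height / 2) * width + (width / 2)];
            printf("Distance at the center pixel: %u mm\n", center);

            k4a_image_release(depth);
        }
        k4a_capture_release(capture);
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```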

The combination of the Color and Depth data is what allows the AI software to track the human body joints.
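
The joint data itself comes from the separate Azure Kinect Body Tracking SDK (k4abt.h). The sketch below illustrates the general flow under a few assumptions: the device and cameras were already started with a depth mode enabled (as in the previous snippets), and error handling is omitted. A sensor capture is fed to the tracker, and the head joint of each detected body is read back.

```c
#include <stdio.h>
#include <k4a/k4a.h>
#include <k4abt.h>

// Assumes `device` is open and the cameras were started with `config`.
void track_one_frame(k4a_device_t device, k4a_device_configuration_t config)
{
    // The tracker needs the sensor calibration that matches the running modes.
    k4a_calibration_t calibration;
    k4a_device_get_calibration(device, config.depth_mode, config.color_resolution, &calibration);

    k4abt_tracker_t tracker = NULL;
    k4abt_tracker_configuration_t tracker_config = K4ABT_TRACKER_CONFIG_DEFAULT;
    if (K4A_FAILED(k4abt_tracker_create(&calibration, tracker_config, &tracker))) return;

    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        // Feed the capture to the tracker and wait for the processed result.
        k4abt_tracker_enqueue_capture(tracker, capture, 1000);
        k4a_capture_release(capture);

        k4abt_frame_t body_frame = NULL;
        if (k4abt_tracker_pop_result(tracker, &body_frame, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            uint32_t bodies = k4abt_frame_get_num_bodies(body_frame);
            for (uint32_t i = 0; i < bodies; i++)
            {
                k4abt_skeleton_t skeleton;
                k4abt_frame_get_body_skeleton(body_frame, i, &skeleton);

                // Joint positions are in millimeters, relative to the depth camera.
                k4a_float3_t head = skeleton.joints[K4ABT_JOINT_HEAD].position;
                printf("Body %u head at (%.0f, %.0f, %.0f) mm\n",
                       i, head.xyz.x, head.xyz.y, head.xyz.z);
            }
            k4abt_frame_release(body_frame);
        }
    }

    k4abt_tracker_shutdown(tracker);
    k4abt_tracker_destroy(tracker);
}
```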

The Microphone Array

If you are wondering how Kinect can understand your voice commands, have a look at its microphones. Microsoft has equipped the device with not one, but seven mics, allowing it to filter out noise and record crystal-clear audio.

The IMU

Kinect is aware of the world, but it’s also aware of itself. How is that possible? Similar to your smartphone, Kinect features an accelerometer to measure linear acceleration (and, thus, its orientation relative to gravity) and a gyroscope to measure rotational changes.
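
Reading the IMU through the Sensor SDK looks roughly like the sketch below: the cameras must be started first, then k4a_device_start_imu() makes accelerometer and gyroscope samples available. This is a minimal illustration with error handling trimmed.

```c
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (K4A_FAILED(k4a_device_open(K4A_DEVICE_DEFAULT, &device))) return 1;

    // The IMU can only run while at least one camera is running.
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_2X2BINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    if (K4A_FAILED(k4a_device_start_cameras(device, &config))) return 1;

    if (K4A_SUCCEEDED(k4a_device_start_imu(device)))
    {
        k4a_imu_sample_t sample;
        if (k4a_device_get_imu_sample(device, &sample, 1000) == K4A_WAIT_RESULT_SUCCEEDED)
        {
            // Accelerometer readings are in meters/second^2, gyroscope in radians/second.
            printf("Accel: %.2f %.2f %.2f m/s^2\n",
                   sample.acc_sample.xyz.x, sample.acc_sample.xyz.y, sample.acc_sample.xyz.z);
            printf("Gyro:  %.2f %.2f %.2f rad/s\n",
                   sample.gyro_sample.xyz.x, sample.gyro_sample.xyz.y, sample.gyro_sample.xyz.z);
        }
        k4a_device_stop_imu(device);
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```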

The External Sync Pins

Last but not least, the External Sync Pins allow you to connect multiple Kinect devices to a single computer and synchronize their data properly. That’s particularly useful if you are developing 3D scanning applications.
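
Configuring that synchronization happens in software, through the device configuration’s wired sync mode. The sketch below is a minimal illustration: one device is configured as the master (driving the “Sync out” pin) and the others as subordinates (listening on “Sync in”), with a small delay so their infrared emitters do not interfere. The is_master flag and the 160 µs offset are illustrative choices for this example.

```c
#include <k4a/k4a.h>

// Build a device configuration for multi-device synchronization
// over the 3.5 mm sync cables.
k4a_device_configuration_t make_sync_config(int is_master)
{
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode       = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.color_resolution = K4A_COLOR_RESOLUTION_720P;
    config.color_format     = K4A_IMAGE_FORMAT_COLOR_BGRA32;
    config.camera_fps       = K4A_FRAMES_PER_SECOND_30;

    if (is_master)
    {
        // The master emits the sync signal on its "Sync out" pin.
        config.wired_sync_mode = K4A_WIRED_SYNC_MODE_MASTER;
    }
    else
    {
        // Subordinates listen on "Sync in"; a small per-device offset keeps
        // the infrared emitters from interfering with each other.
        config.wired_sync_mode = K4A_WIRED_SYNC_MODE_SUBORDINATE;
        config.subordinate_delay_off_master_usec = 160; // example offset
    }
    return config;
}
```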

Summary

In this Masterclass, we explored the Azure Kinect hardware components. Now, head to my developer tutorials to get started with Azure Kinect development.

Copyright notice

The photos above have been captured by Vangos Pterneas on behalf of LightBuzz. You may use the photos as long as you credit the source. Thank you for playing fair 🙂

Before you go…

As an award-winning Kinect MVP, I have been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to get your business to the next level, get in touch with me.

Sharing is caring!

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep Kinecting!

Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.
