Since the new Azure Kinect device came out, I have been receiving one question over and over: how do you animate an avatar in Unity3D?
Today, I am going to show you how to add avateering into your Kinect applications using a few lines of C# code!
Prerequisites
To run the demos, you need a computer with the following specifications:
- 7th Gen Intel® Core™ i5 Processor (Quad-Core 2.4 GHz or faster)
- 4 GB Memory
- NVIDIA GeForce GTX 1070 or better
- Dedicated USB3 port
- Windows 10
To write and execute code, you need to install the following software:
- Unity3D
- The Azure Kinect SDK for Unity3D
Azure Kinect Avateering step-by-step
Before even writing a single line of code, we need to prepare a 3D humanoid avatar to animate. Humanoid models contain built-in information about joints and bones.
Humanoid model structure
A humanoid model should have a specific joint and bone structure. The Unity Engine already defines this hierarchy of joints: a structure of interconnected bones that forms the human skeleton.
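For reference, here is a simplified outline of the core humanoid bone hierarchy (finger, toe, and twist bones are omitted, and exact names vary from rig to rig):

Hips
├── Spine
│   └── Chest
│       ├── Neck → Head
│       ├── LeftShoulder → LeftUpperArm → LeftLowerArm → LeftHand
│       └── RightShoulder → RightUpperArm → RightLowerArm → RightHand
├── LeftUpperLeg → LeftLowerLeg → LeftFoot
└── RightUpperLeg → RightLowerLeg → RightFoot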
Commonly used models, such as Unity Chan, follow these principles. Your models should follow the exact same hierarchy, too.
If you open the avatar file in the Editor, you can expand it to inspect its joint/bone structure.
You can read more about configuring your avatar in Unity's documentation.
You or your artist can design humanoid avatars in programs such as Blender or 3ds Max. However, in this tutorial, I am going to use the free humanoid avatars from LightBuzz. Personally, I like the Codeman avatar (Codeman looks like Iron Man, but he’s a nerd programmer 😉 ).
So, after downloading the 3D model, do the following:
- Open Unity3D and import the Azure Kinect SDK.
- Create a new scene or use one of the demo ones.
- Drag-and-drop the avatar in your Unity3D Project folder.
- Finally, drag-and-drop the .fbx file from the Project folder to your scene.
Your avatar should now appear in the scene.
Step 1 – Connect the visual elements
Upon creating your scene and its core elements, it’s time to connect the avatar with the tracked body. Create a C# MonoBehaviour script and import the Azure Kinect SDK namespace:
using LightBuzz.Kinect4Azure;
The Azure Kinect SDK for Unity3D includes its own Avatar class. To use that class without clashing with Unity’s built-in UnityEngine.Avatar type, add an alias:
using Avatar = LightBuzz.Kinect4Azure.Avateering.Avatar;
Then, create an Avatar member that will be visible in the Editor:
[SerializeField] private Avatar avatar;
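Putting these pieces together, the script skeleton so far looks like this (the class name AvatarController is my own choice):

using UnityEngine;
using LightBuzz.Kinect4Azure;
using Avatar = LightBuzz.Kinect4Azure.Avateering.Avatar;

public class AvatarController : MonoBehaviour
{
    // Assigned via drag-and-drop in the Editor.
    [SerializeField] private Avatar avatar;
}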
Remember to drag-and-drop your Codeman onto the Avatar Root element in the Editor.
Step 2 – Configure the Kinect device
It’s now time to configure the Azure Kinect device. As described in my first Azure Kinect article, starting and stopping the device is straightforward. Simply create a KinectSensor reference and use the Start() and OnDestroy() methods to open and close it, respectively:
private KinectSensor sensor;

private void Start()
{
    // Grab the default Azure Kinect device and open the connection.
    sensor = KinectSensor.GetDefault();
    sensor?.Open();
}

private void OnDestroy()
{
    // Close the connection when the scene is destroyed.
    sensor?.Close();
}
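If you prefer a friendlier failure mode when no device is connected, you can guard the calls explicitly. This is a small variation of the snippet above, assuming GetDefault() returns null when no sensor is present (which is why the null-conditional operators are there in the first place):

private void Start()
{
    sensor = KinectSensor.GetDefault();

    if (sensor == null)
    {
        // No Azure Kinect device detected; bail out early.
        Debug.LogError("No Azure Kinect device was found!");
        return;
    }

    sensor.Open();
}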
Step 3 – Update the Avatar
Finally, it’s time for the actual avateering part! Head to the Update method and grab the latest Kinect frame. Then, use the BodyFrameSource to acquire the closest skeleton Body object. Lastly, feed the Avatar reference with the skeleton data by calling its own Update method.
private void Update()
{
    // Grab the latest frame from the sensor.
    Frame frame = sensor?.Update();

    if (frame != null)
    {
        // Acquire the skeleton closest to the camera.
        Body body = frame.BodyFrameSource.Bodies.Closest();

        // Feed the avatar with the tracked skeleton data.
        avatar.Update(body);
    }
}
That’s right! All of the avateering magic happens with one line of C# code:
avatar.Update(body);
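Under the hood, avateering boils down to rotating the avatar’s bone transforms so they match the orientations of the tracked joints. Purely as a conceptual illustration (this helper is my own sketch, not part of the SDK), mapping a single joint onto a humanoid bone with Unity’s Animator API could look like this:

using UnityEngine;

public static class AvateeringConcept
{
    // Conceptual sketch: rotate one humanoid bone to match a tracked joint.
    // 'jointRotation' stands in for the orientation you would read from the Body.
    public static void ApplyBone(Animator animator, HumanBodyBones bone, Quaternion jointRotation)
    {
        Transform boneTransform = animator.GetBoneTransform(bone);

        if (boneTransform != null)
        {
            boneTransform.rotation = jointRotation;
        }
    }
}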
Source code
You’ve made it to this point? Awesome! In this article, you’ve learnt how to develop your own avateering applications using the Azure Kinect sensor and Unity3D. The complete functionality, including demo scenes, comes right out of the box with the Azure Kinect SDK for Unity3D. Avateering also powers cool use cases, such as virtual dressing rooms!
Before you go…
As a professional software developer & consultant, I have been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to create your next Motion Analysis project, get in touch with me.
Sharing is caring!
If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep coding!
Comments

What could be the reason the avatar shakes a lot when animating? Is there a way to stabilize the shaking? It seems as if the avatar freezes for an instant every second.
Hi Grace. You can change the smoothing factor of the avatar to make the movements less jittery. Other than that, a good GPU is recommended by Microsoft.
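If you’d like to smooth the data yourself, a simple exponential filter over the raw joint positions also helps. Here is a minimal sketch (the JointSmoother class is my own illustration, not part of the SDK):

using UnityEngine;

// Exponential smoothing for a single joint position.
// Higher factors keep more of the previous value, damping jitter at the cost of latency.
public class JointSmoother
{
    private Vector3 smoothed;
    private bool initialized;

    public Vector3 Smooth(Vector3 raw, float factor = 0.5f)
    {
        if (!initialized)
        {
            smoothed = raw;
            initialized = true;
        }
        else
        {
            // Blend the new sample with the previous estimate.
            smoothed = Vector3.Lerp(raw, smoothed, factor);
        }

        return smoothed;
    }
}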
Hi Vangos! Amazing blog post!
Can the principles of rigging and 3D model usage be applied to ARKit, in order to map a 3D object/mesh (character) onto a human skeleton? I guess Blender should work fine for the rigging part, or should it be rigged within Unity?
Thank you very much and have a great day!
Hello Hans. Yes, the principles are the same regardless of the platform used. You only need an FBX model with the specified joint hierarchy (the same process will not work with USDZ model file formats).