
NOTE: This is preliminary software and/or hardware and APIs are preliminary and subject to change.

Kinect is awesome for body tracking, but Kinect version 2 offers more than the positions of the human joints. Today we’ll see how we can track the hands and thumbs and determine the state of each hand. In this article, we’ll extend the functionality we built in the previous blog posts, so you might want to review them before going on.

Download the source code of this example

Requirements

Video

Here is a brief video I created demonstrating the functionality we are going to examine (kudos to my young sister).
Video: https://www.youtube.com/watch?v=zM6Xcuja04I

Hand tracking

Finding the position of a hand or thumb is just like finding the position of every other joint. Remember the snippet we used before? Here’s how you can get the position of each hand:


void Reader_MultiSourceFrameArrived(object sender, MultiSourceFrameArrivedEventArgs e)
{
    var reference = e.FrameReference.AcquireFrame();
    // Color
    // Display the color stream...
    // Body
    using (var frame = reference.BodyFrameReference.AcquireFrame())
    {
        if (frame != null)
        {
            canvas.Children.Clear();
            _bodies = new Body[frame.BodyFrameSource.BodyCount];
            frame.GetAndRefreshBodyData(_bodies);
            foreach (var body in _bodies)
            {
                if (body != null)
                {
                    if (body.IsTracked)
                    {
                        // Find the joints
                        Joint handRight = body.Joints[JointType.HandRight];
                        Joint thumbRight = body.Joints[JointType.ThumbRight];
                        Joint handLeft = body.Joints[JointType.HandLeft];
                        Joint thumbLeft = body.Joints[JointType.ThumbLeft];
                    }
                }
            }
        }
    }
}

Displaying these joints is easy. Simply call the DrawPoint method we created before (a sketch of such an extension method follows the snippet). In today’s example, I have slightly modified the source code to draw different colors:


// Find the joints
Joint handRight = body.Joints[JointType.HandRight];
Joint thumbRight = body.Joints[JointType.ThumbRight];
Joint handLeft = body.Joints[JointType.HandLeft];
Joint thumbLeft = body.Joints[JointType.ThumbLeft];
// Draw hands and thumbs
canvas.DrawPoint(handRight);
canvas.DrawPoint(handLeft);
canvas.DrawPoint(thumbRight);
canvas.DrawPoint(thumbLeft);
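
For reference, here is a minimal sketch of what such a DrawPoint extension method could look like. The class name and the naive camera-space-to-pixel scaling below are illustrative assumptions; the downloadable source uses a CoordinateMapper-based Scale helper to place the points accurately.


using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Shapes;
using Microsoft.Kinect;

public static class CanvasExtensions
{
    // Illustrative sketch only: the downloadable sample maps the joint through
    // a CoordinateMapper-based Scale helper instead of the naive scaling below.
    public static void DrawPoint(this Canvas canvas, Joint joint)
    {
        if (joint.TrackingState == TrackingState.NotTracked) return;

        // Camera space is measured in meters; this assumes the body stays
        // roughly within -1..+1 meters of the sensor's axis.
        double x = (joint.Position.X + 1.0) / 2.0 * canvas.ActualWidth;
        double y = (1.0 - (joint.Position.Y + 1.0) / 2.0) * canvas.ActualHeight;

        Ellipse ellipse = new Ellipse
        {
            Width = 20,
            Height = 20,
            Fill = new SolidColorBrush(Colors.LightBlue)
        };

        Canvas.SetLeft(ellipse, x - ellipse.Width / 2.0);
        Canvas.SetTop(ellipse, y - ellipse.Height / 2.0);

        canvas.Children.Add(ellipse);
    }
}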

Hand states

Currently, Kinect supports the following hand states. They are members of the HandState enum:

  • Open
  • Closed
  • Lasso
  • Unknown
  • NotTracked

Note that the hand states are exposed as properties of the Body object itself, not of the corresponding hand joints. Here’s how to access them:


if (body.IsTracked)
{
    // Find the right hand state
    switch (body.HandRightState)
    {
        case HandState.Open:
            break;
        case HandState.Closed:
            break;
        case HandState.Lasso:
            break;
        case HandState.Unknown:
            break;
        case HandState.NotTracked:
            break;
        default:
            break;
    }
    // Find the left hand state
    switch (body.HandLeftState)
    {
        case HandState.Open:
            break;
        case HandState.Closed:
            break;
        case HandState.Lasso:
            break;
        case HandState.Unknown:
            break;
        case HandState.NotTracked:
            break;
        default:
            break;
    }
}

Can’t be any easier!

After that, I simply created some <TextBlock> elements (named tblRightHandState and tblLeftHandState respectively) and updated them with the corresponding hand state. Here’s the complete source code:


using (var frame = reference.BodyFrameReference.AcquireFrame())
{
    if (frame != null)
    {
        canvas.Children.Clear();
        _bodies = new Body[frame.BodyFrameSource.BodyCount];
        frame.GetAndRefreshBodyData(_bodies);
        foreach (var body in _bodies)
        {
            if (body != null)
            {
                if (body.IsTracked)
                {
                    // Find the joints
                    Joint handRight = body.Joints[JointType.HandRight];
                    Joint thumbRight = body.Joints[JointType.ThumbRight];
                    Joint handLeft = body.Joints[JointType.HandLeft];
                    Joint thumbLeft = body.Joints[JointType.ThumbLeft];
                    // Draw hands and thumbs
                    canvas.DrawPoint(handRight);
                    canvas.DrawPoint(handLeft);
                    canvas.DrawPoint(thumbRight);
                    canvas.DrawPoint(thumbLeft);
                    // Find the hand states
                    string rightHandState = "-";
                    string leftHandState = "-";
                    switch (body.HandRightState)
                    {
                        case HandState.Open:
                            rightHandState = "Open";
                            break;
                        case HandState.Closed:
                            rightHandState = "Closed";
                            break;
                        case HandState.Lasso:
                            rightHandState = "Lasso";
                            break;
                        case HandState.Unknown:
                            rightHandState = "Unknown...";
                            break;
                        case HandState.NotTracked:
                            rightHandState = "Not tracked";
                            break;
                        default:
                            break;
                    }
                    switch (body.HandLeftState)
                    {
                        case HandState.Open:
                            leftHandState = "Open";
                            break;
                        case HandState.Closed:
                            leftHandState = "Closed";
                            break;
                        case HandState.Lasso:
                            leftHandState = "Lasso";
                            break;
                        case HandState.Unknown:
                            leftHandState = "Unknown...";
                            break;
                        case HandState.NotTracked:
                            leftHandState = "Not tracked";
                            break;
                        default:
                            break;
                    }
                    tblRightHandState.Text = rightHandState;
                    tblLeftHandState.Text = leftHandState;
                }
            }
        }
    }
}

What’s next?

We are expecting a new SDK update from Microsoft in the next few days, so stay tuned for more great stuff like facial expressions.

PS: New Kinect book – 20% off

Well, I am publishing a new ebook about Kinect development in a couple of months. It is an in-depth guide to Kinect, written in simple language with step-by-step examples. You’ll learn usability tips, performance tricks, and best practices for implementing robust Kinect apps. Please meet Kinect Essentials, the essence of my 3 years of teaching, writing, and developing for the Kinect platform. Oh, did I mention that you’ll get a 20% discount if you simply subscribe now? Hurry up 😉


Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.

84 Comments

  • Nelson says:

    I tried your code but I got some errors.

    I have Kinect v2 sensor for Windows and SDK 2.0

    When I open the project with Microsoft Visual Studio 2013 it shows me two errors.

    1. At the Window_Loaded function, at “_sensor = KinectSensor.Default;”, it says that “Default” does not exist, so I changed it to the “GetDefault()” method.
    2. At “body.Dispose();” it says that body doesn’t have a Dispose method, so I commented it out.

    Then I ran the project and I got a runtime error saying that it can’t execute Process.Start(@”C:\Windows\System32\KinectService.exe”); because I don’t have that binary. I looked through my Kinect directories and found the “KinectManagementService.exe” file at “C:\Program Files\Microsoft Kinect Drivers\Service”, so I replaced it. Now it runs, but the hand states only say “Unknown” or “Not tracked” and the circles around my hands are not on my hands.

    Let me show you:
    http://puu.sh/cIqZJ/b296a5e1f7.jpg

    • Hi Nelson. This code was written using the preliminary software (beta version of the SDK and early hardware). Thank you for noting this. I will update all of my repositories by Monday. For the record, you do not need to launch KinectService any more. You can delete that line of code.

      Cheers,
      Vangos

  • Marina says:

    Greetings!
    Did you have a chance to update the code? I downloaded it a couple of times and it still has those dead methods.
    Thanks

  • Shawn says:

    Hi! Love your work! If I wanted to do a check for the hand state, something like an if statement:
    if (handState == HandState.Open)
    { // do something here }
    is that possible?

  • eple says:

    Hi, thanks for your work!
    I have a Kinect v2 sensor for Windows, SDK 2.0, and Microsoft Visual Studio 2012.
    I tried your code and found that the circles around my hands are not on my hands.
    Let me show you:
    http://imgsrc.baidu.com/forum/pic/item/ef9caa0a19d8bc3e89adf584818ba61ea9d345f0.jpg

  • Deepak Kumar says:

    Hey,
    how can I count the number of times the hand closes? I want to detect when a person closes his hand 2 times, for a double click.
    Since HandState.Closed is an enum value of 3,
    how can I count 3 appearing 2 times for my double click?

    thanks!!

    • Hi Deepak,

      That’s a little tricky. A double-click gesture occurs in approximately half a second (that is, about 15 Kinect frames). So, you need a frame counter that is reset to zero after the 15th frame. You’ll need another counter that counts the number of closed hand states within the 15-frame period. The closed states should not be consecutive, I guess: you are searching for the pattern Open-Closed-Open-Closed within the 15-frame period (see the sketch below).

      Let me know if that helped you 🙂
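
      A rough sketch of that counting approach, to be called from the body frame handler. The field names and the 15-frame threshold are illustrative assumptions, not part of the original sample:

      // Illustrative sketch: detect a "double close" of the right hand
      // within roughly half a second (about 15 frames at 30 fps).
      private int _framesSinceFirstClose = 0;
      private int _closeCount = 0;
      private HandState _previousRightHandState = HandState.Unknown;

      private void DetectDoubleClose(Body body)
      {
          // Count Open -> Closed transitions only, not every Closed frame.
          if (body.HandRightState == HandState.Closed &&
              _previousRightHandState == HandState.Open)
          {
              _closeCount++;
          }

          _previousRightHandState = body.HandRightState;

          if (_closeCount == 0) return;

          _framesSinceFirstClose++;

          if (_closeCount >= 2)
          {
              // Two closes within the window: treat it as a "double click".
              _closeCount = 0;
              _framesSinceFirstClose = 0;
              // React to the double click here.
          }
          else if (_framesSinceFirstClose > 15)
          {
              // Window expired (about half a second at 30 fps): reset.
              _closeCount = 0;
              _framesSinceFirstClose = 0;
          }
      }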

  • vinayak says:

    Hello Vangos Pterneas,
    this is a really awesome blog for Kinect and I learnt a lot from it. But when I was running the two-hand tracking code, it was showing the data late: after closing and opening the hand it should update immediately, right? But after closing, it waits 3-5 seconds before showing Closed, and I don’t know what the problem is. When I ran the Configuration Verifier, the last step (color and depth) showed a warning. Is that the issue, or is it my machine? It is an i3 with Intel HD family graphics; I know that’s not in the recommended specs, but it worked for 10 days with no issue and now I am getting this. Please help me figure out what the problem is. What’s your machine config for two-hand tracking?
    Thanks a lot

    • Hi vinayak. I use an i5 processor with 8GB of RAM. Please run the Kinect Configuration Verifier from the SDK Browser to find out if your system is capable of running Kinect apps. Alternatively, try another USB 3 port.

      • vinayak says:

        Thanks, Vangos. I ran the Configuration Verifier: it showed a warning at the last step (“Verify Kinect depth and color streams”) and a warning at the USB controller. I am getting output, but after some 15 minutes that last check changes its status from warning to error and shows a red mark. So I think it’s because of the USB and the i3; I will check on an i5. Anyway, thanks a lot for your valuable feedback, I will try it. And when can I expect the Kinect book from you?

  • Arief Setiabudi says:

    Hi… what a nice piece of code!
    But after I tried it, something is missing… the DrawHand and DrawThumb calls don’t work.
    The program runs well but does not draw any circles on my hands. Do you know what’s wrong with this?
    Thanks xD

  • omar ahmed says:

    Hello,
    can I use the Kinect 360 camera in the project instead of the Kinect for Windows v2?
    And how would I do that part?
    Thank you very much

  • TC says:

    Hi,

    Thanks a lot for your work; however, I am facing a problem. The problem is that while the circle does get drawn, it is not drawn over my hand. Instead it stays in the top-left corner (0,0), so I’m guessing that it’s not getting updated properly. Would you mind telling me what is going on or what the problem is? I would like it to be in the center of my hand (which is being tracked fine, because the state of the hand gets updated immediately) and to follow my hand as I move it.

    The code can detect my hand state and show it properly; it’s just that the circle is always grey and stuck in the top-left corner. I have updated my NVIDIA GT740M driver but this doesn’t help.

    Thanks a lot

  • Rashmo says:

    Great work! It helped me a lot to get started with Kinect. I was looking everywhere for something like this. Keep up the good work. Thank you

  • Arslan says:

    Hi Vangos,
    I downloaded the code and ran it in Visual Studio 2013. It works fine but does not draw the circles on the hands;
    the circles remain at the (0,0) location. Kindly guide me on this.

    • Hi Arslan. Are you sure the hands are visible? Please use a Breakpoint to check the values (X-Y-Z) of each hand. If the values are zeros, it means that Kinect does not recognize your hands. Otherwise, it’s an issue about displaying the circles. Let me know 🙂

  • CC says:

    Hi Vangos,
    I encountered the same problem: the circle stays at (0,0) all the time. I think the system has detected my hands, because
    it always reports the right state of my hands. Maybe there is something wrong in the mapper or the Scale method. Can you
    check it out? Thanks a lot.

  • CC says:

    Hi Vangos,
    I found a typo in the Scale function of the downloaded file.
    point.X *= float.IsInfinity(colorPoint.X) ? 0.0 : colorPoint.X;
    point.Y *= float.IsInfinity(colorPoint.Y) ? 0.0 : colorPoint.Y;
    should be
    point.X = float.IsInfinity(colorPoint.X) ? 0.0 : colorPoint.X;
    point.Y = float.IsInfinity(colorPoint.Y) ? 0.0 : colorPoint.Y;
    This is the reason the circle sticks at (0,0).

    anyway, I want to say “Good job” to you for this work.

  • Anna says:

    Hi Vangos,
    Thank you very much for the demo. I’m using it as a foundation to build my project, but I have an issue. Is it possible to draw in a particular area? My drawing area is at the center of the screen, on a canvas with the proportions of the depth camera. No matter what, the hands are also drawn in the background, with an offset to the right. I can’t figure out what I’m doing wrong.
    Thanks in advance!

  • Abdullah says:

    Hi there,

    I have downloaded the source code and ran the project, and it’s working without a problem!

    I have tried building on this: I want to trace the position of just my right hand, so that every time I move it, a trail of ellipses is left behind, allowing me to draw. I have looked at your KinectPaint project as well and used a method like the one below, but I am not quite sure how to leave the ellipses behind as a trail rather than drawing one only at the current joint position:

    public static void DrawTrace(this Canvas canvas, Joint thumb, CoordinateMapper mapper)
    {
        if (thumb.TrackingState == TrackingState.NotTracked) return;

        Point point = thumb.Scale(mapper);

        Ellipse ellipse = new Ellipse
        {
            Width = 40,
            Height = 40,
            Fill = new SolidColorBrush(Colors.DarkRed),
            Opacity = 0.7
        };

        Canvas.SetLeft(ellipse, point.X - ellipse.Width / 2);
        Canvas.SetTop(ellipse, point.Y - ellipse.Height / 2);

        canvas.Children.Add(ellipse);
    }

    Can you please help me out? I would really appreciate it. I am just trying to trace the thumb joint position to draw on the canvas of your application.

    • Hi Abdullah. That’s a great question.

      XAML Canvas is great for drawing and contains a lot of drawing components. If you want to display the trail more accurately, you’d better use a Polyline control, which is ideal for drawing a trail (see the sketch below).

      I wrote the source code, which I’m going to share in a new blog post. Check it out on GitHub:
      Kinect Drawing

      Cheers,
      Vangos
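
      For example, a minimal Polyline-based trail could look like the following sketch. The _trail field and the AddTrailPoint method are assumptions; the Scale helper is the CoordinateMapper-based one from the previous posts, and the complete implementation lives in the Kinect Drawing repository mentioned above:

      // Sketch only. Requires: System.Windows, System.Windows.Controls,
      // System.Windows.Media, System.Windows.Shapes, Microsoft.Kinect.
      private readonly Polyline _trail = new Polyline
      {
          Stroke = new SolidColorBrush(Colors.DarkRed),
          StrokeThickness = 5.0
      };

      private void AddTrailPoint(Canvas canvas, Joint thumb, CoordinateMapper mapper)
      {
          if (thumb.TrackingState == TrackingState.NotTracked) return;

          // Keep the polyline on the canvas (the sample clears the canvas every frame).
          if (!canvas.Children.Contains(_trail))
          {
              canvas.Children.Add(_trail);
          }

          // Scale is the CoordinateMapper-based helper from the previous posts.
          Point point = thumb.Scale(mapper);

          _trail.Points.Add(point);
      }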

      • Abdullah says:

        Yes I have looked at Polyline and was very intrigued.

        Vangos, thank you so much! I have just downloaded the source code and I will look through it tomorrow, but from the looks of your YouTube video, it seems to be exactly what I wanted to know.

        Really appreciate it and loving your work and blog!

        Cheers!

  • mugame says:

    Hi there!
    I have a question: I have a Kinect 360. Is it possible to run your project?

    • Hello. You can migrate this code to support Kinect 360 and SDK 1.5. You’ll need to change the name of the classes to the corresponding SDK 1.5 classes, but the concept remains the same. Kinect 360 has no support for hand states, though, so Open, Closed, and Lasso will not be available.

  • Carlos says:

    Hi Vangos,

    Excellent code, works flawlessly!

    Just curious: I am trying to set up a system that tracks hands and draws the ellipses only when the hands are within 1.25m to 1.8m of distance from the depth camera, and does nothing when the hands aren’t in that range. I understand I will have to use the Z values, but can you suggest how I can set this range filter up?

    Thanks a lot!

    • Hi Carlos. Thanks a lot for your comment. This is how you can control the distance:

      Joint hand = body.Joints[JointType.HandLeft];

      if (hand.Position.Z > 1.25f && hand.Position.Z < 1.8f) { // Do something }

  • Neven says:

    Hi
    I want to do something when both hand states are closed, and I have not been able to do so using an if statement.
    This is what I want to do:

    if (rightHand.Closed && leftHand.Closed)
    {
        doSomething
    }
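
    For reference, both hand states are exposed directly on the Body object described in the article, so the check is roughly (sketch only):

    // Sketch: place inside the body loop, where "body" is a tracked Body.
    if (body.HandRightState == HandState.Closed &&
        body.HandLeftState == HandState.Closed)
    {
        // Both hands are closed: do something here.
    }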

  • Sriraman says:

    Hey Vangos ,

    I am very new to Kinect programming. I started off with an air-drumming project using Kinect v2. I got the source code from the net but am not able to run it on this version, so can you help me with this?

  • Miu says:

    Hi Vangos,
    Thank you very much for the demo.
    This is very helpful to my research.
    But I don’t understand the principle behind the hand states.
    Could you explain it to me? Thank you!

  • Florian says:

    Hi Vangos,

    I have been following your work since Kinect v1. I really like it! Thanks a lot for everything! 🙂
    I am having a few issues with hand tracking. I am trying to reproduce my wrist moves using an avatar.
    I find the result very unstable and inaccurate! Detecting hand states is ok but the hand joint orientations are really not clean.
    So my questions are :
    1/ Do you also have these issues?
    2/ Is there a way to solve them? Using some filtering would probably improve things, but honestly I do not think that would be enough.
    3/ Do you have any sample code or advice?

    Thanks a lot again!

    • Hi Florian. Thank you very much for your comment. Regarding your questions:

      Wrist rotation is not accurate. I am having these issues, too. I have partially solved them by adding more constraints to the avatar and by checking the position/rotation of the neighboring joints. For example, if I’m focused on the wrist movement, I set the wrist joints static or force them to follow the movements of the elbow. If you need 100% accuracy, though, you’d better use Leap Motion.

  • Hanan says:

    hello Mr.Pterneas,

    I am starting work on the XAML file… do you suggest any simple or clear resources that can help me build my XAML file?

    Thanks

  • Hanan says:

    hello Mr.Pterneas,

    I have an idea to detect an object or shape (a rectangle, for example). What is the simplest method I can use with Kinect?

    thanks

  • hanaan says:

    hello Mr.Pterneas,

    can you please help me with this point:

    I have finished writing the code of my project (I mean the CS file) and I need to start working on the XAML file to get the interface result…

    My question: is there another way I can use to get my result instead of the XAML file? (I need your suggestion about the simplest and easiest way.)

    Thank you

  • Andrew Ting says:

    Hello Mr. Pterneas

    Other than the fingertips, I need to track the joints as well. Is there a way to do it?

    Thank you.

  • khalifa intissar says:

    Hi, I already sent an email to you.
    I am working on psychology recognition based on body gestures. As a first step, I will focus on hand gestures and their interpretation, so can you help me please? I want to use Kinect with MATLAB 2016 if that is possible.
    I am waiting for your response, and thank you in advance.

  • Chaoying Xue says:

    Hi Vangos,

    Do you have any information on how the Kinect recognizes the five hand states? I am currently working on a project using the Kinect, and it depends mainly on recognizing the hand states. Due to equipment constraints, the sensor has to be set on a surface much taller than where the user stands and angled down; because of this setup (I think), the hand state recognition has a lot of noise (say I have my hands closed at all times; the color of my hand in Visual Studio would still jump between red and grey repeatedly). I am trying to come up with a way to reduce this “noise”, but I could not find any useful information. Any suggestions or advice would be greatly appreciated.

    Thank you.

    • Hello, Chaoying. Thanks for your message. The hand states are recognized internally by the SDK, so we do not have any inside information about their implementation.

  • NoWind says:

    Hi Vangos!
    Your blogs are very helpful! I am learning a lot. Thanks!
    I am developing a Kinect project in Unity3D. I want to use the hand to grab GameObjects with the “HandEventType.Grip” event everywhere. But I found that the HandEventType is not that accurate when the hand joint overlaps with another joint. For example, the HandState becomes Unknown when I grab an object below my chest. So I think “TrackingMode.SeatedMode”, which ignores the lower-body joints, could make the HandEventType more accurate. But I can’t find it in the “Kinect SDK for Unity”. How can I enable SeatedMode in Unity3D?

  • NoWind says:

    Sorry, I found that the problem is not caused by overlapping joints. The Unknown HandState is due to an obstacle behind the hand.
    Does that mean I must keep a distance from other things to ensure the accuracy of the grip event?

    Thank you!

  • RM says:

    Hey, this blog is amazing and has helped me get a really good start with Kinect. I just have one question, though: if I use the Visual Gesture Builder and build a database of gestures, how do I add them to the existing gestures that are being recognized by the code provided? The existing gestures are a part of the SDK, but I want to add more to them. Could you please help me?

  • Umair says:

    Sir, I am a student doing my final-year project using Kinect v2 to rotate, zoom in and out, highlight, and take screenshots of a 3D object.
    I just want a starting point that gives me the direction. Will this code help me perform the tasks I mentioned?
    Thank you

    • Sure, you can get started with hand tracking. Consider checking my gestures article, too.

      • Muhammad Ahmed Khan says:

        Sir, I got really stuck at rotating the 3D object in my WPF C# application using hand gestures through the Kinect v2 device, and I do not know where to start or how to implement it (I mean the logic part). If you could guide me and help me with how to do it, you would surely save my life.
        Please reply; I am so tired of trying to do it again and again without finding a solution.

        • You could measure the distance between the hands (between zero and arm length) and scale/rotate the 3D object according to that percentage. Zero would be the minimum and arm length would be the maximum (see the sketch below).
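
          A minimal sketch of that measurement. The helper name and the armLength value are assumptions; the joint positions come from the Joints dictionary shown in the article:

          // Sketch only. Requires: using System; using Microsoft.Kinect;
          private static double HandDistance(Body body)
          {
              CameraSpacePoint left = body.Joints[JointType.HandLeft].Position;
              CameraSpacePoint right = body.Joints[JointType.HandRight].Position;

              double dx = right.X - left.X;
              double dy = right.Y - left.Y;
              double dz = right.Z - left.Z;

              // Euclidean distance between the hands, in meters.
              return Math.Sqrt(dx * dx + dy * dy + dz * dz);
          }

          // Example: map the distance to a 0-1 percentage, where armLength (in meters)
          // is an assumed maximum measured for the current user.
          // double percentage = Math.Min(1.0, HandDistance(body) / armLength);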

  • 唐毓媛 says:

    Hello! I am working on a Kinect project. I want to ask: if I want to track the movement of the hand to control the flashing of an LED, how do I set it up?

    • Hello. First, you need to track the hand position as described in the article. Then, you’ll need to map a particular gesture or sequence of positions to a specific action (e.g. on or off).
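
      As a sketch, the on/off mapping could be as simple as toggling a flag on the Open-to-Closed transition of one hand. The field and method names below are illustrative assumptions:

      // Sketch: toggle an LED flag whenever the right hand goes from Open to Closed.
      private bool _ledOn = false;
      private HandState _previousState = HandState.Unknown;

      private void UpdateLed(Body body)
      {
          if (body.HandRightState == HandState.Closed &&
              _previousState == HandState.Open)
          {
              _ledOn = !_ledOn;
              // Send the new on/off state to the LED controller here.
          }

          _previousState = body.HandRightState;
      }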

  • 唐毓媛 says:

    Hi, I can do it by tracking open and closed now. I want to base it on the height of the hand. How can I work with the Y-axis in my code? Thanks

    • Hello. You can check the height of the hand by measuring the Y position of the Hand joint in the 3D space. Experiment with the Y values to see when it’s raised:


      float y = body.Joints[JointType.HandLeft].Position.Y;

  • Muhammad Ahmed Khan says:

    Sir, how could I grab that 3D object on the screen and rotate it using a hand gesture? That is the main problem. Please answer that as well; it will be really helpful.
    Thanks

    • Hi Muhammad. Depending on the platform you are using, rotating a 3D object may differ. If you are using Unity3D, you can rotate any 3D object by changing the Rotation properties in the X, Y, or Z axis of the GameObject.

  • Jason says:

    Hello Vangos,

    This blog is amazing and has helped me get a really good start with Kinect. I just have one question: can I implement more gestures for recognition without the Visual Gesture Builder? How can I add them?

    Thank you.

  • Kevin says:

    Hi Vangos,

    Kindly where can I find an in-depth book about Kinect?

    Regards,
    Kevin
