NOTE: This is preliminary software and/or hardware and APIs are preliminary and subject to change.
Kinect is awesome for body tracking, but Kinect version 2 offers more than the positions of the human joints. Today we’ll see how to track hands and thumbs and determine the state of each hand. In this article, we’ll extend the functionality we built in the previous blog posts, so you might want to review them before going on:
- The color, depth and infrared blog post guides you through initializing the Kinect sensor and displaying its streams.
- The body tracking blog post shows you how to display the human body joints.
Download the source code of this example
Requirements
- Kinect for Windows v2
- Windows 8/8.1
- Visual Studio 2013
- USB 3.0 port
Video
Here is a brief video I created demonstrating the functionality we are going to examine (kudos to my younger sister).
https://www.youtube.com/watch?v=zM6Xcuja04I
Hand tracking
Finding the position of a hand or thumb is just like finding the position of any other joint. Remember the snippet we used before? Here’s how you can get the position of each hand:
void Reader_MultiSourceFrameArrived(object sender, MultiSourceFrameArrivedEventArgs e)
{
    var reference = e.FrameReference.AcquireFrame();

    // Color
    // Display the color stream...

    // Body
    using (var frame = reference.BodyFrameReference.AcquireFrame())
    {
        if (frame != null)
        {
            canvas.Children.Clear();

            _bodies = new Body[frame.BodyFrameSource.BodyCount];
            frame.GetAndRefreshBodyData(_bodies);

            foreach (var body in _bodies)
            {
                if (body != null)
                {
                    if (body.IsTracked)
                    {
                        // Find the joints
                        Joint handRight = body.Joints[JointType.HandRight];
                        Joint thumbRight = body.Joints[JointType.ThumbRight];
                        Joint handLeft = body.Joints[JointType.HandLeft];
                        Joint thumbLeft = body.Joints[JointType.ThumbLeft];
                    }
                }
            }
        }
    }
}
Displaying these joints is easy. Simply call the DrawPoint method we created before. In today’s example, I have slightly modified the source code to draw different colors:
// Find the joints
Joint handRight = body.Joints[JointType.HandRight];
Joint thumbRight = body.Joints[JointType.ThumbRight];
Joint handLeft = body.Joints[JointType.HandLeft];
Joint thumbLeft = body.Joints[JointType.ThumbLeft];
// Draw hands and thumbs
canvas.DrawPoint(handRight);
canvas.DrawPoint(handLeft);
canvas.DrawPoint(thumbRight);
canvas.DrawPoint(thumbLeft);
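In case you skipped the previous post, here’s a simplified sketch of what such an extension method looks like. This is not the exact code from the body tracking post: the overload below takes the sensor’s CoordinateMapper explicitly, and the size and color values are arbitrary.
public static void DrawPoint(this Canvas canvas, Joint joint, CoordinateMapper mapper)
{
    if (joint.TrackingState == TrackingState.NotTracked) return;

    // Map the 3D camera-space position to 2D color-space coordinates.
    ColorSpacePoint point = mapper.MapCameraPointToColorSpace(joint.Position);

    Ellipse ellipse = new Ellipse
    {
        Width = 20,
        Height = 20,
        Fill = new SolidColorBrush(Colors.LightBlue)
    };

    Canvas.SetLeft(ellipse, point.X - ellipse.Width / 2);
    Canvas.SetTop(ellipse, point.Y - ellipse.Height / 2);

    canvas.Children.Add(ellipse);
}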
Hand states
Currently, Kinect supports the following hand states. They are members of the HandState enum:
- Open
- Closed
- Lasso
- Unknown
- NotTracked
Note that the hand states are exposed by the body object itself, not by the corresponding hand joints. Here’s how to access the states:
if (body.IsTracked)
{
    // Find the right hand state
    switch (body.HandRightState)
    {
        case HandState.Open:
            break;
        case HandState.Closed:
            break;
        case HandState.Lasso:
            break;
        case HandState.Unknown:
            break;
        case HandState.NotTracked:
            break;
        default:
            break;
    }

    // Find the left hand state
    switch (body.HandLeftState)
    {
        case HandState.Open:
            break;
        case HandState.Closed:
            break;
        case HandState.Lasso:
            break;
        case HandState.Unknown:
            break;
        case HandState.NotTracked:
            break;
        default:
            break;
    }
}
Can’t be any easier!
After that, I simply created two <TextBlock> elements (named tblRightHandState and tblLeftHandState respectively) and updated them with each hand’s state. Here’s the complete source code:
using (var frame = reference.BodyFrameReference.AcquireFrame())
{
    if (frame != null)
    {
        canvas.Children.Clear();

        _bodies = new Body[frame.BodyFrameSource.BodyCount];
        frame.GetAndRefreshBodyData(_bodies);

        foreach (var body in _bodies)
        {
            if (body != null)
            {
                if (body.IsTracked)
                {
                    // Find the joints
                    Joint handRight = body.Joints[JointType.HandRight];
                    Joint thumbRight = body.Joints[JointType.ThumbRight];
                    Joint handLeft = body.Joints[JointType.HandLeft];
                    Joint thumbLeft = body.Joints[JointType.ThumbLeft];

                    // Draw hands and thumbs
                    canvas.DrawPoint(handRight);
                    canvas.DrawPoint(handLeft);
                    canvas.DrawPoint(thumbRight);
                    canvas.DrawPoint(thumbLeft);

                    // Find the hand states
                    string rightHandState = "-";
                    string leftHandState = "-";

                    switch (body.HandRightState)
                    {
                        case HandState.Open:
                            rightHandState = "Open";
                            break;
                        case HandState.Closed:
                            rightHandState = "Closed";
                            break;
                        case HandState.Lasso:
                            rightHandState = "Lasso";
                            break;
                        case HandState.Unknown:
                            rightHandState = "Unknown...";
                            break;
                        case HandState.NotTracked:
                            rightHandState = "Not tracked";
                            break;
                        default:
                            break;
                    }

                    switch (body.HandLeftState)
                    {
                        case HandState.Open:
                            leftHandState = "Open";
                            break;
                        case HandState.Closed:
                            leftHandState = "Closed";
                            break;
                        case HandState.Lasso:
                            leftHandState = "Lasso";
                            break;
                        case HandState.Unknown:
                            leftHandState = "Unknown...";
                            break;
                        case HandState.NotTracked:
                            leftHandState = "Not tracked";
                            break;
                        default:
                            break;
                    }

                    tblRightHandState.Text = rightHandState;
                    tblLeftHandState.Text = leftHandState;
                }
            }
        }
    }
}
What’s next?
We are expecting a new SDK update from Microsoft in the next few days, so stay tuned for more great stuff like facial expressions.
PS 2: New Kinect book – 20% off
Well, I am publishing a new ebook about Kinect development in a couple of months. It is an in-depth guide about Kinect, using simple language and step-by-step examples. You’ll learn usability tips, performance tricks, and best practices for implementing robust Kinect apps. Please meet Kinect Essentials, the essence of my 3 years of teaching, writing, and developing for the Kinect platform. Oh, did I mention that you’ll get a 20% discount if you simply subscribe now? Hurry up 😉
Comments
I tried your code but I got some errors.
I have a Kinect v2 sensor for Windows and SDK 2.0.
When I open the project with my Microsoft Visual Studio 2013, it shows me two errors:
1. In the Window_Loaded function, at “_sensor = KinectSensor.Default;”, it says that “Default” does not exist, so I changed it to the “GetDefault()” method.
2. At “body.Dispose();” it says that body doesn’t have a Dispose method, so I commented it out.
Then I ran the project and got a runtime error saying that it can’t execute Process.Start(@”C:\Windows\System32\KinectService.exe”); because I don’t have that binary. I looked through my Kinect directories and found “KinectManagementService.exe” at “C:\Program Files\Microsoft Kinect Drivers\Service”, so I replaced it. Now it runs, but the hand states only say “Unknown” or “Not tracked”, and the circles around my hands are not on my hands.
Let me show you:
http://puu.sh/cIqZJ/b296a5e1f7.jpg
Hi Nelson. This code was written using the preliminary software (beta version of the SDK and early hardware). Thank you for noting this. I will update all of my repositories by Monday. For the record, you do not need to launch KinectService any more. You can delete that line of code.
Cheers,
Vangos
Greetings!
Did you have a chance to update the code? I downloaded it a couple of times, and it still has those dead methods.
Thanks
Hey, I have already updated the code in this blog post. Try refreshing the page (F5 or Ctrl-F5) and download the new sample again. Let me know if you face any issues.
Hi! Love your work! If I wanted to check for a hand state with an if statement, like:
if (handstate.open)
{
    // I'll do something here
}
is it possible?
Hi Shawn. Sure, you can check the state the way you suggest.
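In C#, the check looks like this:
if (body.HandRightState == HandState.Open)
{
    // Do something here.
}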
Hi, thanks for your work!
I have a Kinect v2 sensor for Windows, SDK 2.0, and Microsoft Visual Studio 2012.
I tried your code and found that the circles around my hands are not on my hands.
Let me show you:
http://imgsrc.baidu.com/forum/pic/item/ef9caa0a19d8bc3e89adf584818ba61ea9d345f0.jpg
Hi eple. You can use CoordinateMapper to properly place the UI elements on your canvas.
CoordinateMapper is described here: https://pterneas.com/2014/05/06/understanding-kinect-coordinate-mapping. It is a more advanced topic and I tried to keep this blog post as simple as possible. However, for real-world apps, you’ll need to use CoordinateMapper.
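For example, here’s roughly how you’d map a hand joint to the color frame (a sketch: _sensor is assumed to be the active KinectSensor, and ellipse the element you are positioning):
CameraSpacePoint position = body.Joints[JointType.HandRight].Position;
ColorSpacePoint point = _sensor.CoordinateMapper.MapCameraPointToColorSpace(position);

// point.X and point.Y are pixel coordinates within the 1920x1080 color frame.
Canvas.SetLeft(ellipse, point.X - ellipse.Width / 2);
Canvas.SetTop(ellipse, point.Y - ellipse.Height / 2);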
I have updated the source code. Simply refresh the page and download the attachment again. Let me know if you need any further help 🙂
Apologies for my late replies – I have been sick for the past few days…
Thanks! I’m sorry to reply so late; I hadn’t received the email.
hey,
How can I count the number of closed-hand states? I want to detect when a person closes their hand 2 times, like a double-click.
Since HandState.Closed is an enum value (3),
how can I count it appearing 2 times for my double-click?
thanks!!
Hi Deepak,
That’s a little tricky. A double-click gesture occurs in approximately half a second (that is, 15 Kinect frames, since the sensor delivers 30 frames per second). So, you need a frame counter that is reset to zero after the 15th frame. You’ll need another counter that counts the number of closed hand states within the 15-frame period. The closed states should not be consecutive, I guess: you are searching for an Open-Closed-Open-Closed pattern within the 15-frame period.
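Here is a rough sketch of the idea (the names and the right-hand choice are illustrative, not from the sample):
private int _frameCount = 0;
private int _closedCount = 0;
private HandState _previousState = HandState.Unknown;

private void DetectDoubleClick(Body body)
{
    _frameCount++;

    // Count Open -> Closed transitions, so consecutive Closed frames count once.
    if (_previousState == HandState.Open && body.HandRightState == HandState.Closed)
    {
        _closedCount++;
    }

    _previousState = body.HandRightState;

    // Two transitions within the window: that's the double-click.
    if (_closedCount >= 2)
    {
        // Do something here, then start over.
        _frameCount = 0;
        _closedCount = 0;
    }

    // Reset the counters after the 15-frame (roughly half-second) window.
    if (_frameCount >= 15)
    {
        _frameCount = 0;
        _closedCount = 0;
    }
}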
Let me know if that helped you 🙂
hello vangos pterneas,
Hi, it’s a really awesome blog for Kinect; I learnt a lot from it. But when I run the hand tracking code, the state shows up late: after closing or opening my hand it should update immediately, but after closing it waits 3-5 seconds before showing “Closed”. What could the problem be? When I run the Configuration Verifier, the color and depth check shows a warning at the end. Could that be the issue, or is it my machine? It is an i3 with Intel HD family graphics; I know that’s below the recommended specs, but it worked for 10 days with no issue. What is your machine configuration for hand tracking?
thanks a lot
Hi vinayak. I use an i5 processor with 8GB of RAM. Please run the Kinect Configuration Verifier from the SDK Browser to find out if your system is capable of running Kinect apps. Alternatively, try another USB 3 port.
Thanks, Vangos. I ran the Configuration Verifier and it showed a warning on the last item (“Verify Kinect depth and color streams”) and a warning on the USB controller. I am getting output, but after some 15 minutes that last item changes its status from warning to error (red mark). So I think it’s the USB and the i3; I will check on an i5. Anyway, thanks for your valuable feedback; I will try it. And when can I expect the Kinect book from you?
Hi vinayak. We are working on my book. Printing may take some time. Thank you.
Hi, what nice code!
But after I tried your code, something is missing: the drawhand and drawthumb calls don’t seem to work.
The program runs well but doesn’t draw any circle on my hand. Do you know what’s wrong?
Thanks xD
Hi Arief. Did you download the code, or did you just use the snippets I included in the article?
Thanks,
Vangos
Hello,
Can I use the Kinect 360 camera in the project instead of the Kinect for Windows v2? And how can I do that?
Thank you very much
Hi Omar. Kinect for XBOX 360 only supports hand tracking (similar to Kinect v2). You can’t recognize finger gestures with Kinect for XBOX 360.
Hi,
Thanks a lot for your work; however, I am facing a problem. While the circle does get drawn, it is not drawn over my hand. Instead it stays in the top-left corner (0,0), so I’m guessing it’s not getting updated properly. Would you mind telling me what the problem is? I would like the circle to be in the center of my hand (which is being tracked fine, because the state of the hand gets updated immediately) and to follow my hand as I move it.
The code can detect my hand state and show it properly; it’s just that the circle is always grey and stuck in the top-left corner. I have updated my nvidia gt740m driver, but that didn’t help.
Thanks a lot
Hello. Does the BodyBasics sample run properly? Is your skeleton displayed correctly? Seems like the circles are not updating their positions.
Great work! It helped me a lot to get started with Kinect. I was looking everywhere for something like this. Keep up the good work. Thank you!
Thanks a lot for your comment. I’m glad you found my posts useful.
Hi Vangos,
I downloaded the code and ran it in Visual Studio 2013. It works fine but doesn’t draw the circles on the hands;
the circles remain at the (0,0) location. Kindly guide me on this.
Hi Arslan. Are you sure the hands are visible? Please use a breakpoint to check the values (X, Y, Z) of each hand. If the values are zeros, it means that Kinect does not recognize your hands. Otherwise, it’s an issue with displaying the circles. Let me know 🙂
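For example, a quick way to inspect the values (assuming the usual frame handler):
Joint hand = body.Joints[JointType.HandRight];

// Print the hand position to the Output window on every frame.
System.Diagnostics.Debug.WriteLine(string.Format("X={0}, Y={1}, Z={2}",
    hand.Position.X, hand.Position.Y, hand.Position.Z));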
Hi Vangos,
I encountered the same problem: the circle stays at (0,0) all the time. I think the system has detected my hands, because
it always reports the right state of my hands. Maybe there is something wrong in the mapper or the Scale method. Can you
check it out? Thanks a lot.
Hi Vangos,
I found a typo in the Scale function of the downloaded file.
point.X *= float.IsInfinity(colorPoint.X) ? 0.0 : colorPoint.X;
point.Y *= float.IsInfinity(colorPoint.Y) ? 0.0 : colorPoint.Y;
should be
point.X = float.IsInfinity(colorPoint.X) ? 0.0 : colorPoint.X;
point.Y = float.IsInfinity(colorPoint.Y) ? 0.0 : colorPoint.Y;
This is the reason the circle sticks at (0,0).
Anyway, I want to say “good job” to you for this work.
Hi there. You are absolutely right. This typo was causing the error. Thanks for noting!
The demo has been updated.
Hi Vangos,
Thank you very much for the demo. I’m using it as a foundation for my project, but I have an issue. Is it possible to draw in a particular area? My drawing area is at the center of the screen, on a canvas with the proportions of the depth camera. No matter what, the hands are also drawn on the background, with an offset to the right. I can’t figure out what I’m doing wrong.
Thanks in advance!
Hi Anna. I guess you are placing a XAML element at a specific X-Y position. You need to use Canvas.SetLeft and Canvas.SetTop:
Canvas.SetLeft(element, x - element.Width / 2);
Canvas.SetTop(element, y - element.Height / 2);
Thank you very much! I’ll give it a try.
Hi there,
I have downloaded the source code and ran the project; it’s working without a problem!
I have tried building on this and tracing the position of just my right hand, so that every time I move my right hand, a trail of ellipses is left behind, allowing me to draw. I have looked at your KinectPaint project as well and used a method like the one below, but I am not quite sure how to draw the ellipses as a trail and not just at the joint position:
public static void DrawTrace(this Canvas canvas, Joint thumb, CoordinateMapper mapper)
{
    if (thumb.TrackingState == TrackingState.NotTracked) return;

    Point point = thumb.Scale(mapper);

    Ellipse ellipse = new Ellipse
    {
        Width = 40,
        Height = 40,
        Fill = new SolidColorBrush(Colors.DarkRed),
        Opacity = 0.7
    };

    Canvas.SetLeft(ellipse, point.X - ellipse.Width / 2);
    Canvas.SetTop(ellipse, point.Y - ellipse.Height / 2);

    canvas.Children.Add(ellipse);
}
Can you please help me out? I would really appreciate it. I am just trying to trace the thumb joint position to draw on the canvas of your application.
Hi Abdullah. That’s a great question.
The XAML Canvas is great for drawing and contains a lot of drawing components. If you want to display the trail more accurately, though, you’d better use a Polyline control; it is ideal for drawing a trail.
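Something like this sketch (assuming _trail is a Polyline you add to the canvas once, and Scale is the mapping helper from the sample):
Polyline _trail = new Polyline
{
    Stroke = new SolidColorBrush(Colors.DarkRed),
    StrokeThickness = 5.0
};

// In the frame handler, append the scaled hand position on every frame.
Point point = hand.Scale(mapper);
_trail.Points.Add(point);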
I wrote the source code, which I’m going to share in a new blog post. Check it out on GitHub:
Kinect Drawing
Cheers,
Vangos
Yes I have looked at Polyline and was very intrigued.
Vangos, Thank you so much! I have just downloaded the source code and I will look through it tomorrow but from the looks of your youtube video, it seems to be exactly what I wanted to know.
Really appreciate it and loving your work and blog!
Cheers!
Hi there!
I have a question: I have a Kinect for Xbox 360. Is it possible to run your project with it?
Hello. You can migrate this code to support Kinect 360 and SDK 1.5. You’ll need to change the name of the classes to the corresponding SDK 1.5 classes, but the concept remains the same. Kinect 360 has no support for hand states, though, so Open, Closed, and Lasso will not be available.
Hi Vangos,
Excellent code, works flawlessly!
Just curious: I am trying to set up a system that tracks hands and draws the ellipses only when the hands are within 1.25m to 1.8m of the depth camera, and does nothing when the hands aren’t in that range. I understand I will have to check the Z values, but can you suggest how I can set up this range filter?
Thanks a lot!
Hi Carlos. Thanks a lot for your comment. This is how you can control the distance:
Joint hand = body.Joints[JointType.HandLeft];

if (hand.Position.Z > 1.25f && hand.Position.Z < 1.8f)
{
    // Do something
}
Hi
How can I do something when both hand states are closed? I have not been able to do so using an if statement.
This is what I want to do:
If(righthand.closed &&lefthand.closed)
{
dosomething
}
if (body.HandLeftState == HandState.Closed && body.HandRightState == HandState.Closed)
{
    // Do something
}
I figured it out, thank you! great code!
Awesome! Thank you.
Hey Vangos ,
I am very new to Kinect programming. I started off with an air-drumming project using Kinect v2. I got the source code from the net, but I am not able to run it on this version. Can you help me with this?
Hi Sriraman. You can post any questions in this blog, as long as they are related to the blog content.
Cheers,
Vangos
Hi Vangos,
Thank you very much for the demo.
This is very helpful to my research.
But I don’t understand the principle behind the hand states.
Could you explain it to me? Thank you!
Hello. HandState is an enumeration, provided by the Kinect SDK. It’s just a value that determines whether a hand is open, closed, or doing the “lasso” gesture.
Sorry, I meant the logic of the hand states.
Could you tell me how to determine whether a hand is open, closed, or doing the lasso gesture?
Thanks!
var state = body.HandRightState;

if (state == HandState.Lasso)
{
    // Do something.
}
Is this what you need?
Thank you very much!
Hi Vangos,
I have been following your work since Kinect v1. I really like it! Thanks a lot for everything! 🙂
I am having a few issues with hand tracking. I am trying to reproduce my wrist movements using an avatar.
I find the result very unstable and inaccurate! Detecting hand states is OK, but the hand joint orientations are really not clean.
So my questions are:
1. Do you also have these issues?
2. Is there a way to solve them? Using some filtering would probably improve things, but honestly I do not think that would be enough.
3. Do you have any sample code or advice?
Thanks a lot again!
Hi Florian. Thank you very much for your comment. Regarding your questions:
Wrist rotation is not accurate; I am having these issues, too. I have partially solved them by adding more constraints to the avatar and by checking the position/rotation of the neighboring joints. For example, if I’m focused on the wrist movement, I keep the wrist joints static or force them to follow the movements of the elbow. If you need 100% accuracy, though, you’d better use Leap Motion.
Hello Mr. Pterneas,
I’m starting work on the XAML file. Do you suggest any simple and clear resources that can help me build my XAML file?
Thanks
Sure. You can get started with XAML here:
http://www.i-programmer.info/programming/wpf-workings/446-how-xaml-works.html
https://www.amazon.com/Microsoft-Visual-2013-Step-Developer/dp/073568183X/
Hello Mr. Pterneas,
I have an idea to detect an object or shape (a rectangle, for example). What is the simplest method I can use with Kinect?
thanks
Hello. You could use EmguCV for object tracking: http://www.emgu.com/wiki/index.php/Main_Page
Hello Mr. Pterneas,
Can you please help me with this point?
I have finished writing the code of my project (I mean the CS file), and I need to start working on the XAML file to get the interface.
My question: is there another way I can use to get my result instead of a XAML file? (I need your suggestion about the simplest and easiest way.)
Thank you
Hello. You can use XAML or Unity3D with Kinect. Just pick the one that’s easier for you.
Hello Mr. Pterneas
Other than the fingertips, I need to track the joints as well. Is there a way to do it?
Thank you.
I meant the finger joints. Sorry.
Hi Andrew. Kinect cannot easily detect the finger joints. I suggest you use Leap Motion instead; it will be much more accurate.
Hi, I already sent you an email.
I am working on psychology recognition based on body gestures. As a first step, I will focus on hand gestures and their interpretation, so can you help me, please? I want to use Kinect with MATLAB 2016, if that is possible.
I am waiting for your response, and thank you in advance.
Hello, Khalifa. You can get started with my Kinect Finger tracking project, as well as Vitruvius. Hope this helps you.
Hi Vangos,
Do you have any information on how the Kinect recognizes the five hand states? I am currently working on a project using the Kinect, and it depends mainly on recognizing the hand states. Due to equipment constraints, the sensor has to be set on a surface much taller than where the user stands and is angled down. Because of this setup (I think), the hand state recognition is very noisy: say I have my hands closed at all times; the color of my hand in Visual Studio would still jump between red and grey repeatedly. I am trying to come up with a way to reduce this noise, but I could not find any useful information. Any suggestions or advice would be greatly appreciated.
Thank you.
Hello, Chaoying. Thanks for your message. The hand states are recognized internally by the SDK, so we do not have any inside information about their implementation.
Hi Vangos !
Your blogs are very helpful! Learning a lot. Thanks!
I am developing a Kinect project in Unity3D. I want to use the hand to grab GameObjects with “HandEventType.Grip” everywhere. But I found that the HandEventType is not that accurate when the hand joint overlaps with another joint. For example, the HandState becomes Unknown when I grab an object in front of my chest. So I think “TrackingMode.SeatedMode”, which ignores the lower-body joints, could make HandEventType more accurate. But I can’t find it in the “Kinect SDK for Unity”. How can I enable SeatedMode in Unity3D?
Sorry, I found that the problem is not due to overlapping joints. The Unknown HandState is caused by an obstacle behind the hand.
Does that mean I must keep a distance from other things to ensure the accuracy of the grip event?
Thank you!
Hello. Yes, that’s right. The player should stand in front of the sensor (between 1.5 and 6 meters). The space should be as clear as possible.
Hey, this blog is amazing and has helped me get a really good start with Kinect. I just have one question, though: if I use the Visual Gesture Builder and build a database of gestures, how do I add them to the existing gestures that are recognized by the code provided? The existing gestures are part of the SDK, but I want to add more. Could you please help me?
Hello. The code provided here is an algorithmic way to detect a gesture. The Visual Gesture Builder component works a little differently. You could use them side-by-side, but it is not possible to somehow combine them into a single module.
Do you have a tutorial for an application that uses gestures built with the Visual Gesture Builder?
Unfortunately, I do not have such a tutorial.
Sir, I am a student doing my final-year project, using Kinect v2 to rotate, zoom in and out, highlight, and take screenshots of a 3D object.
I just want a starting point that gives me direction. Will this code help me perform the tasks I mentioned?
Thank you.
Sure, you can get started with hand tracking. Consider checking my gestures article, too.
Sir, I am really stuck at rotating the 3D object in my WPF C# application using hand gestures through the Kinect v2 device, and I do not know where to start or how to implement it (I mean the logic part). If you could please guide me and help me with how to do it, you would surely save my life.
Please reply; I am so tired of trying again and again without finding any solution.
You could measure the distance between the hands (between zero and arm length) and scale/rotate the 3D object according to that percentage. Zero would be the minimum and arm length would be the maximum.
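As a rough sketch (the 0.8m arm length is an assumption you would tune per user):
Joint left = body.Joints[JointType.HandLeft];
Joint right = body.Joints[JointType.HandRight];

// Euclidean distance between the two hands in camera space (meters).
float dx = right.Position.X - left.Position.X;
float dy = right.Position.Y - left.Position.Y;
float dz = right.Position.Z - left.Position.Z;
float distance = (float)Math.Sqrt(dx * dx + dy * dy + dz * dz);

float armLength = 0.8f; // assumed maximum
float percentage = Math.Min(distance / armLength, 1f);

// Map the percentage to a rotation angle (0-360 degrees).
double angle = percentage * 360.0;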
Hello! I am working on a Kinect project. I want to track the movement of the hand to control the blinking of an LED. How do I set that up?
Hello. First, you need to track the hand position as described in the article. Then, you’ll need to map a particular gesture or sequence of positions to a specific action (e.g. on or off).
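For example, here is a sketch of the mapping step, assuming the LED is driven by a microcontroller listening on a serial port (the port name, the ON/OFF commands, and the _previousState and _ledOn fields are hypothetical):
using System.IO.Ports;

SerialPort _port = new SerialPort("COM3", 9600);
_port.Open();

// In the frame handler: toggle the LED on an Open -> Closed transition.
if (_previousState == HandState.Open && body.HandRightState == HandState.Closed)
{
    _ledOn = !_ledOn;
    _port.WriteLine(_ledOn ? "ON" : "OFF");
}

_previousState = body.HandRightState;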
Hi, I can do it by tracking open and closed now. I want it to be based on the height of the hand. How do I work with the y-axis in my code? Thanks.
Hello. You can check the height of the hand by measuring the Y position of the Hand joint in the 3D space. Experiment with the Y values to see when it’s raised:
float y = body.Joints[JointType.HandLeft].Position.Y;
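For instance, you could compare it against another joint instead of a hard-coded value (a hedged example using the shoulder):
float shoulderY = body.Joints[JointType.ShoulderLeft].Position.Y;
bool isRaised = y > shoulderY;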
Sir, the main problem is how I can grab that 3D object on the screen and rotate it using hand gestures. Please answer that, too; it would be really helpful.
Thanks
Hi Muhammad. Depending on the platform you are using, rotating a 3D object may differ. If you are using Unity3D, you can rotate any 3D object by changing the Rotation properties in the X, Y, or Z axis of the GameObject.
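For instance, in Unity3D a minimal sketch would be (rotationSpeed is an illustrative field you would expose in the Inspector):
// Rotate the selected GameObject around the Y axis every frame.
void Update()
{
    transform.Rotate(0f, rotationSpeed * Time.deltaTime, 0f);
}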
Hello Vangos,
This blog is amazing and has helped me get a really good start with Kinect. I just have one question: can I implement more gestures for recognition without the Visual Gesture Builder? How can I add them?
Thank you.
Hi, Vangos,
Kindly, where can I find an in-depth book about Kinect?
Regards,
Kevin
Hi Kevin. For Kinect v2, you can check Beginning Microsoft Kinect for Windows SDK 2.0: Motion and Depth Sensing for Natural User Interfaces. If you are interested in the Azure Kinect instead, I’ve authored Mastering the Microsoft Kinect: Body Tracking, Object Detection, and the Azure Cloud Services.