
Need LiDAR support?

Update April 2020: If you are looking for the latest iPad LiDAR body-tracking support, check my latest article.

Apple is officially the newest player in the Body-Tracking industry! With its new pose estimation capabilities, ARKit is a Kinect alternative for mobile devices. People occlusion and human pose estimation are now core parts of the latest ARKit 3 framework.

So, without further ado, I am going to show you how to develop body-tracking apps for iPhone and iPad devices!

ARKit 3 body tracking

Prerequisites

Since we are developing for the Apple ecosystem, we need the proper Mac computer to develop our applications and the proper iOS device to run them.

Hardware

In terms of hardware, you need a MacOS computer that is compatible with MacOS Catalina. Also, body-tracking applications need the powerful Apple A12 Bionic processor to run properly. The following Mac computers and iOS devices are eligible:

Computers                     | Mobile devices
12-inch MacBook               | iPhone XS
MacBook Air, 2012 and later   | iPhone XS Max
MacBook Pro, 2012 and later   | iPhone XR
Mac mini, 2012 and later      | iPad Pro 11-inch
iMac, 2012 and later          | iPad Pro 12.9-inch
iMac Pro                      |
Mac Pro, 2013 and later       |

For this guide, I am using a Mac mini computer with an 11” iPad Pro.

Software

To run the demos, you need to install the following software on your Mac computer:

  • Unity3D 2019.1.5f1 with iOS build target
  • MacOS Catalina 10.15 (Beta)
  • XCode 11 (Beta)

Your iOS device should be updated to iOS 13 (Beta) or iPadOS 13 (Beta).

As you can see, at the date of writing, most of the software is in Beta. Keep in mind that the devices may become unstable or unresponsive, so be extra careful not to lose valuable data. New articles will follow with the public release of ARKit 3, iOS 13, and MacOS 10.15.

If you are in a hurry, download the complete source code on GitHub. Keep reading to understand how to create your own body-tracking apps!

During the past 10 years, I have been helping Fortune-500 companies and innovative startups create amazing body-tracking applications and games. If you are looking to get your business to the next level, get in touch with me.

Contact me

Body Tracking step-by-step

Enough said… Let’s dive right into the ARKit magic. On your computer, launch Unity3D 2019.1 and create a new project.

Unity3D 2019 New Project

Step 1 – Set up the main scene

Unity3D will start with an empty scene. Before adding any visual objects or writing any code, we first need to import the proper dependencies. The skeleton-tracking functionality is part of the ARKit toolkit, so we need to import the AR Foundation and ARKit XR Plugin packages via Unity's Package Manager.

Now, create a new scene and add an AR Session and an AR Session Origin object. These objects control the iOS camera while providing a ton of ARKit goodies.

Unity3D XR - Add new AR Session Origin

Also, add an empty game object, name it e.g. Human Body Tracking, and attach a new C# script (HumanBodyTracking.cs).

The structure of the scene should look like this:

Body tracking ARKit - Unity scene setup

Step 2 – Set up the Skeleton

Now that the visual elements are in place, we can start adding some interactivity. Open the HumanBodyTracking.cs script and add a reference to the ARHumanBodyManager class. The ARHumanBodyManager is the primary component that analyzes the camera data to detect human bodies.

[SerializeField] private ARHumanBodyManager humanBodyManager;
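
In the Editor, the ARHumanBodyManager component typically lives on the AR Session Origin object; drag it onto this field in the Inspector. If you prefer not to rely on Inspector wiring, here is a minimal fallback sketch (my own addition, not part of the original sample):

private void Awake()
{
    // Fallback: locate the manager in the scene if it was not assigned in the Inspector.
    if (humanBodyManager == null)
    {
        humanBodyManager = FindObjectOfType<ARHumanBodyManager>();
    }
}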

To display the joints, we’ll use some simple Unity3D spheres. Each sphere will correspond to a specific joint type. Add a C# Dictionary that maps each joint type to its sphere, so we can update the joint data frame by frame.

private Dictionary<JointIndices3D, Transform> bodyJoints;

Finally, add references to the visual elements of the skeleton: a sphere prefab for the joints and a line renderer prefab for the bones.

[SerializeField] private GameObject jointPrefab;
[SerializeField] private GameObject lineRendererPrefab;
private LineRenderer[] lineRenderers;
private Transform[][] lineRendererTransforms;

You can find the complete C# code in the HumanBodyTracking.cs class on GitHub.
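
Putting the declarations together, the top of the script looks roughly like this. This is a minimal sketch, assuming the AR Foundation preview packages are installed; note that the JointIndices3D enum used throughout this tutorial is not declared in these snippets, but you can find it in the companion GitHub project:

using System.Collections.Generic;      // Dictionary<,>
using Unity.Collections;               // NativeArray<>
using UnityEngine;
using UnityEngine.XR.ARFoundation;     // ARHumanBodyManager, ARHumanBody
using UnityEngine.XR.ARSubsystems;     // XRHumanBodyJoint

public class HumanBodyTracking : MonoBehaviour
{
    [SerializeField] private ARHumanBodyManager humanBodyManager;

    [SerializeField] private GameObject jointPrefab;
    [SerializeField] private GameObject lineRendererPrefab;

    private Dictionary<JointIndices3D, Transform> bodyJoints;
    private LineRenderer[] lineRenderers;
    private Transform[][] lineRendererTransforms;

    // The event subscriptions and update logic follow in the next steps.
}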

Step 3 – Detect the Tracked Bodies

This is the most important part of the tutorial! ARKit has made body-tracking incredibly easy and accessible. All you need to do is use the ARHumanBodyManager object and subscribe to the humanBodiesChanged event.

private void OnEnable()
{
    // Start listening for body-tracking updates.
    humanBodyManager.humanBodiesChanged += OnHumanBodiesChanged;
}

private void OnDisable()
{
    // Stop listening when the component is disabled.
    humanBodyManager.humanBodiesChanged -= OnHumanBodiesChanged;
}

The event handler is where the magic happens. The information about the tracked bodies is part of the event arguments. This is how to acquire the bodies:

private void OnHumanBodiesChanged(ARHumanBodiesChangedEventArgs eventArgs)
{
    // Bodies that ARKit detected for the first time in this frame.
    foreach (ARHumanBody humanBody in eventArgs.added)
    {
        UpdateBody(humanBody);
    }

    // Bodies that were already tracked and whose pose changed.
    foreach (ARHumanBody humanBody in eventArgs.updated)
    {
        UpdateBody(humanBody);
    }
}

Piece of cake, right? So, let’s bring everything together and display the skeleton in the Unity scene we set up in the previous steps.

Note: as of the time of this writing, ARKit only supports one tracked body.
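
The handler above only processes the added and updated collections, which is enough for this demo. The event arguments also expose a removed collection, raised when ARKit loses a body. Here is a minimal cleanup sketch you could add inside OnHumanBodiesChanged (my own addition, not part of the original sample):

foreach (ARHumanBody humanBody in eventArgs.removed)
{
    // The joint spheres were parented to the body's transform, so they follow whatever
    // ARFoundation does with the removed trackable; the stand-alone line renderers
    // must be destroyed explicitly.
    if (lineRenderers != null)
    {
        foreach (LineRenderer line in lineRenderers)
        {
            if (line != null) Destroy(line.gameObject);
        }
    }

    // Reset the cached references so the visuals are rebuilt for the next tracked body.
    bodyJoints = null;
    lineRenderers = null;
    lineRendererTransforms = null;
}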

Step 4 – Display the Skeleton

The following lines of code update the positions of the joints in the camera space. The spheres and lines are overlaid on top of the iOS camera feed.

private void UpdateBody(ARHumanBody arBody)
{
    if (jointPrefab == null) return;
    if (arBody == null) return;
    if (arBody.transform == null) return;

    InitializeObjects(arBody.transform);

    // The raw joint data that ARKit produced for the current frame.
    NativeArray<XRHumanBodyJoint> joints = arBody.joints;

    // Move each sphere to its corresponding joint.
    foreach (KeyValuePair<JointIndices3D, Transform> item in bodyJoints)
    {
        UpdateJointTransform(item.Value, joints[(int)item.Key]);
    }

    // Redraw the bones. LineRenderer.SetPositions expects points, not transforms,
    // so read the world position of each joint sphere first.
    for (int i = 0; i < lineRenderers.Length; i++)
    {
        Vector3[] positions = new Vector3[lineRendererTransforms[i].Length];
        for (int j = 0; j < positions.Length; j++)
        {
            positions[j] = lineRendererTransforms[i][j].position;
        }

        lineRenderers[i].SetPositions(positions);
    }
}

Apple supports 92 joint types (indices). However, not all of these joint types are actually tracked! Most of them are inferred, based on the positions of their neighboring joints. For your convenience, I have selected 14 joint types, so I can have a fair comparison with the Kinect camera.
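
Since only some joints are genuinely tracked, it can help to know that each XRHumanBodyJoint reports whether it was actually tracked in the current frame. Here is a small sketch of how the joint-update loop in UpdateBody could skip untracked joints instead of drawing inferred ones (an optional variation, not part of the original sample):

foreach (KeyValuePair<JointIndices3D, Transform> item in bodyJoints)
{
    XRHumanBodyJoint joint = joints[(int)item.Key];

    // Hide the sphere when ARKit did not actually track this joint in the current frame.
    item.Value.gameObject.SetActive(joint.tracked);

    if (joint.tracked)
    {
        UpdateJointTransform(item.Value, joint);
    }
}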

This is how to connect the proper joints and form the human bones:

private void InitializeObjects(Transform arBodyT)
{
    if (bodyJoints == null)
    {
        bodyJoints = new Dictionary<JointIndices3D, Transform>
        {
            { JointIndices3D.head_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.neck_1_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_arm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_arm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_forearm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_forearm_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_hand_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_hand_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_upLeg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_upLeg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_leg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_leg_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.left_foot_joint, Instantiate(jointPrefab, arBodyT).transform },
            { JointIndices3D.right_foot_joint, Instantiate(jointPrefab, arBodyT).transform }
        };
        lineRenderers = new LineRenderer[]
        {
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // head - neck
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // upper body: hand - forearm - arm, right to left
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // lower body: foot - leg - upper leg, right to left
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>(), // right side of the torso: arm - upper leg
            Instantiate(lineRendererPrefab).GetComponent<LineRenderer>()  // left side of the torso: arm - upper leg
        };
        lineRendererTransforms = new Transform[][]
        {
            new Transform[] { bodyJoints[JointIndices3D.head_joint], bodyJoints[JointIndices3D.neck_1_joint] },
            new Transform[] { bodyJoints[JointIndices3D.right_hand_joint], bodyJoints[JointIndices3D.right_forearm_joint], bodyJoints[JointIndices3D.right_arm_joint], bodyJoints[JointIndices3D.left_arm_joint], bodyJoints[JointIndices3D.left_forearm_joint], bodyJoints[JointIndices3D.left_hand_joint]},
            new Transform[] { bodyJoints[JointIndices3D.right_foot_joint], bodyJoints[JointIndices3D.right_leg_joint], bodyJoints[JointIndices3D.right_upLeg_joint], bodyJoints[JointIndices3D.left_upLeg_joint], bodyJoints[JointIndices3D.left_leg_joint], bodyJoints[JointIndices3D.left_foot_joint] },
            new Transform[] { bodyJoints[JointIndices3D.right_arm_joint], bodyJoints[JointIndices3D.right_upLeg_joint] },
            new Transform[] { bodyJoints[JointIndices3D.left_arm_joint], bodyJoints[JointIndices3D.left_upLeg_joint] }
        };
        for (int i = 0; i < lineRenderers.Length; i++)
        {
            lineRenderers[i].positionCount = lineRendererTransforms[i].Length;
        }
    }
}
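
One detail worth noting: the line renderer prefab is expected to work in world space, which is Unity's default for the LineRenderer component, because UpdateBody feeds it the spheres' world positions. If you want to be explicit about it, the loop at the end of InitializeObjects could be written like this (an optional safeguard, my own addition):

for (int i = 0; i < lineRenderers.Length; i++)
{
    // Make sure the points are interpreted as world-space coordinates.
    lineRenderers[i].useWorldSpace = true;
    lineRenderers[i].positionCount = lineRendererTransforms[i].Length;
}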

ARKit gives us the position and rotation of the joints in 3D space! This is how to update the scale, position, and rotation of each sphere; since every sphere was instantiated as a child of the body's transform, the joint's anchor-relative pose maps directly onto the sphere's local transform:

private void UpdateJointTransform(Transform jointT, XRHumanBodyJoint bodyJoint)
{
    // anchorPose and anchorScale are expressed relative to the body anchor, which is
    // the parent of every joint sphere, so local transform values apply directly.
    jointT.localScale = bodyJoint.anchorScale;
    jointT.localRotation = bodyJoint.anchorPose.rotation;
    jointT.localPosition = bodyJoint.anchorPose.position;
}

This is it! Let’s build and run our project on an actual iOS device!

Step 5 – Build and Deploy

Finally, we need to build and run the project on an actual device. Given that ARKit is part of iOS and iPadOS, we cannot test our code on MacOS (I would love to see a simulator, though).

In Unity, select File → Build Settings. Select the iOS build target and hit the Build button. You’ll need to specify a location to store the generated project. Wait patiently until Unity finishes the build process.

Unity will create an XCode project (.xcodeproj). Open the project with XCode 11 Beta. If you use a previous version of XCode, you’ll get an error and your project will not run properly.

When the project opens, provide your iOS development credentials, connect your iOS 13 device, and click the Run button. This way, the project will be deployed to the device.

When deployment finishes, point the camera at a person and you’ll start seeing the 3D overlay on top of the tracked body!

If you liked this article, remember to share it on social media, so you can help other developers, too! Also, let me know your thoughts in the comments below. ‘Til the next time… keep coding!

Vangos Pterneas

Vangos Pterneas is a software engineer, book author, and award-winning Microsoft Most Valuable Professional (2014-2019). Since 2012, Vangos has been helping Fortune-500 companies and ambitious startups create demanding motion-tracking applications. He's obsessed with analyzing and modeling every aspect of human motion using AI and Maths. Vangos shares his passion by regularly publishing articles and open-source projects to help and inspire fellow developers.
