Azure Kinect bone tracking

I may be misunderstanding things, but I cannot work out how to map Azure Kinect relative bone rotations to a rigged humanoid model.

I have switched on Relative Bone Rotations in the Kinect Azure CHOP. But rather than seeing bones (e.g. humerus, femur) like the old Kinect CHOP did, all I see are joint rotations (e.g. shoulder, elbow, wrist).

What does this require me to do? To change the rotation of a bone in a rigged character, I need to know that bone's rotation. Is there a way of switching the Kinect Azure CHOP to show bones instead? Or do I need to derive the bones myself by making lines between pairs of joints and working out their rotations from those? E.g. make a line from the shoulder and elbow points (i.e. the humerus) and then work out its rotation. And if so, how?
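In case it helps anyone reading later, the "line between two joints" idea can be sketched in Python with NumPy. This is only a sketch of one approach, not what the CHOP does internally: `bone_rotation` is a hypothetical helper, and the rest-pose direction (here, a bone pointing straight down) is an assumption that depends entirely on your rig.

```python
import numpy as np

def bone_rotation(parent_pos, child_pos, rest_dir=(0.0, -1.0, 0.0)):
    """Return a quaternion (w, x, y, z) rotating the rig's rest-pose bone
    direction onto the tracked bone direction (parent joint -> child joint,
    e.g. shoulder -> elbow for the humerus).

    rest_dir is an assumption: whatever direction this bone points in your
    model's bind pose."""
    v = np.asarray(child_pos, float) - np.asarray(parent_pos, float)
    v = v / np.linalg.norm(v)
    d = np.asarray(rest_dir, float)
    d = d / np.linalg.norm(d)
    c = float(np.dot(d, v))
    if c > 0.999999:
        # Already aligned with the rest pose: identity rotation.
        return np.array([1.0, 0.0, 0.0, 0.0])
    if c < -0.999999:
        # Exactly opposite: rotate 180 degrees about any perpendicular axis.
        axis = np.cross(d, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-6:
            axis = np.cross(d, [0.0, 0.0, 1.0])
        axis = axis / np.linalg.norm(axis)
        return np.array([0.0, *axis])
    # General case: shortest-arc rotation between the two directions.
    axis = np.cross(d, v)
    s = np.sqrt((1.0 + c) * 2.0)
    return np.array([s * 0.5, *(axis / s)])
```

Note that a single direction vector can't recover the twist of the bone about its own axis, which is one reason joint rotations from the tracker are usually preferable when you can use them directly.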

Has anyone successfully rigged a model and got it working with Azure Kinect skeletal tracking? I’m interested to learn how you approached this.

This page has some information on how the Kinect Azure skeleton is rigged: Azure Kinect body tracking joints | Microsoft Learn

Everything depends on how your skeleton is rigged, but you should just be able to map each bone to the joint above it in the hierarchy, i.e. the upper arm takes the shoulder rotation, the lower leg takes the knee rotation, etc.
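To make that concrete, here's a minimal sketch of the mapping. All the bone and joint names below are placeholders (they depend on your rig and on how the CHOP channels are named in your setup), not anything defined by the Kinect Azure CHOP itself:

```python
# Hypothetical bone -> joint mapping: each bone takes the rotation of the
# joint above it in the hierarchy. Names are placeholders for your rig.
BONE_TO_JOINT = {
    'upper_arm_l': 'shoulder_l',  # upper arm takes the shoulder rotation
    'lower_arm_l': 'elbow_l',
    'upper_leg_l': 'hip_l',
    'lower_leg_l': 'knee_l',      # lower leg takes the knee rotation
}

def bone_rotations(joint_rotations, mapping=BONE_TO_JOINT):
    """Copy each joint's rotation onto the bone directly below it."""
    return {bone: joint_rotations[joint] for bone, joint in mapping.items()}
```

So rather than computing anything, each bone in the character just reads the rotation channels of its parent joint.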

Hope that helps.

This is really helpful, thanks. I hadn't understood the data correctly, and you've cleared that up. Really appreciate it.