I may be misunderstanding things, but I cannot work out how to map Azure Kinect relative bone rotations to a rigged humanoid model.
I have switched on Relative Bone Rotations in the Kinect Azure CHOP, but rather than seeing bones (e.g. humerus, femur) like the old Kinect CHOP, all I see is joint rotations (e.g. shoulder, elbow, wrist).
What do I need to do here? To drive a rigged character, I need the rotation of each bone. Is there a way of switching the Kinect Azure CHOP to output bones instead? Or do I need to derive the bones myself by making lines between the joints and working out their rotations from those, e.g. construct a line from the shoulder and elbow points (i.e. the humerus) and then work out its rotation? And if so, how?
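To show what I mean, here is a rough sketch of the joint-pair approach in plain Python. Everything here is my own assumption rather than anything the CHOP provides: I assume I can read the two joint positions, that the bone's rest pose points along some known direction, and that a shortest-arc quaternion from the rest direction to the tracked direction is a reasonable first approximation (it ignores twist about the bone's own axis):

```python
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def bone_rotation(rest_dir, parent_pos, child_pos):
    """Shortest-arc quaternion (w, x, y, z) rotating rest_dir onto the
    tracked bone direction (child joint minus parent joint)."""
    d = normalize(tuple(c - p for c, p in zip(child_pos, parent_pos)))
    r = normalize(rest_dir)
    w = 1.0 + dot(r, d)
    if w < 1e-8:
        # Directions are opposite: rotate 180 degrees about any axis
        # perpendicular to the rest direction.
        helper = (1.0, 0.0, 0.0) if abs(r[0]) < 0.9 else (0.0, 1.0, 0.0)
        return (0.0, *normalize(cross(r, helper)))
    x, y, z = cross(r, d)
    return normalize((w, x, y, z))

# Hypothetical example: upper arm resting along -Y, elbow tracked
# out to the side, giving a 45-degree swing of the humerus.
q = bone_rotation((0.0, -1.0, 0.0), (0.0, 0.0, 0.0), (1.0, -1.0, 0.0))
```

I realise this only gives the bone's swing, not its roll, so it may be the wrong starting point entirely — which is partly why I'm asking.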
Has anyone successfully rigged a model and got it working with Azure Kinect skeletal tracking? I’m interested to learn how you approached this.