Drive a skeleton with Kinect

Hello,

I’m attempting to drive a skeleton with bound skin, exported from Maya, from the Kinect tracked positions.

I wanted to use single-bone IK chains (what gets created using the “forward kinematics” mode when creating bones) and constrain the IK goal to the tracked positions. However, importing the FBX model creates nulls in Touch, not bones, and the Inverse Kin CHOP only accepts bones.

I don’t want to do rigging/binding in Touch; is there any way to have an imported FBX turned into bones?

I guess the other option is to use the bone rotation, though I’m wondering what the initial pose should be like.
I tried a simple chain like this:

root->spine1->spine2->clavicle_l->upperarm_l->forearm_l->hand_l

If I plug in the values from the Kinect CHOP with a straight bone chain and initial angles at 0, I’m not getting good results. I tried adding 90/180-degree offsets to some of the bones, but I’m still not getting very close.
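For what it’s worth, those 90/180 offsets behave like a constant rest-pose correction that has to be composed with every incoming tracked rotation, not added once. A minimal 2D sketch of that idea (pure Python, not TD API; the angles are made up):

```python
import math

def rot_z(deg):
    """3x3 rotation matrix about Z (degrees)."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_mul(a, b):
    """3x3 matrix product a * b."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# A bone modeled along +Y but tracked as if it pointed along +X needs
# a constant -90 degree offset composed with each tracked rotation.
rest_offset = rot_z(-90)
tracked = rot_z(30)                      # hypothetical tracked value
final = mat_mul(rest_offset, tracked)    # net rotation of -60 degrees
```

The same composition holds in 3D, just with full rotation matrices (or quaternions) instead of a single Z angle.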

Would be great to have an example skeleton ready to go shipped with Touch.
Going to do more research, but appreciate any input!

Thanks a lot
Vincent

I seem to recall that doing your rig in houdini and then bringing it in works fine, I’m going from memory and this was about 2-3 years ago so I may be wrong.

Thanks Richard.

Using FBX I get much the same results from Maya or Houdini: skin weights and hierarchy come through fine, but I end up with nulls instead of bones, making the IK CHOPs useless in Touch.

Otherwise, from Houdini I’ve used hclassic for geometry, but I don’t think the hierarchy comes through.

Going to spend more time and I’m sure I’ll eventually figure it out, was wondering if there was any tested/recommended workflow.

Bone rotation is the correct way to do this, as that already has all the rotations solved for you. I know we had a sample skeleton at one point but I’m not sure where that is now.

Thanks Malcolm. It makes sense.
Though I was also given some test recorded data that has only the point positions, which is why I was wondering about constraining a rig as well. Otherwise I guess I need to do the math to end up with rotation values.
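In case it helps, the positions-to-rotations math is simple when a bone stays in one plane: the angle is just an atan2 of the joint-to-joint vector. A hypothetical sketch (pure Python, not the Kinect CHOP; the joint samples are invented):

```python
import math

def bone_rz(parent, child):
    """Z rotation (degrees) that aims a bone modeled along +X
    from the parent joint toward the child joint, assuming the
    motion stays in the XY plane."""
    dx = child[0] - parent[0]
    dy = child[1] - parent[1]
    return math.degrees(math.atan2(dy, dx))

# hypothetical shoulder/elbow samples: arm hangs down-left, about -120 degrees
rz = bone_rz((0.0, 0.0), (-0.5, -0.866))
```

The full 3D case needs a complete orthonormal frame (a lookat), not just one angle, since a single direction vector leaves the twist about the bone undefined.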

By trial and error I’m able to figure the correct initial orientation for the bones, but it’s a bit tedious, and not really intuitive.

Is this supposed to be in the SDK docs somewhere?
All I could find was msdn.microsoft.com/en-us/library/hh973073.aspx

First, it feels like right and left are switched, as if the pose were mirrored.
Then for the clavicles, I’m getting rx and ry at -180 all the time for both left and right, which seems a little weird, because when I put them back to 0,0 the arms point straight up.

And last, left and right limbs have the same orientation, instead of being mirrored.
So for a symmetrical pose, I’m getting rz -60 for left, and +60 for right.
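As a sanity check on the mirroring: a rotation reflected across the body’s YZ plane (a left/right flip) can be produced from a quaternion by negating its y and z components. A small sketch, assuming (x, y, z, w) component order:

```python
import math

def mirror_lr(q):
    """Reflect a rotation quaternion (x, y, z, w) across the
    body's YZ plane, i.e. swap left and right."""
    x, y, z, w = q
    return (x, -y, -z, w)

# +60 degrees about Z on the left limb should read -60 on the right
half = math.radians(60) / 2
q_left = (0.0, 0.0, math.sin(half), math.cos(half))
q_right = mirror_lr(q_left)
```

So if the device reports identical quaternions for both sides, running one side through a mirror like this is one way to test whether the data is simply un-mirrored.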

I’ll probably continue a bit at it but if you find a working skeleton it would be awesome.
I’ll probably have another go at making ik and using positions, looks like a more flexible approach.

Yeah, I’ve spent days trying to figure out how to get Kinect to drive an imported rigged model with no luck . . . tried swapping sides like you say, changed rotation orders, swapping XYZ data channels, tried taking bone positions and calculating the rotations with an Object CHOP . . . all to no avail. There’s some very specific set up with Kinect rotations vs “standard” that I can’t figure out. FWIW I have been able to drive a rigged character using a stream of FK (rotational) data from a Phase Space system and it worked fine and very quickly.

So I don’t know, but yeah would sure be nice to have a working rigged character driven by Kinect.

I had a look at the SDK; they’re using quaternions to convert from the Kinect skeleton to the target skeleton.

Things like this

// Kinect = +Y along arm, +X up, +Z forward in body coordinate system
// Avatar = +X along arm, +Y down, +Z forwards
Quaternion kinectRotation = KinectHelper.DecomposeMatRot(tempMat); // XYZ
// transform from Kinect to avatar coordinate system
Quaternion avatarRotation = new Quaternion(kinectRotation.Y, -kinectRotation.X, kinectRotation.Z, kinectRotation.W);
tempMat = Matrix.CreateFromQuaternion(avatarRotation);
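The same component swap is easy to experiment with in Python, outside the SDK. A sketch (component order (x, y, z, w); this mirrors the C# snippet’s axis comments, nothing more):

```python
import math

def kinect_to_avatar(q):
    """Remap a Kinect-space rotation quaternion (x, y, z, w) to the
    avatar space described in the SDK comment: avatar x = Kinect y,
    avatar y = -Kinect x, z and w unchanged."""
    x, y, z, w = q
    return (y, -x, z, w)

# 90 degrees about Kinect +X (its up axis) becomes 90 degrees about
# avatar -Y (since avatar +Y points down)
half = math.radians(90) / 2
q_avatar = kinect_to_avatar((math.sin(half), 0.0, 0.0, math.cos(half)))
```

Note the swap preserves the quaternion’s length, so it stays a valid rotation; only the axes are relabeled.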

Would be cool to have some options on the Kinect CHOP to specify how the joints on the target skeleton are oriented.

It would also be cool to have an aim constraint with more options. The lookat on the Object COMP doesn’t allow you to specify the aim axis.

I know there are other priorities, but rigging in Touch could use some love :wink:

edit: just remembered there are a bunch of built-in matrix functions in Python; this is going to be useful.
derivative.ca/wiki088/index. … trix_Class

I love matrices! Still work in progress but I got the skinned mesh to behave as expected.
I’m using the tracked positions and made my own aim constraint using the lookat Python function, which allows me to specify the aim axis (in the attached screenshot, X along the bones).
kinectdeform.JPG
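For anyone trying to reproduce this, an aim constraint with a selectable aim axis roughly boils down to building an orthonormal frame from the bone direction. A pure-Python sketch (not the TD lookat function itself; the +X aim and +Y up hint are assumptions matching the screenshot):

```python
import math

def aim_matrix(origin, target, up=(0.0, 1.0, 0.0)):
    """Build a 3x3 rotation whose first column (+X) points from
    origin to target. Columns are the bone's local X (aim), Y, Z
    axes expressed in world space."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def norm(v):
        l = math.sqrt(sum(c * c for c in v))
        return [c / l for c in v]
    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]
    x = norm(sub(target, origin))   # aim axis along the bone
    z = norm(cross(x, up))          # side axis, perpendicular to aim and up hint
    y = cross(z, x)                 # rebuilt up, guaranteed orthogonal
    return [[x[0], y[0], z[0]],
            [x[1], y[1], z[1]],
            [x[2], y[2], z[2]]]
```

To aim a different local axis along the bone, you would permute which column receives the bone direction; it degenerates when the bone direction is parallel to the up hint, so a real constraint needs a fallback up vector.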

Damn, very nice! Where are you getting your rigged character? I have been using Mixamo, which I think is fairly standard Biped-based rigging.

Hey Guys!

About to start a project where I will need to rig an imported 3D character with a Kinect skeleton, and would love a push in the right direction from anyone, if possible?

Cheers,

O.

+1 to Oliver’s request.

I’ve started messing around with this, and an example rig would be immensely helpful so I can work through how it’s all set up. Right now I’m a bit confused about how the Kinect data relates to the imported rigs I’ve tried from Houdini and Mixamo. For instance, the nulls created in the rig have rotation inputs as well, but the Kinect CHOP only outputs translation values. Unfortunately, my understanding of how Houdini works is limited/non-existent.

Thanks in advance for any help!

Rotation values from Kinect v2 haven’t been implemented in TD yet, I believe; they are, however, implemented for Kinect v1…

How are you going with your comp L05?

O.

I’ve been looking back into this recently now that the Kinect v2 is outputting rotation values. Honestly, I’ve put a bit of time into this and I’m still quite lost! Part of the issue may be that I don’t really understand how to use rigged skeletons in TouchDesigner, let alone with a Kinect as a control device. I tried importing the basic cartoon rig from Houdini into Touch, but the rigging a) seemed to be at a completely different scale from the model and b) didn’t deform the character at all when I changed its values. Having heard that Touch shares some similarities with Houdini, I thought doing the Houdini overview and character animation tutorials might help me better understand how things work in Touch, but I’m still not sure if and where that knowledge crosses over.

I also tried importing a Mixamo model, and the rigging seemed to import properly and deform the character when changed, but I can’t seem to figure out how to get the values from the Kinect to manipulate the rig in any useful way. Should I be looking at the relative rotation values from the Kinect?

If anyone has an example file showing how to connect the Kinect to a rigged character, that would be immensely helpful (Derivative, maybe?).

We are working on a sample file; hopefully we can get something together. Yes, you’d want to use the relative rotations. Attached are some sample files that show a working skeleton, though not rigged/skinned with a mesh. If you change your pane to a Geometry Viewer you’ll be able to see the skeleton in action. Our plan is to release a .fbx file that you can drag and drop into TD, and it’ll instantly start working with the Kinect’s relative rotation channels.
Maybe these will help you a bit though while we work on it.
Requires builds above 30000 of TouchDesigner 088
skel_rel.toe (7.46 KB)
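As background on what “relative” means here: a child bone’s local rotation is its world rotation with the parent’s world rotation removed, i.e. rel = parentT · child for rotation matrices. A sketch under that convention (pure Python, not the Kinect CHOP’s exact math):

```python
import math

def rot_z(deg):
    """3x3 rotation about Z (degrees)."""
    r = math.radians(deg)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def relative_rotation(parent_world, child_world):
    """Child rotation expressed in the parent's space; for pure
    rotation matrices the inverse is just the transpose."""
    return mat_mul(transpose(parent_world), child_world)

# parent bone at 40 degrees, child at 70 degrees in world space:
# the child's relative (local) rotation is 30 degrees
rel = relative_rotation(rot_z(40), rot_z(70))
```

That is why relative channels can drive a hierarchy directly: each bone only needs its own local value, and the parenting re-accumulates the world pose.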

Thanks Malcolm! That file is helpful in seeing how the Kinect rotation data can be used to build a basic skeleton. How would you suggest changing the length of the “bones” in the skeleton to make it look more proportional? I took a shot at it myself using the bone lengths from the Kinect CHOP and have attached the result, but I’m not sure if that’s the smartest/most efficient approach.
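One simple way to get proportional bone lengths (an assumption about the workflow, not the file above): measure each bone as the distance between its two tracked joints and use that to scale the child bone’s translate. The joint samples below are invented:

```python
import math

def bone_length(j1, j2):
    """Euclidean distance between two tracked joint positions (x, y, z)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(j1, j2)))

# hypothetical shoulder and elbow world positions from the Kinect CHOP
shoulder = (0.2, 1.4, 2.0)
elbow = (0.2, 1.1, 2.0)
upper_arm = bone_length(shoulder, elbow)   # about 0.3
```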

Looking forward to the FBX example! Thanks again to the Derivative team for your help and responsiveness.

Doh! I just looked back and realized I had forgotten to post that example file. :blush:
skel_rel_proportional.toe (9.23 KB)

Hey vinz99 !

Could you give us more clues on how are you getting this? Are you working with a .fbx or a .dae?

Thanks!

V.

Hi,

I asked Mixamo about giving us a Kinect-ready FBX; the answer was it could be possible if there is enough interest…

please leave your interest here:

community.mixamo.com/hc/communi … rientation

Thanks for the link, fHainer. I added a +1.