Where can I find prerecorded mock live data for Kinect v2


I’m working on a project for someone else who has a Kinect V2 (not azure). They have sent me an .xef file exported from kinectstudio, but I’m unable to open that file and stream it into TD because I don’t have the Kinect myself.

I’m looking for a pre-recorded video of someone moving in front of the Kinect V2, and the accompanying CHOP data. I’ve been able to read the chop data before from a .bclip file type.

Is there anywhere I can download pre-recorded TOP and CHOP data to test my project?

If you don't need specific movements, I can record a CHOP file of me moving in front of the Kinect and send it to you, if you want.

Yes please, that would be super appreciated!

If possible, can you send:

  1. A 5-10 second file that can be loaded as CHOP data. (.bclip?)
  2. The accompanying video from the Kinect in any format the Movie File In TOP can read. (.mp4 or .mov?)

My goal is to make sure the joint data from the Kinect is aligned perfectly with the video data of a tracked user in a render network.

Thanks in advance!

Here is a sample.
I hope the locked Kinect CHOP will work to recover the channel names lost in the saved file.
I saved all the channels.
Use a Select CHOP to choose the joint you want to track.
Let me know if it does…
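In case it helps: the Select CHOP picks channels by name wildcard. A plain-Python sketch of the same matching, with illustrative channel names following the Kinect CHOP's usual `p1/<joint>:<channel>` convention (the exact names depend on your setup):

```python
import fnmatch

# Illustrative channel names in the Kinect CHOP's usual p1/<joint>:<chan> style
channels = ['p1/hand_l:tx', 'p1/hand_l:ty', 'p1/hand_l:tz',
            'p1/head:tx', 'p1/head:ty', 'p1/head:tz']

# A Select CHOP with Channel Names set to 'p1/hand_l:*' keeps only these:
picked = [c for c in channels if fnmatch.fnmatch(c, 'p1/hand_l:*')]
```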

This did work, thank you!

I think I've also learned something: for mapping player joint values onto a 2D overlay that matches the camera feed, I should be using the u and v values and not x, y, z?

Do I have the right idea?

Exactly! tx and ty refer to 3D space and, in 2D, do not account for the camera's perspective.
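To make that concrete: the u and v channels are already normalized image coordinates, so mapping them into the camera frame is just a scale. A minimal sketch; the 1920x1080 resolution and whether v needs flipping are assumptions that depend on your setup:

```python
def uv_to_pixels(u, v, width=1920, height=1080):
    # u and v are normalized 0..1 image coordinates from the Kinect CHOP
    x = u * width
    y = (1.0 - v) * height  # flip v only if your image origin is top-left
    return x, y
```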

Is there a way to map the values to a Sphere SOP instead of the Circle TOP, so that the alignment can be preserved but in 3D space?

I think I didn't understand what you mean… the circle is just a 2D placeholder to show where the hand's u and v are pointing in the TOP world. If you want to switch to the 3D world, you have to use tx, ty, and tz, but they are not aligned to the Kinect TOP.

Yes, this is exactly what I mean. Is there any way to align tx, ty, tz with the Kinect TOP?

I don't know exactly what your goal is, but try to use u and v for the alignment and tz for depth.
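Something like this, as a sketch (assuming u and v are 0..1 and you want them in a -1..1 overlay space; the ranges are assumptions to adapt):

```python
def uv_tz_to_overlay(u, v, tz):
    # remap normalized u, v (0..1) into a -1..1 overlay space,
    # keeping tz (meters) untouched as the depth value
    x = u * 2.0 - 1.0
    y = v * 2.0 - 1.0
    return x, y, tz
```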

So I tried this: using u instead of tx and v instead of ty, with the regular tz, still doesn't line up with the infrared camera data.

I'm using a sphere at each point, merging all the SOPs, and plugging them into a standard default Geo, Camera, Render TOP network, compositing the camera feed behind it at the end. But no matter what I tweak in the camera transform, I can't seem to get them to line up.
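For context on why a camera-transform tweak alone can't fix this: the color/IR image is a perspective projection, so a joint's screen position depends on tz via a divide, not on tx and ty alone. A minimal pinhole sketch, with made-up focal scales:

```python
def project(tx, ty, tz, fx=1.0, fy=1.0):
    # Pinhole-camera projection: screen position is divided by depth, so a
    # joint's on-screen spot shifts as the user moves nearer or farther even
    # when tx and ty stay the same. fx and fy are made-up focal scales.
    return fx * tx / tz, fy * ty / tz
```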

I think I could help you better if I saw the patch. Can you please send me just the part involved?

Sorry for the late response!

Here is the base that I'd like to have aligned with the camera. Occasionally it lines up, but as the user moves closer or farther away, the joints drift out of sync. I've tried using the UVs instead, but no success.

KinectAlignmentNetwork.toe (18.9 KB)

As of right now, since I can't get any of the instanced 3D points to align with the render, I'm using only the UV values to create TOPs and SOPs that align perfectly.

The main issue I'm having is that I need to extract the 3D velocity of the joints for things like particle emission. With the UV approach, there is no z data for me to work with, and if I include the tz values, it becomes misaligned with the camera again.
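One way to square this (a sketch, not TD-specific): keep driving the visuals from u/v, and compute velocity separately from the xyz channels with a finite difference, which is what a Slope CHOP does per channel:

```python
import math

def joint_velocity(p_prev, p_curr, dt):
    # finite-difference 3D velocity between two xyz samples dt seconds apart
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def speed(v):
    # scalar speed, e.g. to drive a particle emission rate
    return math.sqrt(sum(c * c for c in v))
```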

Hi, I think there is no way to align the 3D coordinates with the TOP because of the perspective; the only way is the UV Kinect coordinates, which I think were created for exactly this purpose! :smiley:
Since you need the XYZ only to get velocity, you can use them in CHOPs and use the UV for displaying.
Another idea, if you want to work in the TOP world, is to use UV to display the right coordinates and use the z channel (or the XYZ slope) to scale the circle diameter, for example to generate big or small particle sources.
I think there is no need to use XYZ just for displaying.
I hope I helped you in some way.
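That depth-to-size idea could look like this as a sketch; the near/far range in meters and the radii in pixels are made-up values to tune:

```python
def depth_to_radius(tz, near=0.5, far=4.5, r_near=40.0, r_far=8.0):
    # clamp depth (meters) into [near, far] and linearly remap it to a
    # circle radius in pixels: closer user -> bigger circle
    t = max(0.0, min(1.0, (tz - near) / (far - near)))
    return r_near + t * (r_far - r_near)
```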
