I just finished a first version of a shared memory app that makes the depth data of a Kinect available in TouchDesigner. Not really tested, so I hope it works on your machines as well.
Instructions here achimkern.de/touchdesigner/kinec … hdesigner/
Yeah, commercial only. Sorry Jesse, nothing I can do. Weren’t you supposed to get one?
Anyway, it’d be great if whoever tests this could report the cook time of the sharedmem TOP (both in regular and in perform mode, along with some info about their hardware). I get ~0.3 ms on my system, which is excellent, but James had a cook time of 3 ms…
Hi,
I couldn’t get it going on Win7/64. The SensorKinect installer wants a 32-bit build. I emailed the guy who wrote it, but I suspect he’s getting more emails than he can handle.
Great Stuff! Thanks for sharing : )
I got it all installed after removing other Kinect drivers and ignoring Windows’ attempts to install Xbox NUI drivers. I had to run the installer a couple of times and used the directions in the discussion thread you forwarded. I could tell it was going to work once I saw PrimeSense drivers for Camera and Motor, and nothing else Kinect-related, in my Device Manager.
This was all on Windows 7/64 bit.
As for performance, the cook time reports 0.340 ms.
My BOXX is pretty fast. The first thing I did was write a very simple GLSL vertex shader to displace vertices on any geometry based on the Kinect depth, using a texture2D lookup and feeding in the output of your component. I was able to reproduce your ‘pin table’ effect, which is tons of fun to play with.
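In case it’s useful to anyone, here’s a rough sketch of that kind of displacement shader. This is legacy-style GLSL; the uniform names (sDepth, uDisplaceScale) are made up, and it assumes the depth TOP from the component is bound as the first texture input and the geometry has usable UVs:

```glsl
// Minimal vertex-displacement sketch (hypothetical uniform names).
uniform sampler2D sDepth;      // depth image from the Kinect bridge TOP
uniform float uDisplaceScale;  // displacement amount, e.g. 1.0

void main()
{
    // Sample the depth image at this vertex's texture coordinate.
    float depth = texture2D(sDepth, gl_MultiTexCoord0.st).r;

    // Push the vertex along its normal, scaled by the sampled depth.
    vec4 displaced = gl_Vertex + vec4(gl_Normal * depth * uDisplaceScale, 0.0);

    gl_Position = gl_ModelViewProjectionMatrix * displaced;
    gl_FrontColor = gl_Color;
}
```

Note that texture lookups in a vertex shader need vertex-texture-fetch support, and a densely subdivided grid gives the best ‘pin table’ look.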
In the current code you’ve made available to us (thanks again! it’s awesome), I’m able to process the Kinect depth image to my liking in Touch. I’m also looking for a way to get at the RGB camera image as well as the Kinect’s accelerometer data, and a way to control the motor on the camera. Any ideas on how to get that all integrated into one C++ Kinect/Touch bridge?
Once that’s all in place I am hoping the Touch community can start playing around and sharing projects. I’m interested in:
people detection using 2 kinects
gesture recognition (hand pointing)
skeleton re-build (as in Theo Watson’s puppet example using oF)
Thanks again to you and Malcolm and Derivative for opening this up.
Also, it would be great to start a discussion on performance tricks/tips when processing this kind of data in realtime. Things slowed down quite a bit when I was working with the data in CHOPs and trying to remap to SOPs. The only decent performance I got was when I routed stuff directly into a GLSL MAT and did everything on the GPU (sensible), but I’m looking to take advantage of all the cool Touch operators.
I checked again and I am getting similar results now. Maybe I had something running in the background before. Still seeing spikes up to 1 ms, but that’s not a big deal. Also, the network inside the component is very far in the negative Y direction, which makes the ‘f’ key not work.
Glad you guys get fast cook times. And sorry for the Y offset in the network; it was caused by constantly cutting and pasting the network to reset shared memory during testing.
As far as I know, OpenNI has no support for the accelerometer and the motor, as these are not in the PrimeSense reference design. You’d have to use either CL NUI or OpenKinect if you need those. I’ll stick with OpenNI, but if I find some time I’ll try to make a version that also gives you the RGB signal (which is pretty bad quality).
Not sure I’m gonna use the skeleton tracking that comes with OpenNI, as it requires you to hold a calibration pose for about 3 seconds, which imho is unacceptable for any installation/exhibition work …
Yes, you basically have to stay on the GPU all the time. You can do a GLSL depth key and then feed that to the blob tracker. This way you only have to download a small amount of data from the GPU…
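For anyone who wants to try it, here’s a minimal sketch of such a depth key as a GLSL pixel shader. The uniform names and the texture-coordinate plumbing are assumptions, and the thresholds are just examples; the idea is simply to output a clean black/white mask for the blob tracker:

```glsl
// Minimal depth-key sketch (hypothetical uniform names).
// Outputs white where the depth lies inside [uNear, uFar], black elsewhere,
// so only a small monochrome mask ever needs to leave the GPU.
uniform sampler2D sDepth;  // Kinect depth image
uniform float uNear;       // lower depth bound, e.g. 0.2
uniform float uFar;        // upper depth bound, e.g. 0.6

void main()
{
    float depth = texture2D(sDepth, gl_TexCoord[0].st).r;
    float key = (depth >= uNear && depth <= uFar) ? 1.0 : 0.0;
    gl_FragColor = vec4(vec3(key), 1.0);
}
```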
Next step would be to create some CHOP networks in Touch to set thresholds for gesture event detection. For example, a “JUMP” action would fire when both feet’s Y-coordinates in the skeleton rise above a set threshold, like 5 inches.
Do you have an example of the shared memory breaking down? Using the latest classes it should automatically re-create the segment and tell the other process where to look for it.
Were you deleting and recreating the UT_SharedMem class, or calling resize() on it?
Hello there : ) We are a bunch of new Touch users from Sweden, and we are going TD all the way.
Anyway, great to see the Kinect getting into Touch. I still can’t get the tox Achim shared to run though.
The Kinect is working and the bridge is running, but I can’t seem to get any data inside TouchDesigner.
And did you install the latest stable or unstable OpenNI (you need stable)?
And finally, Touch doesn’t support shared memory in the free edition, so if you didn’t buy a license, my bridge unfortunately won’t work for you.
btw, on the left side of TouchDesigner, check the /browser/video/filter/PushPins
The OpenNI samples are working, but it’s the experimental build : ) so that might be it.
We got a Pro license, and I know about the PushPins; I was curious about the data between the Kinect and the pins. We are kinda new with Touch, but I have been looking into it for a couple of years and never made the step : P Finally we decided to go with Touch instead of vvvv or Max/MSP.
I looked at the skeleton example in vvvv somebody made and posted to their website, and it looks as though it uses an OSC bridge (NI Tracker Bridge) with the joint rotations ready to be mapped. I was just going to use the bridge part and then bring it into Touch if possible (haven’t really tried yet).
I also started looking at Shinect, which I think may have just been released; it allows the Kinect to be used as a MIDI controller, so hopefully that’d be an easy option for any simple gesture-based stuff. I might message the guy who made it on Twitter and play around.
Has anyone made any more progress with a bridge to Touch? I can’t get the Kinect working with Touch either, even with Achim’s tox. An application that sends OSC to Touch would be awesome; unfortunately I’m not a coder and have little time to dive into it in depth.
There is OSCeleton for getting the OpenNI skeleton into Touch.
You don’t need the stable Touch version; experimental is fine. But you might need the stable version of OpenNI along with the matching stable version of the avin2 driver. I’ll have a look at supporting the unstable OpenNI build when I’m back from transmediale.
I’ve got unstable OpenNI working with OSCeleton coming into Touch. It seems you can’t mix the stable and unstable versions together in terms of NITE etc., or it all breaks down.
To get the motor working, you can use the motor driver from Code Laboratories while keeping the Kinect sensor driver the same. Of course it isn’t all in one nice, easy bridge, but as far as I’m aware the OpenNI drivers don’t support the motor or the audio yet?