live mocap stream into Touch

Does anyone have any info or production-setup tips for streaming live mocap data into Touch in the most optimal way possible?

We can code in C++ for the long term, but we could also use a quick temporary setup as a proof of concept with any existing OPs or script tricks.

We are setting up a PhaseSpace system, normally configured to stream live data to MotionBuilder (which we don't have).

thanks all!

Hi David,

I’ve done that using the Pipe In CHOP; we found good documentation in the /touch/docs/pipe folder of the Touch install directory. We pretty much just copied the source code examples into the program generating the positional data, set up the ports, and it worked; it took about half an hour. This was with a custom program running a depth-sensing camera (like Natal), so I don’t know what the process would be like using PhaseSpace.
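As an illustration of the sender side, here is a minimal Python sketch that packs one frame of channel values as little-endian 32-bit floats for a socket write. The exact header and packet layout the Pipe In CHOP expects are the ones documented in the /touch/docs/pipe folder mentioned above, not shown here; the host and port are placeholders.

```python
import socket
import struct

def pack_frame(values):
    """Pack one frame of channel samples as little-endian 32-bit floats.

    NOTE: this is only the payload portion. The Pipe In CHOP expects a
    header whose exact layout is described in /touch/docs/pipe; consult
    those docs and prepend the real header before sending.
    """
    return struct.pack("<%df" % len(values), *values)

def send_frames(host, port, frames):
    """Open a TCP connection (hypothetical endpoint) and stream frames."""
    with socket.create_connection((host, port)) as sock:
        for frame in frames:
            sock.sendall(pack_frame(frame))

# Example: one frame of 6 channel values (e.g. xyz for two markers)
payload = pack_frame([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
```

The packing function is separate from the socket code so it can be dropped into whatever program is already generating the positional data.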

Jeff

You can prototype it if you can send the mocap data via TCP/IP to the TCPIP In DAT.

Format it as one ASCII message per joint, body, angle… it’s up to you. Mocap data is typically around 100 floats.
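A minimal sketch of that sender-side formatting, assuming a hypothetical "name x y z" message per joint (the post leaves the exact schema up to you):

```python
def format_joint(name, x, y, z):
    """Format one joint's position as a single ASCII line.

    The "name x y z" schema is an assumption, just one way to lay out
    one message per joint.
    """
    return "%s %.6f %.6f %.6f\n" % (name, x, y, z)

def format_frame(joints):
    """joints: dict of joint name -> (x, y, z). Emits one message per joint."""
    return "".join(format_joint(n, *p) for n, p in sorted(joints.items()))

# Example: two joints of one frame, ready to write to the TCP socket
msg = format_frame({"head": (0.0, 1.7, 0.1), "hand_l": (-0.4, 1.2, 0.3)})
```

Newline-terminated messages keep the parsing on the Touch side trivial: one received line, one table row.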

Parse it with a script, perhaps similar to the way the Multi Touch In DAT example in the wiki works. A script gets called for every message, and from it you fill a table containing the mocap data in TouchDesigner. You can then convert that table to CHOPs (DAT to CHOP) or export directly from the DAT (search for “DAT Export” in the wiki).
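The per-message parsing step can be sketched in plain Python like this, with a dict standing in for the Touch table. It assumes a hypothetical "name x y z" message schema; inside Touch, the same logic would live in the DAT's per-message callback and write rows into a table DAT instead.

```python
def parse_message(message, table):
    """Parse one ASCII message of the hypothetical form "name x y z"
    and store the position in table (a dict standing in for a table DAT).
    Returns False for malformed messages instead of raising.
    """
    parts = message.split()
    if len(parts) != 4:
        return False  # ignore malformed messages
    name = parts[0]
    table[name] = tuple(float(v) for v in parts[1:])
    return True

# Simulate a stream of per-joint messages arriving one at a time
table = {}
for msg in ["head 0.0 1.7 0.1", "hand_l -0.4 1.2 0.3"]:
    parse_message(msg, table)
# table now maps joint -> (x, y, z), the data you would push into a
# table DAT and then convert with DAT to CHOP or export from the DAT
```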