Just wanted to share a live performance video I recently released for my project ARIADNE, showcasing the live visuals I built for our performance at Currents New Media Festival 2016.
As a challenge to myself to get as comfortable as possible in TouchDesigner, I decided to do 100% of the video production in TD. The “screen” behind us shows an audio-synced montage of the 5 scenes of live visuals I built for our performance, which were rendered and exported in real time. The geometry of us performing was captured with a Kinect v2 in TouchDesigner. I then created the environment you see in the video, integrating the synced Kinect footage and live visuals footage, which was rendered, again in TouchDesigner (though not in real time).
I only started learning TD this January, when our proposal (to create and perform with these visuals) was accepted, and I had absolutely no experience in either real time rendering or programming (outside of some very basic Ableton hacking with Max/MSP). To say I was in over my head feels like a massive understatement. A huge thank you to:
-Patrick Lechner “hrtlacek”, whose book “Multimedia Programming Using Max/MSP and TouchDesigner” was an invaluable resource https://www.packtpub.com/hardware-and-creative/multimedia-programming-using-maxmsp-and-touchdesigner
-Elburz Sorkhabi “elburz”, whose book “Introduction to TouchDesigner” was also an incredible resource http://book.nvoid.com/
-Matthew Ragan “raganmd”, whose blog, videos, lessons, .toe/.tox examples, and incredible support on the forum and Facebook group were absolutely essential to my learning. Matthew’s passion for helping and teaching others is truly inspiring. https://matthewragan.com/
-Anyone who has shared projects or examples on the forum, FB group, or GitHub. Chances are if you’ve posted something there, I’ve downloaded it and learned something from it, so thank you!
Not sure if there’s any interest, but I’d like to attempt to give back to the community at some point by releasing a few of the things I built while making this, mainly the 16-bit Kinect depth/color capturing and the .obj sequence player. As an artist with no dev background who built these things in an intense rush, these tools are probably built in the worst possible way. They are super messy and need some serious cleaning, commenting, and optimization. It’s probably in my best interest to do this myself so I can learn, but we’re releasing an album and going on a US tour in October, so it’s just not going to happen any time soon. If anyone wants to help me or provide some guidance to get these tools into an acceptable form for other humans, please hit me up. Otherwise I’ll hopefully be able to clean them up and release them sometime in 2017.
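For anyone curious about the 16-bit depth capture problem before I get those tools cleaned up: one common workaround (and this is just a hedged sketch of the general idea, not my actual TD network) is to split each 16-bit depth value into a high byte and a low byte and carry them in two 8-bit channels, since most video/image pipelines only preserve 8 bits per channel. In NumPy terms, outside of TouchDesigner, that round trip looks something like this (the function names and the 424x512 Kinect v2 resolution here are just illustrative):

```python
import numpy as np

def encode_depth16(depth):
    """Split a uint16 depth frame into two uint8 planes (high byte, low byte)
    so it can survive an 8-bit-per-channel video or image pipeline."""
    high = (depth >> 8).astype(np.uint8)    # most significant byte
    low = (depth & 0xFF).astype(np.uint8)   # least significant byte
    return high, low

def decode_depth16(high, low):
    """Recombine the two uint8 planes into the original uint16 depth frame."""
    return (high.astype(np.uint16) << 8) | low.astype(np.uint16)

# Round-trip a fake 424x512 Kinect v2 depth frame (values in millimeters).
depth = np.random.randint(0, 8000, size=(424, 512), dtype=np.uint16)
high, low = encode_depth16(depth)
restored = decode_depth16(high, low)
assert np.array_equal(depth, restored)  # lossless round trip
```

The catch is that any lossy compression on those channels will corrupt the high byte badly, so this only works with a lossless or near-lossless codec.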