As a working fine artist and a VJ, I’ve very much grown to like composing and performing mashups of video art, my own fine art, and clips from movies and internet video in Resolume, playing a MIDI controller (in my case, an APC40) like an instrument for violently hallucinatory visuals, and sometimes even playing an electric bass hooked to a MIDI controller.
But I feel I’ve reached the limit of what Resolume can do. I was turned on to TouchDesigner by a friend who works for a company similar to Leviathan, and I’d like to explore using it to create full environmental projections driven by my live video and audio performances with my partner, maybe even linking it with the Kinect, to take my audience completely inside my work. Between the two of us we have architecture, traditional/stop motion/3D animation, live video performance, electronic music production, and successful fine art careers, and using software like this to link it all together is very exciting. So forgive me if I sound scattered.
I was wondering if there is any way to harness the video output of Resolume and link it into TouchDesigner so that I can expand the possibilities of what can be done with my performances. Ideally, the flow would go something like Serato/Ableton + Resolume + Kinect -> TouchDesigner -> MadMapper or a similar PC option. I’m not tied down to a platform; my partner’s on a Mac and I’m on a PC.
Thanks guys, and I can’t wait to start being more active here. What is being done with this software is so damn amazing. I’ve waited for what feels like an eternity to be able to bring all of these disparate creative endeavors together into a single cohesive audiovisual experience.
I’m not sure what kind of custom SDK options Resolume has. If it has a way for you to write a custom C++ plugin, you could send the data over to TouchDesigner using the Shared Mem In TOP. If not, then you’ll need to output Resolume’s video through the video card and loop it back into TouchDesigner using the Video In TOP and a DVI capture card.
The question I have is: why not just use TouchDesigner for it all? It can be driven by Ableton. The video playback and processing engine is extremely powerful. There are all kinds of examples and help on using the Kinect inside TouchDesigner. For mapping you can use the Kantan Mapper component we have posted on the forums, which is comparable to MadMapper in many ways. You’ll also get better performance than running two GPU-accelerated applications (Resolume + TouchDesigner) on one computer at the same time.
Thanks for the quick response and for helping me think out a pipeline for this new process.
In terms of looping it back in on itself, that sounds irritating, but it could be done, I suppose. Resolume has a developers section on its forum, and while I’m unsure what sort of SDK they have available, I do see a lot of people talking about that sort of thing there. I was drawn to TouchDesigner because it looked like a more visually based approach, like the node hierarchies I’m used to in Maya and, to a lesser extent, Houdini. But I’m not much for programming; the organized thinking hurts my head. I’m much more of a painter.
Care to take a look? Does any of that look promising? resolume.com/forum/viewtopic.php?f=14&t=6568
It says you can write something using the FreeFrame plugin standard, so that might be a solution. Resolume can also stream via something called vidnet and be controlled over a LAN. Maybe it can also stream over VLC?
I could definitely use Ableton and ditch Resolume, since I’m already using an APC, but that’s kind of ass-backwards compared to how I use the APC and Resolume. I play the APC like it’s an instrument, controlling my clips to the music with all of the knobs and sliders and buttons, sometimes so fast that they short or break. What I end up with is very different from the slow-moving visuals I see a lot of VJs creating with other software packages, and the interface of Ableton doesn’t lend itself to the style or workflow I’ve developed. If you check out my work, you’ll notice every change in the video or effects is controlled by hand. None of it is music-driven or automated in any way.
I don’t know how to use TouchDesigner for all of it, but I’d like to learn. I also have the added bonus of working with a partner who is physically disabled, and there are certain parts of the performance that it would be awesome for him to be able to control with a Kinect. We work and perform together, and we were thinking that he could control the feed from my Resolume and Serato performances using the Kinect and TouchDesigner. So having the extra step isn’t necessarily a bad thing.
I don’t know if vidnet would work; I get the feeling it’s a Resolume-specific video stream format (uncompressed).
I’m pretty sure a FreeFrame plugin could be written that takes the video buffer from Resolume and writes it into a shared memory space that Touch can read with the Shared Mem In TOP (FYI, you need FTE Commercial to use this TOP).
You could likely find a programmer online to do this work; it wouldn’t be too difficult for someone familiar with working with images in C++.
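To give a rough idea of the scope of that work, here is a minimal sketch of the writer side only: copying one RGBA frame into a named shared memory block that a Shared Mem In TOP could be pointed at. It deliberately leaves out the FreeFrame entry points and the exact header/synchronization layout the Shared Mem In TOP expects (that part is documented in the TouchDesigner SDK); the block name and function names here are placeholders for illustration, not anything from either SDK.

[code]
// Sketch of the writer side: copy one RGBA frame into a named
// shared-memory block (Windows API). The name passed to openSharedBlock()
// must match whatever the Shared Mem In TOP is set to read.
#include <windows.h>
#include <cstdint>
#include <cstring>

static HANDLE   gMapping = nullptr;
static uint8_t* gShared  = nullptr;

// Create (or open) a named shared-memory block of the given size.
bool openSharedBlock(const char* name, size_t bytes)
{
    gMapping = CreateFileMappingA(INVALID_HANDLE_VALUE, nullptr,
                                  PAGE_READWRITE, 0, (DWORD)bytes, name);
    if (!gMapping)
        return false;
    gShared = (uint8_t*)MapViewOfFile(gMapping, FILE_MAP_ALL_ACCESS, 0, 0, bytes);
    return gShared != nullptr;
}

// Called from the plugin's per-frame hook with Resolume's RGBA buffer.
void writeFrame(const uint8_t* rgba, int width, int height)
{
    if (!gShared)
        return;
    // A real plugin writes the image header TouchDesigner expects first,
    // then the pixels; here we only copy the pixel data.
    std::memcpy(gShared, rgba, (size_t)width * height * 4);
}

// Release the mapping when the plugin is unloaded.
void closeSharedBlock()
{
    if (gShared)  UnmapViewOfFile(gShared);
    if (gMapping) CloseHandle(gMapping);
    gShared  = nullptr;
    gMapping = nullptr;
}
[/code]

A real plugin would also guard the copy with a mutex so the reading TOP never sees a half-written frame, but the overall job really is just "get the buffer, copy it across".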
You know, I got to thinking about how our performances would actually work. Since there are two of us and we’ll be using two computers live, the DVI feed might not be such a bad idea to start out with. My laptop running Resolume and Serato could feed his laptop running TouchDesigner, which drives the projectors. I could use the HDMI and DVI outputs of my laptop to drive a large monitor and a Blackmagic Design Intensity Shuttle with USB 3.0, feed that into an identical laptop, use the Video In TOP on his machine, and we have liftoff. From there, he can use any number of available tools to drive the projections. Would that work?
I just wanted to add that the APC40 can be used as an input device in TouchDesigner as well. TouchDesigner has full MIDI support, as well as other protocols like OSC, so you can use your iPhone, iPad (or Android, etc.) devices as inputs too. That said, I understand the investment of time and effort you have put into your Resolume work; making a hybrid pipeline that takes that work into TouchDesigner and lets you do even more with it is probably a good place to start.
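If it helps while experimenting, here’s a small stand-alone monitor (separate from TouchDesigner itself) that prints the raw MIDI messages the APC40 sends, so you can see which notes and CCs each knob and button produces before mapping them. It uses the open-source RtMidi library and assumes the APC40 is the first MIDI input port; both are illustrative choices, not requirements.

[code]
// Quick console MIDI monitor: print every message from the first MIDI port.
// Assumes the RtMidi library is available and the APC40 is on port 0.
#include <iostream>
#include <vector>
#include "RtMidi.h"

// Callback invoked by RtMidi for each incoming message (status + data bytes).
void onMidi(double /*deltaTime*/, std::vector<unsigned char>* message, void* /*userData*/)
{
    for (unsigned char byte : *message)
        std::cout << (int)byte << " ";
    std::cout << std::endl;
}

int main()
{
    try {
        RtMidiIn midiIn;
        if (midiIn.getPortCount() == 0) {
            std::cout << "No MIDI inputs found." << std::endl;
            return 0;
        }
        midiIn.openPort(0);             // assume the APC40 is the first port
        midiIn.setCallback(&onMidi);
        std::cout << "Listening on: " << midiIn.getPortName(0) << std::endl;
        std::cin.get();                 // run until Enter is pressed
    } catch (RtMidiError& err) {
        err.printMessage();
        return 1;
    }
    return 0;
}
[/code]

Inside TouchDesigner the same information shows up directly in the MIDI In CHOP, so this is just a quick way to sanity-check the hardware from outside.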