TouchDesigner, Max MSP, Ableton, Houdini, Unreal Engine/Unity

Hi there!

I’m a musician and I mostly use Ableton + hardware + Max MSP. I’ve been practicing TouchDesigner for about a year now and I’ve used it for audio as well as MIDI/OSC-reactive visuals for my performances. The control data that I send to TouchDesigner usually comes from my Leap Motion, various MIDI controllers, sequencers and Max, as well as Ableton. I see how useful TouchDesigner is for receiving, processing and sending data, but I find it very limiting in terms of real-time 3D graphics output. I want to create interactive installations and performances, but in simple terms I want them to look better than what TouchDesigner SOPs offer. Mainly I want my installations to feature interactive, high-detail fluid/smoke/collision/particle simulations and 3D environments as well as characters. What additional software comes to mind for this kind of work?

After my initial research, I imagine that a combination of Max MSP, TouchDesigner, Houdini and Unreal Engine should let me achieve the desired result? Do these tools complement each other nicely? I saw a couple of performances where people used TouchDesigner to process data and Unity/Unreal Engine to drive the visuals, and they looked absolutely gorgeous, something that I don’t think TouchDesigner alone would be able to generate. Am I missing something? Maybe there are other tools I’m not aware of?

I chose Houdini since, as far as I’m aware, it kicks ass at simulations, is node-based, and has a workflow somewhat similar to TouchDesigner’s. Am I right to think that it could be integrated with Unreal Engine to manipulate these simulations in real time?

I chose Unreal Engine because I read somewhere that it is easier for beginners and also easier to integrate than Unity? Also, it is free and does not impose watermarks or restrictions the way Unity does.

I should clarify that I have no experience coding in C++ or Python.

One of my biggest worries is that I’m trying to bite off more than I can chew. Would it be feasible for me to learn all of this from scratch as a single artist? How else could I achieve the functionality and look I’m after?

Thanks!

EDIT: Additional question.

One of my future projects would be an audio-visual album app for Android and iOS where the music and visuals would not simply be played back but also affected by gyroscope and touch data. Would it be possible to create something like this using the software mentioned above, or is it an entirely different beast?

A lot of the visual features you mention exist in TouchDesigner

https://youtu.be/NbZ2YDwuxAQ?t=7118 for a demo of character animations
https://docs.derivative.ca/Nvidia_Flow_TOP for fluids, smoke, clouds, etc.

There are now two sophisticated systems for collisions:
https://docs.derivative.ca/Flex and https://docs.derivative.ca/Bullet_Dynamics

Anything more advanced than those things would, I suspect, still be hard in the other software too: Unity, Houdini, Unreal. Something like Houdini is great for VOPs, advanced SOP simulations, and the non-realtime stuff that Unity/Unreal can’t do. You didn’t mention vvvv, but I think TouchDesigner beats it in most cases.

One of my future projects would be an audio-visual album app for Android and iOS users where music and visuals would not only be simply played back but also affected based on gyroscope and touch data.

That’s an ok reason to use something like Unity, because TouchDesigner has limited distribution options. You can’t put TouchDesigner on an iPhone/Android device (Unity would be fine there), but you can put TouchDesigner on a Windows tablet PC. Another option would be native mobile app development, like Swift for iOS.


Thank you for the reply!

Yes, I am aware of Flex, Flow and Bullet Dynamics. I have tried them multiple times, but they seem visually lacking compared to Houdini. I do of course realise that Houdini is non-realtime; that’s why I’m wondering whether it’s possible to combine Houdini with Unreal/Unity to achieve that high-quality, polished look in real time, controlled by input data like Leap Motion, MIDI, OSC and Kinect?

Also, would I benefit more from learning Unity or Unreal Engine? If I were to learn C++ and Python, wouldn’t that be more beneficial for Unreal in the long run, since I could use C++ for both Unreal and TouchDesigner?

Houdini can export plenty of things TouchDesigner can use. The founding of Derivative is tied to the history of SideFX, which makes Houdini, so a lot of the Houdini / Unreal / Unity synergy can happen with TouchDesigner too.

Unity/Unreal offer some hardware integration support: Leap Motion, MIDI, OSC, etc. It’s kind of a matter of preference. Personally I think TouchDesigner excels at hardware/protocol support, whereas my co-workers have to use unofficial OSC plugins for Unity. In my experience, community-made Unity tools have more bugs than built-in TouchDesigner nodes.
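To give a concrete sense of the TD side: OSC arrives through an OSC In CHOP/DAT with no plugins at all, and if you want to script against it, the OSC In DAT’s callbacks look roughly like this (a minimal sketch; the operator names and the address are placeholders, not anything standard):

```python
# Callbacks DAT attached to an OSC In DAT ('oscin1'); names are placeholders.

def onReceiveOSC(dat, rowIndex, message, bytes, timeStamp, address, args, peer):
    # e.g. a controller or Max patch sending /hand/height with a 0..1 float
    if address == '/hand/height' and args:
        op('level1').par.opacity = float(args[0])  # drive a Level TOP
    return
```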

I don’t use Unity or Unreal, so I can’t give a good opinion on which to use.

I think Python is a great language to master. C++ is ok, but only necessary in advanced circumstances. C# is too specialized, like CoffeeScript is to JavaScript, but the basics aren’t hard to learn.


Thank you David, you’re literally making my anxiety go away haha! By the way, I checked out your work on Vimeo and it’s stunning! Great job!

To further propel the discussion and clarify what I have in mind for my installations and performances, I’ll provide some examples below:

These are just some basic examples of what I’m looking for, not the aesthetic per se but rather the technical possibility of including high-end graphics and simulations. However, instead of them being pre-rendered, I want to trigger and manipulate them using MIDI and OSC inputs, the same way one would play a game with a keyboard + mouse or a controller. Additionally, I want to be able to projection-map scenes like this onto different surfaces, and in the future incorporate lasers as well.

As far as lasers and projection mapping go, I’ve seen it done and know that TouchDesigner is more than capable, but in terms of graphics quality I doubt I would be able to achieve something like this in TouchDesigner. That’s why VJ’ing on a game engine came to mind: Unreal Engine in combination with Houdini would seem more than capable of doing this in real time? I’m curious why more people are not doing this sort of work, particularly VJ’ing with game engines. I guess my biggest question is how well all these tools integrate, particularly running real-time data from TouchDesigner to Unreal.

Any ideas?


Hello, just my little experience. Before using TD, I made some video work for dance using Unity (for 3D, mapping and real-time texturing) alongside Isadora (for media management and the final multiscreen output). Yes, the real-time 3D renderer in Unity is awesome (as Unreal’s probably is, but I never tried it). But the programming side of Unity really suits games, where a big team spends a lot of time polishing a perfect system within the economics of game production. I can’t speak to VJing, but within the budgets and schedules of dance/theater it’s really not workable. Planning the workflow, writing C#, finding good libraries for OSC, MIDI, Spout or NDI, compiling, testing: that cannot work in our landscape. I keep Unity ready for some very specialized or polished work, but I haven’t used it in two years.
At the moment I prefer to prepare procedural assets in Houdini, export them as Alembic with a timeline, and render them with PBR in TD. The quality is quite good, I can quickly go back into Houdini to make a change and re-export, and I keep the great, reactive TD interface, with the possibility to follow the process at any node. For sure, a Houdini engine inside TD would be awesome!
But I would be interested to hear about your workflow if you get it working.


Good to see you again Jacques! Thank you for your reply! Could you expand a bit on exporting assets from Houdini and using them in TD? These terms are still quite new to me. Does this mean you would export a baked animation or VFX from Houdini into TD and render/play it back there in real time? One of the reasons I am tempted by Unreal is the node-based system; as I understand it, it offers a much quicker workflow for someone who is a one-person team and has no coding experience. I hear that Unity has now implemented a similar node-based interface as well!

Just my 2 cents :)
The two videos you posted are easily doable in TD. With the PBR workflow you can create really great results (maybe even better-looking than the videos). A lot of this depends on your resources (textures and maps; as high resolution as possible!).
I highly suggest checking out this talk by the Gerritsons:

Really great minds, and they create really great renders in Touch!


I export Alembic from Houdini (I followed a tutorial explaining how to do it for Blender), for example an object with a timeline; the tutorial used a skull and a box with a boolean between them. So the meshes are precalculated in Houdini and baked into the Alembic file. In TD you can move through the timeline and change materials, lights, cameras, etc.
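From memory, the TD side is roughly like this. A sketch only, under stated assumptions: ‘alembic1’ is an Alembic SOP loading the cache, and the parameter names are illustrative, so check the Alembic SOP’s parameter page in your build:

```python
# A sketch only: 'alembic1' is an Alembic SOP loading the baked cache,
# and the parameter names below are illustrative, not guaranteed.
abc = op('alembic1')
abc.par.file = 'export/skull_boolean.abc'

# Scrub the baked animation from a MIDI fader (0..1) instead of
# letting it follow the global timeline:
fader = op('midiin1')[0].eval()
abc.par.time = fader * 10.0  # assumed parameter: seconds into the cache
```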
I am not really on TD at the moment, so I cannot post the project, but I will try later.
Concerning Unity and Unreal, to my knowledge there is no real node network, not like in TD, Nuke or Fusion. At some point you have to write code and compile, but for the kind of render you want, perhaps it is the right solution.


It is possible to combine TD and UE4 via OSC :)

https://youtu.be/23R8YnZTJ6c
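If you want to script the sending side yourself, a minimal python-osc sketch looks like this. The IP, port and the /sim/… addresses are just example values; match them to whatever UE4’s OSC plugin is configured to listen for. (Inside TD itself you would normally just use an OSC Out CHOP/DAT, but the messages are the same.)

```python
from pythonosc.udp_client import SimpleUDPClient

# IP of the UE4 machine and the port its OSC server listens on (examples)
client = SimpleUDPClient('192.168.1.20', 8000)

# Example control messages; bind these addresses to Blueprint events in UE4
client.send_message('/sim/emission', 0.8)         # fader -> smoke emission rate
client.send_message('/sim/trigger', 1)            # pad hit -> spawn a burst
client.send_message('/camera/orbit', [0.3, 0.7])  # XY pad -> camera orbit
```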

For simulations there is a nice new tool called EmberGen.
https://jangafx.com/software/embergen/
It’s in beta atm, and you can use it free with no restrictions for two weeks. It’s more “casual” than Houdini, and you can export VDB files with it.

I would also have a look at Blender; it’s by far the most comprehensive free 3D tool out there.
For rigging characters, the FBX-to-TouchDesigner workflow works really well,
and for simulations the new Mantaflow sim engine is very powerful and uses VDB for its cache files as well. I posted a tutorial recently for this kind of workflow, here.
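The export step on the Blender side can even be scripted. A minimal bpy sketch; the path and options are just example values, not a prescribed setup:

```python
import bpy

# Export the selected rigged character, with baked animation, as FBX
# for import into TouchDesigner (filepath and options are examples).
bpy.ops.export_scene.fbx(
    filepath='/tmp/character_rig.fbx',
    use_selection=True,    # only export the selected objects
    bake_anim=True,        # bake the animation into the FBX
    add_leaf_bones=False,  # keep the exported skeleton clean
)
```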

In regard to “playing live”, or how to create interaction with the simulation environment, I can recommend MIDI controllers with built-in sequencers. This or this or this can do the trick, for example (see the sketch after these links).
Another tutorial: link
And a video I have made: link
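On the TD side, a sequencer step or pad hit arriving over MIDI can pulse a simulation event. A minimal sketch using a CHOP Execute DAT watching a Midi In CHOP; ‘sim_controller’ and its custom Trigger pulse parameter are hypothetical names for whatever component runs your sim:

```python
# CHOP Execute DAT watching a Midi In CHOP ('midiin1').
# 'sim_controller' with a custom Trigger pulse parameter is hypothetical.

def onOffToOn(channel, sampleIndex, val, prev):
    # A sequencer step / pad press arrives as a channel going 0 -> 1
    op('sim_controller').par.Trigger.pulse()
    return
```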

I would love to have a proper OSC controller that can do the same, but tbh, apart from software solutions for Android and iOS, the format is not what it was promised to be and is kinda dead. The industry just never adopted it, sadly. I saw a used Lemur on sale for 200 USD recently and there were no takers.
