So you mean that in your testing, the data generated by a TouchEngine instance would be passed to the “master”, and then from the Unreal master to other nodes of the cluster, and those nodes were all in sync?
On my nDisplay master computer I have TouchDesigner and Unreal running.
I have 4 Kinects connected to that computer.
I merge the Kinect feeds in TouchDesigner into one big image, do some processing on it, then send it with Touch Out -> Touch In to the nDisplay master that is running on the same computer and to two nDisplay slaves that are running on two other computers.
In Unreal I use a .tox loaded with the TouchEngine plugin that gets the image via Touch In and passes it to a Niagara system to drive some particles.
I will present it during the Ars Electronica Festival in the Deep Space projection room.
They use two computers for the rendering: one for the wall and one for the floor.
If anybody is interested in how they use Unreal and nDisplay, they provide a template that is available to everybody on GitHub.
It works when I do it in TouchDesigner, or when I load my .tox with TouchEngine inside TouchDesigner, but not when I load my .tox with the Unreal plugin.
In the samples project, you see on the left a case where the sample’s Engine Perform TOP is unmodified, while in the second case, on the right, it is updated so that the texture being sent out is 16-bit RGB rather than RGBA.
In that image you can see my setup running locally on my machine.
I have 3 nDisplay instances running; they receive an image from TouchDesigner via the UE plugin and use it to drive a particle system.
I also tried running that setup on multiple machines (one per nDisplay instance) and it works as well.
Thanks a lot for your help.
Cheers,
Colas
I’m surprised that a particle system is fine across 3 machines. Particle systems are notoriously bad at being spread across clusters; at the very least you need to compute all positions/physics on one machine.
I guess that is what you do? Then you render on the others?
It’s a decent setup; it’s just that one machine is likely under heavy stress compared to the others…
Anyways, good luck with the project, and feel free to shoot an email to support@derivative.ca when you are done with it… We’d be curious to hear more about it, get into some of the technical details, and get your feedback.
Hey @Tyrell,
This is really cool! I’m also working on some integration between Unreal and Touch over nDisplay.
Maybe you’ve already outlined this, but for your setup, did you also need a TouchPlayer license on each of the slave computers? I’m currently working on an LED wall with 4 slave computers and was wondering how this would work.
Edit: I see above where you asked the same question, I think I understand how this would work now!
Amazing to hear you are bringing TouchDesigner back to the Deep Space 8K room! Shuhei Matsuyama and our friends at Think and Sense worked on an 8K VR project with NHK back in 2019 in the same venue. Crushing amounts of pixels were served!
I had to install 3 workloads in the Visual Studio Installer, and then everything worked fine with the samples project!
Now I have something else:
I created a new project, following the instructions from GitHub:
Everything works as described, but I’m not sure how to get the information out of the Get TouchEngine Output node. For example, from the .tox file I am outputting just one channel with one sample (an LFO CHOP), and I want to read that float in my Blueprint. How can I do that? What is the format of the Value output?
I also noticed that somewhere in the samples project there is a Get Channel node that may be needed for this, but I cannot find it when searching in my project’s Blueprint.
Hi @JetXS,
I have a strange problem with TouchEngine in the Deep Space setup at the Ars Electronica museum.
They are using two Quadro RTX A6000s linked with NVLink.
Each graphics card is connected to two projectors, and the 4 projectors are blended together.
We are running one instance of Unreal that uses the two graphics cards at the same time.
One graphics card gets the image coming out of TouchEngine, but the other one doesn’t.
Do you have any idea how we could solve that problem?
Cheers,
Colas
EDIT: OK, we found the solution. It has nothing to do with the TouchDesigner plugin; we have the same problem with the Spout plugin.
It’s a bug in Unreal with DirectX 11: it seems some Niagara buffers are not correctly shared between the two graphics cards. Switching to DirectX 12 fixes the problem.
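For anyone hitting the same dual-GPU issue: a minimal sketch of that switch, assuming a standard UE4/UE5 Windows project. The default RHI can be changed in Project Settings > Platforms > Windows > Default RHI, which writes the following into the project’s config:

```ini
; Config/DefaultEngine.ini -- force the DirectX 12 RHI on Windows
[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12
```

For a one-off test without touching the config, the editor or a packaged build can also be launched with the `-dx12` command-line flag.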
Thank you for all the support, plugin works great!
I have one question: can we have the component run in Editor mode? I managed to get a tick in the Event Graph without having to press Play, and for development purposes I would like the plugin to run in Editor mode. Is that possible? I have tweaked some options in the Component’s Details panel but still haven’t found a way.
Yes, while editing BPs. I managed to run an OSCServer in the editor (there is a function called Set Tick in Editor: Set Tick in Editor | Unreal Engine Documentation), and I was wondering if the TouchEngine nodes could have this option so we don’t have to do the back-and-forth Play-edit-Play-edit while developing.
Hi @JetXS,
The problem we had with Unreal was solved by switching to DX12, but then the plugin stopped working.
Therefore I ended up not using the TouchEngine plugin.
I ran TouchDesigner separately and used Spout to send the data to Unreal.
I will do a blog post with more details about the project and the technical setup.
Cheers,
Colas
It is available in the UE5.0-Dev and UE5.1-Dev branches.
So far it has only been tested by a small team, so you will want to test extensively if it’s for a production environment.
You’ll want to compile from source, and you will need a new TouchDesigner build that should come in the next 10 days. Or reach out by email to support@ for a dev build.
A precompiled version of the plugin should be available by the end of the year.