Is TouchDesigner good for 360 video, AR, VR, etc?

Hi all,

I’m brand new to TouchDesigner (I’ve recently taken one very exciting workshop on projection mapping) and I’m curious about its potential. I’m also a teacher at a local university, and I’m looking at possible ways to integrate TouchDesigner into my classroom.

If you know the capabilities and limits of this software well, could you let me know if you think that TouchDesigner would be an appropriate tool for an undergraduate art course that covered the following implementations:

Editing content for:

  • Augmented Reality
  • Virtual Reality
  • 360 degree video

Output for:

  • Augmented Reality
  • Virtual Reality
  • 360 degree video

Again, I’m extremely new to this software, so if you can explain why you think TouchDesigner would be good or not good for any of the six tasks above, I would be immensely grateful. And if you think TouchDesigner might not be the best tool for one of these tasks and there’s a better one, I’d love to hear which one you’re thinking of.

Thanks so much,

In my experience, Virtual Reality is OK and 360-degree video is OK, but the problems arise when you want to record stereoscopic 360, i.e. side-by-side (SBS) or top-and-bottom stereo. It’s difficult.

You can do all of these things with Touch, but it may not be the best for your class.

I have done a bunch of VR and 360 video work in Touch, but for a few reasons it may not be the best introduction.

I almost never suggest that people not use Touch, but in this case I would recommend Unity.

Unity has tons of plugins for AR, including really easy integrations with iOS and Android devices. TouchDesigner does not have any built-in AR or mobile integration like that, and the learning curve to integrate it yourself will be significant.

Also, making a living as a TouchDesigner developer is challenging and niche; it’s not particularly ideal for people who want to make VR games to share or sell. Unity lets you export projects and share or sell them to anyone, and it may be a good skill if your class is primarily interested in VR/AR and accessibility.

I think the growing community in Touch is amazing and I can’t wait to see these things start to happen in TD, but for now AR is much more solid and accessible on other platforms.

Thank you so much to you both for replying to this question. After doing a little more research on my own, I more-or-less came to this conclusion as well.

I’m wondering if you could weigh in on TouchDesigner’s capabilities for live interactive video. For example: you have some pre-edited 4K clips, and through user interaction, effects or filters are added or removed; clips are swapped, overlaid, or transitioned; or clips are sped up or slowed down.

Thanks again!

Live interactive installations are what Touch is very good at. You can do all of those things and more if you want to. Unity requires you to compile for each change you make, while Touch is all realtime, so prototyping and building quickly is way easier.
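To give a feel for the control logic involved, here is a minimal sketch in plain Python. This is not TouchDesigner’s own API (in TD you would drive operator parameters such as a Movie File In TOP’s speed instead); the class and method names are hypothetical.

```python
# Sketch: interactive clip control — swap clips, change speed, toggle effects.
# In TouchDesigner this logic would set operator parameters; here it is plain Python.

class ClipPlayer:
    def __init__(self, clips):
        self.clips = list(clips)   # pre-edited 4K clip names
        self.index = 0             # currently playing clip
        self.speed = 1.0           # playback rate multiplier
        self.effects = set()       # active filter names

    def swap(self, index):
        """Jump to another clip (wraps around)."""
        self.index = index % len(self.clips)

    def set_speed(self, speed):
        """Clamp playback speed to a sane range."""
        self.speed = max(0.1, min(4.0, speed))

    def toggle_effect(self, name):
        """Add or remove a filter in response to user interaction."""
        if name in self.effects:
            self.effects.discard(name)
        else:
            self.effects.add(name)

    @property
    def current(self):
        return self.clips[self.index]

player = ClipPlayer(["intro.mov", "loop_a.mov", "loop_b.mov"])
player.swap(2)                # a user gesture selects the third clip
player.set_speed(0.5)         # slow motion
player.toggle_effect("blur")
print(player.current, player.speed, player.effects)
```

The point is that all of this state can change every frame while the network keeps running, with no compile step in between.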

Check out Rouge, a pipeline built in TouchDesigner for realtime performance and live visuals. It will give you an idea of what is possible.

Live visuals for movies, performances, and interactive installations are exactly what Touch is perfect for; getting what you make out into the world on other devices is not what TouchDesigner excels at.

I’m trying to get into VR, 360 video, and prototyping installations with TouchDesigner this year, using a Quest 2 as my headset, and it looks like Unreal in combination with TouchDesigner could work well.

Can anyone recommend any good VR-based TD tutorials, or even paid courses out there? I have found some tutorials, but I’m hoping to apply for funding for courses if there are any good ones.

I have worked a lot with 360, and my recommendation is to avoid doing the stitching process inside TD.
First, TD does not offer the nifty conveniences that dedicated image/video stitching tools (Mistika, Mocha, PTGui, …) do, and second, if you do the stitching inside Touch it will be quite expensive performance-wise if you have a lot of hi-res individual streams.

There are camera systems that already do the stitching onboard; in that case you will most probably get an equirectangular result that you can use like any other live signal in TD; you just need to project it correctly. The video from a professional 360 camera will be very hi-res (at least 4K) and sometimes 60 fps, so it’s not lightweight either, but it’s a lot less expensive than the other option with several camera streams.
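“Projecting it correctly” comes down to the standard equirectangular mapping: each pixel’s (u, v) coordinate corresponds to a longitude/latitude pair, and therefore to a direction on the sphere. A quick sketch of that math in plain Python, assuming the common convention where u wraps longitude and v runs top-to-bottom:

```python
import math

def uv_to_dir(u, v):
    """Equirectangular (u, v) in [0, 1] -> unit direction (x, y, z).
    u = 0.5 faces +Z; v = 0 is the top of the image (north pole)."""
    lon = (u - 0.5) * 2.0 * math.pi   # longitude in [-pi, pi]
    lat = (0.5 - v) * math.pi         # latitude in [-pi/2, pi/2]
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def dir_to_uv(x, y, z):
    """Inverse mapping: unit direction -> equirectangular (u, v)."""
    lon = math.atan2(x, z)
    lat = math.asin(max(-1.0, min(1.0, y)))
    return (lon / (2.0 * math.pi) + 0.5,
            0.5 - lat / math.pi)

# The image centre looks straight down +Z:
print(uv_to_dir(0.5, 0.5))  # → (0.0, 0.0, 1.0)
```

In TD you would normally let the renderer do this for you (e.g. by mapping the equirectangular image onto a sphere or environment), but the math is handy when debugging orientation problems.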

If you have a camera system that gives you 4, 6, or even more individual streams (e.g. 360 rigs using GoPros or Blackmagic cameras) and you need to do the stitching inside TD, I would do the following:
stitch the video streams correctly in a stitching software (or Nuke or Houdini) in advance and export the position/displacement and blending information of each camera. For the blending you get a map, of course.
For positions/displacement you can use a map as well, or save this to an XML (positions only).
With that you can re-create your setup in TD.
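To illustrate what using the exported blending information amounts to: each camera contributes a per-pixel weight from its blend map, and the stitched pixel is the weight-normalized sum of the overlapping camera pixels. A tiny pure-Python sketch (single grayscale values stand in for pixels; in TD you would do this on the GPU, e.g. with Composite or GLSL TOPs):

```python
def blend_pixel(samples, weights):
    """Weight-normalized blend of overlapping camera samples.
    samples: pixel values from each camera at this output position.
    weights: the corresponding values from each camera's blend map."""
    total = sum(weights)
    if total == 0.0:
        return 0.0  # no camera covers this pixel
    return sum(s * w for s, w in zip(samples, weights)) / total

# At a seam, camera A dominates (weight 0.75) over camera B (0.25):
print(blend_pixel([200.0, 100.0], [0.75, 0.25]))  # → 175.0
```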

For TD and 360, have a look at the following nodes/combinations:
Render TOP / Render Mode / Cube Map (you can render any 3D scene in TD to a cube map)
Environment Light COMP + Geometry COMP + PBR material.
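As background on how a cube-map sample is resolved (useful when debugging the Render TOP’s cube-map output): a direction vector picks one of six faces by its largest-magnitude component, and the other two components give the face-local UV. A sketch in plain Python, assuming the OpenGL cube-map convention:

```python
def dir_to_cubemap(x, y, z):
    """Direction -> (face, u, v) using the OpenGL cube-map convention."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                       # X is the major axis
        face, sc, tc, ma = ("+X", -z, -y, ax) if x > 0 else ("-X", z, -y, ax)
    elif ay >= az:                                  # Y is the major axis
        face, sc, tc, ma = ("+Y", x, z, ay) if y > 0 else ("-Y", x, -z, ay)
    else:                                           # Z is the major axis
        face, sc, tc, ma = ("+Z", x, -y, az) if z > 0 else ("-Z", -x, -y, az)
    # Map sc, tc from [-ma, ma] into face-local [0, 1] coordinates
    return face, (sc / ma + 1.0) / 2.0, (tc / ma + 1.0) / 2.0

# Looking straight down +X hits the centre of the +X face:
print(dir_to_cubemap(1.0, 0.0, 0.0))  # → ('+X', 0.5, 0.5)
```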

For VR there is this excellent guide for HTC VIVE, which explains all the essentials:

Just a note on a component in the palette’s Tools section: there is a component called Stitcher, which reads files saved out from PTGui (PTGui must be licensed to output ASCII or JSON format .pts files). A current limitation is that it only accepts equirectangular or fisheye stitches.