Finally tried it out, it seems to work alright. Generated a timecode audio file on this site: http://elteesee.pehrhovey.net/ (note that the website can only generate up to 30fps, so you'll have to use another tool if you need 60fps. Record TouchDesigner's LTC out to an audio file, maybe?) Imported it into Ableton on its own audio track and set it to send audio on my interface's loopback channel (note that if you don't have one, you can loop your physical S/PDIF ins and outs, or just normal analog ins and outs. Watch out for feedback loops though)
Then in TouchDesigner I used an Audio Device In CHOP to listen to the loopback channel and set up an LTC In CHOP to listen to the timecode audio. Then used the total_seconds channel to drive a Movie File In TOP.
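For anyone curious what total_seconds actually is: it's just the decoded HH:MM:SS:FF timecode flattened into one number. A minimal sketch of that conversion (assuming non-drop-frame timecode at a known frame rate; the function name is mine, not a TD API):

```python
def ltc_total_seconds(hours, minutes, seconds, frames, fps=30):
    """Flatten an HH:MM:SS:FF timecode reading into seconds,
    like the LTC In CHOP's total_seconds channel (non-drop-frame)."""
    return hours * 3600 + minutes * 60 + seconds + frames / fps

# e.g. timecode 01:00:02:15 at 30 fps
print(ltc_total_seconds(1, 0, 2, 15))  # -> 3602.5
```

That value times your movie's frame rate gives you the frame index to feed the Movie File In TOP.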
Only thing I didn't figure out was how to use the timecode to drive Touch's timeline; maybe someone else knows how to do it. Probably has something to do with /local/time, but I don't have the time today to look into it.
This method seems to work fine: the two clocks sync and you can jump around the timeline as expected. The only downside is that on-the-fly tempo changes won't work, since you can't warp the timecode audio file. Pre-programmed tempo changes shouldn't be an issue though.
There's a M4L device that seems to support on-the-fly tempo changes: Showsync | Free Tools for Ableton Live, but for whatever reason it's Mac only. I suppose if you have the time you could look into how the timecode signal is generated and create your own M4L LTC generator for Windows (or maybe someone already built one?)
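If you do go down the roll-your-own road: an LTC frame is 80 bits, with the time packed as BCD digits (LSB first) plus a fixed sync word at the end, and the whole bitstream then gets biphase-mark encoded into audio. A rough sketch of the bit packing (user bits and the status/flag bits are left at zero here for simplicity, so this isn't a complete generator):

```python
def ltc_frame_bits(hours, minutes, seconds, frames):
    """Pack HH:MM:SS:FF into the 80-bit SMPTE LTC frame layout.
    Bits are in transmission order; each BCD field is sent LSB first.
    User bits and status flags (drop frame, color frame, etc.) are
    left at zero, so this is a simplified sketch, not the full spec."""
    bits = [0] * 80

    def put(value, start, width):
        for i in range(width):
            bits[start + i] = (value >> i) & 1

    put(frames % 10, 0, 4)     # frame units
    put(frames // 10, 8, 2)    # frame tens
    put(seconds % 10, 16, 4)   # seconds units
    put(seconds // 10, 24, 3)  # seconds tens
    put(minutes % 10, 32, 4)   # minutes units
    put(minutes // 10, 40, 3)  # minutes tens
    put(hours % 10, 48, 4)     # hours units
    put(hours // 10, 56, 2)    # hours tens

    # fixed sync word in bits 64-79: 0011 1111 1111 1101
    for i, b in enumerate("0011111111111101"):
        bits[64 + i] = int(b)
    return bits
```

The sync word is how the receiver finds frame boundaries (and tells forward from reverse playback), which is also why a receiver gets confused when the frame rate it sees doesn't match what it expects.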
Screenshot of my setup:
Side note: usually when I want to sync visuals to Ableton I use the TD Ableton package in the browser, but that doesn't have frame-accurate sync to my knowledge
Another note! If you plan on having a live set full of a bunch of different songs, it can be helpful to generate timecode starting at different times: hour 0 can play under the first song, hour 1 under the second song, and so on. This is helpful if you re-arrange your tracklist in the future and should save you from re-programming cues.
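In other words, each song owns a timecode hour, so its cues are addressed relative to that hour no matter where the song sits in the set. A tiny sketch of the idea (the setlist and function names are made up for illustration):

```python
# Hypothetical setlist: song N gets hour N of the timecode stripe,
# so cues survive re-ordering the tracklist later.
setlist = ["Opener", "Song Two", "Closer"]

def song_start_timecode(index):
    """Start timecode (HH:MM:SS:FF) for the Nth song's region."""
    return f"{index:02d}:00:00:00"

for i, name in enumerate(setlist):
    print(name, "->", song_start_timecode(i))
```

Swap the songs around and each one still carries its own hour, so the cues programmed against it don't move.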
EDIT: some things came to mind overnight that relate to generating timecode from Max4Live. Apparently the original SMPTE standard only went to 30 FPS. There are some newer SMPTE standards (SMPTE ST 12-1 and SMPTE ST 12-2) that are capped at 60 fps (I'm assuming this is what TouchDesigner supports, since it can output 60fps timecode. The documentation doesn't tell you though). SMPTE ST 12-3:2016 supports up to 960 fps apparently, but I'm not sure if support for that is coming anytime soon.
This paragraph has some incorrect assumptions, see edit below:
<Why the framerate matters: if you were changing the tempo on the fly (just moving the tempo manually in the middle of your performance), your timecode would have to speed up or slow down for your timecoded cues to remain in sync (which presents another problem if you slow down the tempo too much and end up with 10fps video or something), so you would have to have your baseline be 30fps at 120bpm or something to allow for tempo speed-ups and slowdowns.>
ANOTHER EDIT: Was testing this, and speeding up or slowing down the LTC doesn't work because the receiving end expects a set frame rate. If your receiver is expecting 30fps and you only send 27 frames of timecode, it just skips 3 frames on the receiver. Also, the receiver will just stop functioning if you go too far out of sync. I have a feeling that if you were making your own LTC generator you could run a 30 FPS signal slower by repeating frames or something, but I don't know the specification deeply enough to know if that's actually the case.