Ableton + TD timeline sync

Hi everyone

I am working with a setup where Ableton sends MIDI data to TD; the MIDI notes trigger video files in various Movie File In TOPs (linked to a Switch), as well as effects.

My question is: is there a way to sync the Ableton Live timeline with the TD timeline? For example, I would like to place a note at a certain beat in the Ableton piano roll and have the video files in TD follow that point, instead of restarting from the beginning of the video file every time I push start in Live.

Thank you in advance, and let me know if anything about the question is unclear.

This is complicated because Ableton Live doesn't actually have anything like frames. It is all based on tempo and beats, which can change during a song.

That said, I am working on a solution for the simplest song cases… that is, no tempo change allowed. It should be ready for the next experimental release of TouchDesigner.


You could potentially use SMPTE timecode. You would just need to generate some timecode and send the timecode audio from Ableton to Touch (hopefully your audio interface has a loopback channel, as that's probably the easiest way to do it if both apps are on the same machine).

I haven't tried this yet, so there may be some drawbacks that I'm not thinking of. Hopefully it just works.

If anyone has any luck with @Kephalos' technique, please let me know!


Finally tried it out, and it seems to work alright. I generated a timecode audio file on this site: http://elteesee.pehrhovey.net/ (note that the website can only generate up to 30fps, so you'll have to use another tool if you need 60fps. Record TouchDesigner's LTC Out to an audio file, maybe?). I imported it into Ableton on its own audio track and set it to send audio on my interface's loopback channel (note that if you don't have one, you can loop your physical S/PDIF ins and outs, or just normal analog ins and outs. Watch out for feedback loops, though).

Then in Touch I used an Audio Device In CHOP to listen to the loopback channel and set up an LTC In CHOP to decode the timecode audio. I then used the total_seconds channel to drive a Movie File In TOP.
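If it helps, that last hop is just one expression. On the Movie File In TOP, set Play Mode to "Specify Index" and the Index units to Seconds, then reference the decoded channel in the Index parameter (the operator name here is whatever yours happens to be):

```python
# Index parameter expression on the Movie File In TOP (units set to Seconds):
# the LTC In CHOP's decoded total_seconds channel becomes the playback position
op('ltcin1')['total_seconds']
```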

The only thing I didn't figure out was how to use the timecode to drive Touch's timeline; maybe someone else knows how to do it. It probably has something to do with /local/time, but I don't have the time today to look into it.
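If anyone wants to try, my guess is something like this (completely untested, and the operator/channel names are assumptions): a CHOP Execute DAT watching the LTC In CHOP could push the decoded time into the timeline.

```python
# untested sketch: CHOP Execute DAT monitoring the LTC In CHOP,
# pushing decoded LTC time into the root timeline
def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'total_seconds':
        t = op('/local/time')        # the root timeline component
        t.frame = val * t.rate + 1   # seconds -> frames (TD frames start at 1)
    return
```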

This method seems to work fine: the two clocks sync, and you can jump around the timeline as expected. The only downside is that on-the-fly tempo changes won't work, since you can't warp the timecode audio file. Pre-programmed tempo changes shouldn't be an issue, though.

There's an M4L device that seems to support on-the-fly tempo changes (Showsync | Free Tools for Ableton Live), but for whatever reason it's Mac only. I suppose if you have the time you could look into how the timecode signal is generated and create your own M4L LTC generator for Windows (or maybe someone has built one?).

Screenshot of my setup:

Side note: usually when I want to sync visuals to Ableton I use the TDAbleton package in the browser, but that doesn't have frame-accurate sync to my knowledge.

Another note! If you plan on having a live set full of a bunch of different songs, it can be helpful to generate timecode starting at different times. Hour 0 can play under the first song, hour 1 under the second song, and so on. This is helpful if you rearrange your tracklist in the future and should save you from re-programming cues.
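On the Touch side that just means peeling the hour back off. Something like this (a hypothetical helper, all names made up):

```python
# hypothetical helper: split the LTC In CHOP's total_seconds into
# (song index, seconds into that song), assuming one song per timecode hour
def song_time(total_seconds):
    song_index = int(total_seconds // 3600)    # hour 0 = song 1, hour 1 = song 2, ...
    seconds_into_song = total_seconds % 3600   # position within the current song
    return song_index, seconds_into_song
```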

EDIT: some things came to mind overnight that relate to generating timecode from Max4Live. Apparently the original SMPTE standard only went up to 30fps. There are some newer SMPTE standards (SMPTE ST 12-1 and SMPTE ST 12-2) that are capped at 60fps (I'm assuming this is what TouchDesigner supports, since it can output 60fps timecode; the documentation doesn't say). SMPTE ST 12-3:2016 apparently supports up to 960fps, but I'm not sure if support for that is coming anytime soon.

This paragraph has some incorrect assumptions, see edit below:
<Why the framerate matters: if you were changing the tempo on the fly (just moving the tempo manually in the middle of your performance), your timecode would have to speed up or slow down for your timecoded cues to remain in sync (which presents another problem if you slow down the tempo too much and end up with 10fps video or something), so you would have to have your baseline be 30fps at 120bpm or something to allow for tempo speed-ups and slowdowns.>

ANOTHER EDIT: I was testing this, and speeding up or slowing down the LTC doesn't work because the receiving end expects a set frame rate. If your receiver expects 30fps and you only send 27 frames of timecode in a second, it just skips 3 frames on the receiver. Also, the receiver will just stop functioning if you go too far out of sync. I have a feeling that if you were making your own LTC generator you could run a 30fps signal slower by repeating frames or something, but I don't know the specification deeply enough to know if that's actually the case.

The way I have solved this in the past is to create an automation lane on a track in Ableton with a breakpoint at 0 at the start of the song and 1 at the end of the song. I have this automation assigned to a knob on the TDAbleton Rack, which sends the constantly rising value over OSC to TD. This gives you a high-resolution float value coming into TD that stays tight with the Ableton playhead. I found all other timecode solutions to be insufficient.
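On the TD side a 0-1 value maps very cleanly onto a Movie File In TOP: set Play Mode to "Specify Index" and the Index units to Fraction, and the normalized value drives the whole file directly (a sketch; the operator and channel names are whatever yours are):

```python
# Index parameter expression on the Movie File In TOP (units set to Fraction):
# the 0-1 automation value arriving from TDAbleton over OSC
op('oscin1')['songpos']
```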


I've been playing with techniques for this, since I have a collaborator with pre-rendered video that he has been driving from Ableton. The problem is, he changes tempo, engages/disengages the loop, and jumps around to locator markers in the session, so timecode is out since it isn't warpable.

Since Ableton only thinks in time increments and LTC can't be warped, I figured I could create an index from the Song Length, but Ableton has no dependable way of setting that (and its minimum is ~60 bars), so I went for a marker-based approach. I've had some success by making sure each Ableton project has locators named 'start' and 'end', pulling that information from the TDAbleton Song component, and re-ranging it to a normalized float. I then remap that to the number of frames in a video, with an optional offset. After running a bunch of tests, the TD version is only off by two frames after 10000 compared to the same video playing in Ableton, and I actually attribute that to Ableton being wonky with mp4s… the playback seems more stable in TD than the mp4 playing natively in Ableton.
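The re-range itself is simple; the gist is something like this (a simplified sketch of the approach with made-up names, where the times come from the TDAbleton Song component):

```python
# simplified sketch: map Ableton song time between the 'start' and 'end'
# locators to a video frame index (all names and values are placeholders)
def video_frame(song_time, start_time, end_time, total_frames, offset=0):
    span = end_time - start_time
    if span <= 0:
        return offset
    norm = (song_time - start_time) / span     # 0-1 between the two locators
    norm = min(max(norm, 0.0), 1.0)            # clamp so the video holds outside the range
    return norm * (total_frames - 1) + offset  # remap to frames, with optional offset
```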

@Ivan, what I think could be a helpful RFE in TDAbleton land would be a few extra optional parameters in the TD Song M4L component that lives on the master channel. Perhaps a start/end position, an optional string, or nameable bool/float/int/pulse etc. dummy parameters, or comments that could be used to store information with the Ableton composition that gets parsed on the TD side but lives with the Live session. Or maybe this is a separate metadata device and TDA component to store info and arguments in an Ableton session that gets read by TDA.

Right now we get the name of the TD Song component and an index for the song ID, and while it's true that this should be enough to program a show with all that data stored in TD, it would be great if I could give a collaborator who is only programming the Ableton side more power in configuring the data I want, so that I never have to touch Ableton, they never have to touch TD, and my session can be built to configure itself based on the data they provide.


This is a great idea @drmbt! I'm going to give the special "start" and "end" marker behavior a whirl… feel free to flesh it out before I dive in this week… basic time/frame sync is on the slate next.

I also like the idea of bi-directional dummy pars and comments in a Max device. This is a big job that probably won't get into the coming TDA version, but I'll put it on the list, and if you or anyone else wants to flesh it out with specifics (especially UI layout in Live), I'd love to see some ideas.

The ultimate wish here would be for a component one could configure with customPars on the TD side that would be automagically reflected in the Live device with the appropriate data type, min/max/default/label, etc., as a mappable object in Ableton.

I haven't personally investigated Max/MSP enough to know how capable it is of dynamically generating or enabling UI elements, but even if it were a set number of dummyPars of a few common data types, with consistent names/OSC addresses and labels that you pull via OSC under the hood, it should be possible to create a TD component with customPars that are mirrored on the other side.

Longer term, it might even be possible to generate a TD component that mirrors the Live Object Model in a network, allowing Touch users to work with Ableton objects in a familiar folder structure that mirrors the layout of the Ableton session/track/device hierarchy. I know that would be a whole big thing, I'm just extrapolating…

Dynamic parameters in Max/MSP are, if not impossible, certainly beyond my skill level to develop in a way that is reliable enough for TDAbleton. Maybe someday a Max/MSP expert will join forces to create that.

Until then, a few static Max devices with various kinds of I/O areas for data might be possible. Tell me what your main use cases for such a thing would be, if you have some in mind.

So this timeline sync start/end method suffers from the fundamental problem that Ableton and TD can't synchronize cooks, so the timeline is jittery. It's not an awful sync (I can only really tell on very fast strobing material), but it can be off by several frames, and it's not always going to be a consistent number.

For now my workaround is a lag (@dylanroscover's 'lagger') of about 12 samples, with an offset of about 12 index units, to drive the video, and I'm getting a pretty tight sync with this pseudo look-ahead/smoothing. Certainly a bit hacky, but it's working well enough for what it needs to do.
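For reference, roughly the same thing with the stock Lag and Math CHOPs standing in for the 'lagger' component (a sketch; the exact values will depend on your setup):

```python
# rough equivalent of the workaround with built-in CHOPs (names/values assumed)
lag = op('lag1')                    # smooths the incoming index channel
lag.par.lag1 = 12 / me.time.rate    # ~12 frames of lag on the way up...
lag.par.lag2 = 12 / me.time.rate    # ...and on the way down

offset = op('math1')                # sits after the Lag CHOP in the chain
offset.par.postoff = 12             # push the index forward to compensate for the lag
```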

This had me thinking about some of the other latency issues inherent in bussing around info in real time, and wondering whether it's possible for TDAbleton to implement a global look-ahead for certain kinds of fixed information like MIDI notes, cue point locator markers, automation lanes, etc. If there were a way to access some of that info from the future, it could go a long way toward bridging some of the inherent latency issues with realtime A/V performance.