A friend and I are working on a project where the user will interact with different interfaces (TouchOSC, Kinect) to create music, which will then affect visuals. After weeks of getting nowhere I'm not 100% sure what is and isn't possible using the abletonsync setup, so I just wanted a straightforward answer if anyone knows.
Is it possible to control parameters in Ableton using TouchDesigner?
If not, there is a plan B, but it is clumsy and unintuitive and we would be unable to use Kinect integration.
Unfortunately I'm falling at the first hurdle. Just FYI, I'm using version 77.
I'm attempting to use the abletonsync package provided on the wiki to connect TouchDesigner and Ableton. This works out of the box in terms of displaying the shows that are already set up.
My aim is to be able to trigger clips in Ableton from within TouchDesigner using a TouchOSC CHOP. To my understanding, I would be able to accomplish this by controlling the rotaries on the TouchDesignerSync Master Max plugin? (Probably wrong.) However, I don't know the steps I should take to control these rotaries within TouchDesigner using TouchOSC, and as far as I am aware this is not documented on the wiki.
After this is achieved, I'm not sure how I would link these rotaries to other aspects of Live. Would this require advanced knowledge of Max?
I can't remember for sure whether the knobs in the Ableton sync projects are bidirectional, but I know they definitely go from Ableton → Touch.
As far as triggering clips and scenes goes, though, it is all Ableton → Touch. Basically, you trigger a clip in Ableton and the TDsync M4L patch sends data to Touch to trigger the corresponding scene. So, the way things are set up now, you can't trigger Ableton from Touch, but that doesn't mean you can't if you change how the sync environment works. You would need to use M4L, though, and be able to edit M4L and make your own patch.
There is another alternative, though: if you're running an iPad you can just get one of the Ableton clip-triggering apps (there is probably a TouchOSC/M4L patch kicking around somewhere also; try maxforlive.com) to trigger both the Ableton and Touch clips just by triggering the Ableton clip. This wouldn't be useful, though, if you want to trigger the clips with a Kinect or by some sort of Touch operation. Basically, no matter what, you'll need to make an M4L patch (using M4L API objects) to trigger clips directly from OSC. You could also consider sending MIDI from Touch to Ableton to trigger the clips if you don't have time or don't want to buy/learn Max.
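To make the OSC route above concrete, here's a minimal sketch (plain Python, standard library only) of what "triggering a clip directly from OSC" looks like on the wire: an OSC message is just an address string, a type-tag string, and the arguments, each padded to 4-byte boundaries, sent over UDP. The `/clip/fire` address, host, and port here are assumptions; they have to match whatever your M4L patch's `udpreceive`/OSC objects are configured to listen on.

```python
import socket
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    s += b"\x00"
    while len(s) % 4:
        s += b"\x00"
    return s

def osc_message(address: str, value: int) -> bytes:
    """Build a minimal OSC packet: address, type tag ',i', one int32 argument."""
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

# The address, host, and port are assumptions -- they must match whatever
# your M4L patch's udpreceive/OSC objects are set to listen on.
packet = osc_message("/clip/fire", 3)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 7400))
```

On the Max side, the patch would route this address and hand the clip index to the Live API objects, which is the part you have to build yourself.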
Thanks for the reply, Keith. Over the past couple of weeks I've been learning Max for Live, so now I have a rudimentary knowledge (other forums are nowhere near as helpful as here, though).
I've created a Max for Live device which triggers user-specified clips at the press of a button within the "instrument". Now I'm looking for a way to get TouchDesigner to interact with this in the same way TouchDesignerSync_Master_Rev6 works.
I've attempted to copy this instrument and paste it into a separate Ableton Live file; however, whenever I attempt to run the instrument patch on its own, it fails to sync with the Ableton sync environment in Touch.
Is there any sort of in-depth guide you know of that goes through the specifics of how TouchDesignerSync_Master_Rev6 and TouchDesigner interact with one another? Because as far as I was aware, using this instrument within another Ableton file should have worked, since TouchDesigner syncs with Ableton through it. Maybe I'm wrong. Any ideas?
Another easy way to control knobs and dials in Ableton from TouchDesigner is to use MIDI; however, that only works for the parts of Ableton that are MIDI-mappable. For triggering clips etc. from TouchDesigner you will need to dive into the OSC and M4L world, which you have already!
Sorry I can’t help with the explanation of TouchDesignerSync_Master in another Ableton project as I don’t know Live that well.
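For the MIDI route mentioned above, what Ableton's MIDI-map mode learns is just a three-byte Control Change message. A sketch of the byte layout, assuming nothing about your setup (to actually deliver it you'd still need a virtual MIDI port such as loopMIDI or the IAC driver plus a MIDI library; the CC number here is arbitrary):

```python
def control_change(channel: int, cc: int, value: int) -> bytes:
    """Build a 3-byte MIDI Control Change message.

    channel is 1-16; cc and value are 0-127."""
    if not 1 <= channel <= 16:
        raise ValueError("channel must be 1-16")
    status = 0xB0 | (channel - 1)  # 0xBn = Control Change on channel n+1
    return bytes([status, cc & 0x7F, value & 0x7F])

# CC 7 at full value on channel 1; the CC number is arbitrary -- Ableton's
# MIDI-map mode learns whatever you send while a parameter is selected.
msg = control_change(1, 7, 127)
```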
Right, guys, I'm going to upload what I have so far. To anyone out there lurking, feel free to use what I've done (within Ableton; please don't re-use the TD code), but it's half-baked anyway so it probably won't be of any use.
We're at the point where we are ready to try out the first round of testing between TD and Live, but there has been a massive spanner in the works. M4L doesn't seem to want to talk to TD at all. We've tried networked and localhost and it just doesn't work. And I know why…
When I was working on the M4L instrument, I deleted it and then loaded in another instrument (that was the same M4L instrument). This seems to break some sort of connection with TouchDesigner. I have no clue what it is; I have not changed any of the initial settings defined within the M4L patch apart from the IP and setting the maximum value of the dials from 127 to 1. I've simply added functions after the Live dials which trigger clips and do other stuff.
So, to briefly explain: when you open the sync environment, TD should be split into two (ignore everything else outside of view). On the left are buttons whose numbers correspond to the dials in the sync file. On the right, we've set up Constant CHOPs to receive the value from these button presses and send them to the dials. This already works no problem with the default abletonsync file, so you should try it out there first. The problem appears when attempting to use my Ableton project file.
Try breaking it yourself first: make sure Ableton and TD are set up to send and receive MIDI at both ends. I'm using this MIDI driver: derivative.ca/wiki077/index. … d_Controls.
Test the TD file with the default abletonsync project first after MIDI is set up, and see if both are sending and receiving data. Once this works, save the default abletonsync M4L instrument, delete this instrument from the project, then insert the instrument you have just saved into the same place. This should have broken the connection between TD and Ableton. Even reinitialising does not fix it.
Could someone please explain to me why this happens? Does anyone know?
Also, our deadline is in exactly a week's time, and hopefully we should have something to show the TouchDesigner community by then. 4seasonssync(demo).rar (1.82 MB)
I don't have time to check out the projects at this moment; not sure if this is your problem, but it could be.
When loading the M4L TDsync patch into a fresh Ableton project, the fps (frame rate) parameter field is initialized to 0. You have to set it to 60 to get the two projects to work together.
Not sure if that was fixed in a more recent version, but if it wasn't, this could be your problem.
Pretty sure the above problem is what is happening to you. I just loaded Rev6 and it still seems not to initialize. Enter 60 in the Touch FPS field and press the Reinitialize button. I just tested with your project and it's working fine.
I'd like to know how this thread resolved (or not). I've been trying for hours to get TD to control Ableton for the same reasons (Kinect controlling Ableton). I've tried to create a two-way virtual MIDI connection, and tried OSC to Max for Live (then what? A virtual port from Max for Live to Ableton? Can't make it work). I'm stumped. With the 'td sync' demo files I've established sync and managed to send controllers from Ableton to TD, but I cannot figure out how to get the reverse to happen. I've also tried MIDI Yoke and rtpMIDI to make a virtual port. I was really enjoying Win7 until I tried to set up and run audio and MIDI!!
I'm on 088, so I don't know how much of the tdsync demos are broken currently. For some reason it seems that the MIDI out in the TD demo.toe is set to 0-1 values rather than 0-127 for a CC out. I added a Math CHOP to fix this but am still not getting any MIDI out of Touch. I'm tired, diminishing returns… try again tomorrow.
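For anyone hitting the same 0-1 vs 0-127 mismatch: the Math CHOP fix described above amounts to this mapping (a sketch; the clamping is just defensive, since CC values outside 0-127 are invalid):

```python
def to_midi_cc(value: float) -> int:
    """Map a normalized 0.0-1.0 channel value to a 0-127 MIDI CC value,
    clamping anything out of range."""
    value = max(0.0, min(1.0, value))
    return int(round(value * 127))
```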
P.S. Matty1029, can you repost your files without .rar? My free extractor is giving me grief, and I don't think most people compress files on this forum that way.
Do you want to control parameters in Ableton like a plugin's filter? If so, I would skip the whole MIDI deal and do everything directly with OSC and M4L. Here is a relatively simple way to do this; you'll need to do some custom M4L stuff, though:
-Load either Max Api Amap1.amxd or Amap4 onto a track (Max for Live/Max Audio Effect/Max Api Amap1.amxd).
-You'll need to edit this device so it receives OSC from Touch, with the OSC data connected either directly to the knob labeled "control" or to whatever that knob is connected to.
-In the Track, Device, and Parameter fields, navigate to the parameter you want to modulate.
-If you're using an external plugin, you might have to set up the parameter you want to use by configuring it so it is controllable by Ableton.
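For orientation while editing the device: the bytes arriving at the patch's `udpreceive` are a plain OSC packet, which Max's OSC objects decode for you. A Python sketch of decoding a one-float message, shown only to illustrate the wire format the edited device will be fed (the address is whatever you choose on the Touch side):

```python
import struct

def parse_osc_float(packet: bytes):
    """Decode a single-argument OSC message carrying one float32.

    Returns (address, value). This is what Max's udpreceive/OSC objects
    do for you; it is reproduced here only to show the format."""
    # Address: null-terminated string, padded to a 4-byte boundary.
    end = packet.index(b"\x00")
    address = packet[:end].decode()
    offset = (end + 4) & ~3  # skip the padding
    # Type-tag string (e.g. ",f"), also null-terminated and padded.
    tag_end = packet.index(b"\x00", offset)
    tags = packet[offset:tag_end].decode()
    offset = (tag_end + 4) & ~3
    if tags != ",f":
        raise ValueError("this sketch only handles a single float argument")
    (value,) = struct.unpack(">f", packet[offset:offset + 4])
    return address, value
```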
Oh yeah, I use 7-Zip for file extraction; it's free and decompresses just about everything.
Thanks Keith! That is exactly what I was looking for! I had never dabbled in the API objects. It didn't take me long to wire up an API A dial to do what I wanted: control Ableton params with Touch via OSC… !@#$% MIDI!
I have many more questions, but they are more in the scope of Max for Live, so I am moving along… happily.
Re: 7-Zip, I've been using that too, but I must not have something configured right, perhaps the target-folder settings for uncompressed files. Another issue I will figure out. Good to know it works (are you using 64-bit Win7?)
Hey guys, it's been a while. Here's how the project looked in the end; this was during our first testing stage, and as you can tell there were still a few bugs to iron out. In the end we achieved our goal of being able to control Ableton from TouchDesigner in real time with the use of TouchOSC and Kinect. I have a few additional videos I made which explain how the process was achieved in Max for Live, if anyone is interested. Let me know in here or PM me if you're interested in seeing them.