"Rewire" audio programs

Hi all,

I’m starting to dabble again in audio software programs after a long time away.
I “of course” want to get Touch working with these.

To our tool gurus at Derivative: Any plans/support for “Rewire”?
I think David/Seed brought up rewire in the emails about a year or so ago.

If there is anyone who is working with audio programs and Touch
I’d love to exchange info.

Jim

Hey Jim,

I think the JACK audio server is now also available for Windows, but I’m not sure whether both apps need to support ASIO. If it works without ASIO on both ends, then you can use it to get the audio from any app into Touch and send additional control signals via MIDI/OSC.

What do you want to do with the audio data? If it’s just for visualization purposes, you might be better off doing the analysis in an optimized package (like Max/MSP, Pd, or Bidule) and only sending the “converted” control signals you need via OSC to Touch.
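Just to illustrate the “send only the converted control signals” part, here’s a minimal Python sketch using the python-osc package (the host, port, and OSC address are placeholders; the receiving OSC In side in Touch would have to be set up to match):

```python
from pythonosc.udp_client import SimpleUDPClient

# placeholders: point this at the machine/port your OSC receiver listens on
client = SimpleUDPClient("127.0.0.1", 10000)

# send one already-analyzed control value per message, instead of raw audio
client.send_message("/audio/level", 0.73)
```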

Regarding ReWire, here’s another idea. If VST plugins were supported instead, you could even load something like energyXT to have a complete DAW inside your synth (and maybe send sync data from there to Touch). That could have the same benefit as implementing ReWire while also giving you much more audio power. In case you don’t know energyXT, check it out. It’s a DAW that runs as a VST plugin, so you can also load additional VSTs/instruments inside it, and it has a modular editor…
On the other hand, I never really used ReWire, so I’m probably missing the good stuff an integration might offer.

My goal is one day to control Ableton Live from Touch, basically building a (multitouch) virtual MIDI controller. The problem is that Ableton doesn’t support track loads via MIDI (or OSC). So until they change this and/or the Cycling ’74/Ableton partnership produces any results, the only option is to use the LiveAPI, which isn’t really developed anymore, or I stick with having to use the mouse to load tracks, which ain’t fun in a live set.

Achim

Hey Jim,

We have been using the Edirol FA-101 with great success on some projects. The reason we like them is they have a nice implementation for XP64 and DirectSound drivers. We haven’t supported ASIO yet and have no plans to (I know it would be nice). We also support OSC, so if only those guys at Propellerheads would stop being so proprietary…

We have no plans to support ReWire, so don’t hold your breath. What kind of audio input cards do you have on your Touch computer? I would focus on Audio In (which has been greatly improved) and feed a 100 ms window (use a Trail CHOP) into a spectral analysis CHOP and see what waveforms you can extract for use.
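As a rough illustration of what that spectral step buys you, here’s a small numpy sketch of the same idea (this is only the general math, not what the CHOP does internally; the window length and band count are arbitrary):

```python
import numpy as np

def band_levels(window, bands=8):
    """Turn ~100 ms of mono samples into a handful of 0..1 control values."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    chunks = np.array_split(spectrum, bands)       # coarse frequency bands
    levels = np.array([c.mean() for c in chunks])  # average magnitude per band
    return levels / (levels.max() + 1e-9)          # normalize for use as channels
```

Each of those band values is the kind of slow-moving channel you can map straight onto visual parameters.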

If you have an intermediary audio station, you could use Max/MSP to send Touch info via OSC.

J

Hey Guys,

Yeah I’m pretty much in the research stage right now.
Thank you Jarrett and Achim for pointing out some areas I need to research.
I’ll be writing more after I’ve done a bit more homework.
Please forgive any ignorance that I display here,
I’m just getting started again with audio software after a very long time away.

My audio card is an external “M-Audio Firewire Solo”.
It supports ASIO and WDM, both of which are new to me.

I’ve been playing around a bit with “Live” and “Max MSP” (Demo versions).
I think I may need them both to get what I want.

I’d like to have Live and maybe Max MSP running (on the SAME computer) while I record tracks from my real-world gear.
When I add audio tracks that are software based (no real-world instruments being recorded at the time),
or am preparing for an audio performance,
I’d like to have Touch open and being controlled by Live or Max MSP (again, SAME computer).
Once I’ve got the basic structure (moving from song section to section) of my audio performance set up,
I’ll start to play around with Touch, and build controls for improv animation that will be done via midi and Joysticks.
These animation controls will then be routed to software audio effects in Live.

Here’s a speculative performance example.

a) One computer running everything. Live is running, switching sections of the song, and Touch is also being switched to different sections (of the same synth) from the same MIDI hardware buttons.

b) The same sliders are being manipulated to control animation and audio effects.

c) One MIDI knob is turned to select the proper next song/Touch-synth, and one button launches the new song and Touch-synth.

I’m pretty sure I’ll have to build one very big “Omni Synth” that contains all the Touch-synths for the entire performance.
I haven’t been able to get both Touch and Live to read the same midi input.
I’ve tried having Touch generate Midi out, but Live can’t find it.
My next move was to try to find some sort of plug-in or stand alone midi software that would allow me to patch/route/duplicate midi.
It’s not ideal, but it would get me up and running.
It would be sweet if pre-recorded audio effects (panning, filtering, etc.) could manipulate Touch,
and that is where my interest in ReWire comes from.
Wave analysis would be great too, but to be used as “the robot” when I run out of fingers and feet…
or as the foundation for further improvisational control.
Max MSP seems to be much more flexible with data flow than Live,
but Live seems the way to go for ease of live performance.
Also Max MSP seems to be much more powerful in the creation of unique audio devices,
which tickles me to no end.

I was afraid I’d have to try to run all three packages at once, with Max MSP as the ultimate interpreter between Touch and Live:
Live talks to Max MSP, Max MSP talks to Touch, Touch talks to Max, Max talks to Live.

I’ll check out what you guys have suggested, and see if that can’t help me out.

thanks!

Jim

Yep, I pretty much want the same thing as you… Over time I will attempt an interface between Live and Touch. I keep praying someone uses the Live Python API to patch Python OSC to Live, which would remove the need to develop a MIDI pipeline from Ableton to Touch, something I am trying to avoid. In the meantime I focus any “audio integration” time I have (which is little to none) on the audio-in capabilities for syncing realtime audio waveforms to Touch. I will definitely have some components available that take audio input and convert it to spectral controller data of some kind within the next few months. Everyone interested please chime in; the more interested others are, the more likely I am to make some time for it. Let’s keep this thread going.

I will report back here when I make something available.

I think MIDI Yoke does this.

You can also publish a pluggo from Max (it’s like a VST plugin), which will then run inside of Live.

Did I get this right: you want to send OSC to an external Python app, and have that app send the OSC to Live via the Python API?

I was thinking the same, but I tend to favor sending data from Touch to Python via Pipe Out, so I can also send/retrieve “strings”.
I haven’t yet figured out whether the LiveAPI actually allows loading a new clip into a slot. Do you know anything about it?
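For the OSC variant, the receiving side could be a small Python app along these lines (a sketch using the python-osc package; the OSC address, port, and the actual LiveAPI call are placeholders, since I haven’t verified what the API exposes):

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_track_volume(address, value):
    # placeholder: call whatever the LiveAPI actually offers here,
    # e.g. something like live.track(0).set_volume(value)  (hypothetical)
    print(address, value)

dispatcher = Dispatcher()
dispatcher.map("/live/track/volume", on_track_volume)          # address is made up
server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)  # port is made up
server.serve_forever()  # Touch sends OSC here; this app forwards to Live
```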

okay guys
here’s an update.

I’m just beginning to get things working.
It’s humble, and I’m most likely not going about this in the proper-ideal-manner,
but I’m learning.
I’m using the MIDI patch-cable routing driver “MIDI Yoke” (works great),
and I think “mapletools” would work just as well… and my Canadian friends might like it better.

the gear/software

Two Midi Hardware controllers: Behringer BCF2000 (mixer/sliders), Axiom 49 (Keyboard)
Two pieces of software: Touch Designer, and Ableton Live.

what I’m doing
Note values off the Axiom, Slider values off the BCF2000,
controlling both Touch and Live.

The set up
a) Note values into Touch via “MIDI In” CHOP 1.
Time slice active.
Separate note channels created (35-85 for my keyboard range).
Cook every frame is on.
The source is one USB device.

b) Control Change values into Touch via “MIDI In” CHOP 2.
Time slice active.
Separate Control Change channels created (81-88 for my eight sliders).
Cook every frame is on.
The source is my other USB device.

c) These two are put into a “Merge” CHOP,
and then piped into a “MIDI Out” CHOP.
The MIDI Out CHOP’s destination is MIDI Yoke port 1.

d) Ableton Live is set to input from MIDI Yoke 1.

why it works
The two software programs can’t read the same USB MIDI port at the same time,
but they have no problem reading the same MIDI Yoke port.
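In case it helps anyone, the same duplication could also be done by a small script outside Touch, so Touch doesn’t have to sit in the middle of the MIDI path. A rough Python sketch with the mido package (the port names are examples only; check mido.get_input_names() / mido.get_output_names() for what your system reports):

```python
import mido  # needs a MIDI backend such as python-rtmidi installed

# example names, replace with the ports your system actually shows
keys_in  = mido.open_input("Axiom 49")
yoke_out = mido.open_output("Out To MIDI Yoke:  1")

for msg in keys_in:                                  # blocks, one message at a time
    if msg.type in ("note_on", "note_off", "control_change"):
        yoke_out.send(msg)                           # copy onto the virtual port
```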

drawbacks, slowdowns

a) Since Touch is doing the bulk of the MIDI work before it reaches Live,
there is a lag between when a note is struck on the MIDI keyboard
and when the sound is generated in Live.
It’s not much, but it’s just enough to screw up timing when banging out a song on a keyboard.

b) Also, if any of the MIDI CHOPs are selected (especially the MIDI Out CHOP),
there is a massive slowdown in performance, so I don’t select them.

Questions

a) Any idea what the sample rate should ideally be set to in my MIDI CHOPs?

b) Any clues on how to reduce lag?

What didn’t work and why

MIDI from any single hardware device/USB port cannot be read by both software programs at once,
unless there is a trick I don’t know about.

The MIDI out of the Touch “Midi Mapper” for the BCF2000 (MIDI mixer) is set up to deliver hardware updates
from the virtual sliders in Touch, so MIDI data can’t be swiped from there.

I was unable (perhaps my fault) to get both the USB out and the MIDI out on my BCF2000 into the computer.
I had to use either the USB or the MIDI outs on the back of my BCF2000.

Ableton Live would not give me continuous MIDI out for the BCF2000.
If I moved a slider around, it would only give me the final CC value once I was done moving it.
That’s no good for Touch input. I hope I can generate better MIDI out somewhere in Live for my audio tracks.

more to come soon

Jim

Again, I don’t know if this is the best way to do this.
I’ve only played with Live for about one full day’s time,
but it’s working.

Getting midi note data out of Live and into Touch

Say you have a MIDI track recorded in Live that you just love,
and you want the note data from that track to go out to Touch.
I found that if you duplicate that track and then place an “External Instrument”
in the duplicate track, you lose sound on the duplicate, but you can assign it to a MIDI Yoke port
and bring it into Touch through the “MIDI In” CHOP.

Jim

Hi Jim,

A few things… First, I’m seeing much the same slowdown with MIDI through my scenes. It happens to me even without any of the MIDI CHOPs selected; I just posted a bug two days ago with a follow-up yesterday.

One thing you could try for the USB port issue is to get a MIDI patcher. This would allow you to output each device to multiple MIDI outs on the box, then run those into yet another MIDI-to-USB interface. Each program could have its own USB port with the MIDI devices copied to each. I bought an Edirol UM-550 that works great for this; I use it to copy MIDI controllers to two computers. Sam Ash had them on close-out about three weeks ago for ~$100.

Jeff

Hi guys

Okay you convinced me.
Time to buy more hardware.

Jarrett: Thanks for the tip on the Edirol FA-101.
I was trying to do things on the cheap, but that is a very fine unit.
It will serve my purposes much better.

Jeff: thanks for the tip on the Edirol UM-550.
I was afraid I’d need something like this unit.
I was trying to avoid spending any more $,
hoping I could get everything to do my bidding through software.
I think this unit will greatly reduce my lag time and allow me to pull some nifty tricks.

Already getting some great results!

thanks again

Jim

Hey Jeff,

Yeah, MIDI is slow for me too.
Here are some things I’ve discovered and am trying.
I don’t know if they will be of any use (or are applicable) to you or anyone else,
but here they are.

I’ve noticed that if I have an empty Touch scene with only a “Performance” CHOP
and I hit play, I get 200-300 FPS.
If I set up my BCF2000 mixer in the “Midi Mapper” and monitor the “in events”,
I get around 30 FPS when moving a slider.
So having that “Midi Mapper” displayed can really suck the refresh time away.

MIDI for me was unworkable at one point,
and this made itself known to me through the “Perform” CHOP and the Performance Monitor.
When I checked out the Performance Monitor,
my “UI draw times” were insane.
I tried optimizing my Touch UI, but the draw time was still out of control.
I restarted Live and Touch with the same scenes, and things were speedy… real speedy.
I think the deal is that when I was playing Touch without Live running (or vice versa),
the break in the MIDI communication did some bad business to Touch and Live.

I’ve been thinking about ways to optimize the use of MIDI.
Imagine you have a MIDI track in Live that is prerecorded with notes and CC panning.
Rather than send that through to Touch,
it might be faster to save that MIDI info out and load it into Touch as a MIDI file.
The triggering of both the Live track and the Touch MIDI file could still be done on the fly from a single controller,
probably with less overhead.
That way you only have to worry about sending MIDI back and forth between programs
for the stuff that you want to improv-perform.
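For anyone curious what that exported data looks like outside of Live, here’s a rough Python sketch with the mido package that pulls the notes and pan CCs out of a .mid file (the filename is a placeholder, and pan living on CC 10 is just the usual MIDI controller assumption):

```python
import mido

mid = mido.MidiFile("section_a.mid")   # placeholder: a clip exported from Live

notes, pans = [], []
for track in mid.tracks:
    ticks = 0
    for msg in track:
        ticks += msg.time                             # delta time, in ticks
        if msg.type == "note_on" and msg.velocity > 0:
            notes.append((ticks, msg.note, msg.velocity))
        elif msg.type == "control_change" and msg.control == 10:   # pan
            pans.append((ticks, msg.value))
```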

I’m going to test this theory once my non-demo version of Live arrives in the mail.
I’m also going to keep hammering away at the communication pipeline to see how I can speed things up.
I’ll post whatever might be remotely useful to anyone.

Jim

Hi Jim,

Yeah, it seems like that would help for an existing track. But for live performance with a bunch of sliders going and a stream of MIDI notes coming in, I don’t know. It seems like MIDI events coming through shouldn’t kill the performance so badly; it doesn’t seem inherently complicated. I don’t recall any MIDI slowdowns back in the 017 days, and it has gotten way worse with recent builds; 257 ran reasonably well. I’m hoping that there are some performance optimizations coming.

Jeff

The following link talks about the Live Python hookup to OSC. This is probably the fastest way to get things going. I have not had time to investigate it.

code.google.com/p/liveapi/issues/detail?id=2

Turns out you can’t load a new sample into a slot via the LiveAPI :frowning: So the only way to build your own file browser seems to be a keystrokes approach, where you need to translate the file path into a sequence of keystrokes the Live browser understands. Then you can get clip names via the LiveAPI, and you should be able to fully perform via a touch panel (including a grid with named clips and a custom file browser to load them up).

Anyway, there’s now a pretty active community at groups.google.com/group/liveapi/topics?gvc=2

So these guys currently wrap the LiveAPI into MIDI, because when using OSC the data is only updated in the display thread, while when sending via MIDI the audio thread is used.

Either way, it seems difficult (impossible?) to get at string data sent from the API:
When using OSC, it seems Touch has no string support, so how do I get clip names from the API into Touch?

When using MIDI, the data from Live comes via sysex, so I wonder how to extract stuff like clip names inside Touch? As far as I can see, there’s only CHOP support and no way to parse the sysex in DATs. Any ideas? The document describing the format is here: assembla.com/wiki/show/live- … i_Protocol
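Outside of Touch, at least, pulling an ASCII name out of a sysex message is simple; here’s a guess at what it would look like in Python with mido (the header length is a placeholder, and the real layout is whatever that protocol document specifies):

```python
import mido

def clip_name_from_sysex(msg, header_len=4):
    """Guess: skip a few protocol/command bytes, decode the rest as ASCII."""
    if msg.type != "sysex":
        return None
    payload = bytes(msg.data[header_len:])   # msg.data excludes the F0/F7 framing
    return payload.decode("ascii", errors="ignore")
```

Something like that could run in an external Python step and feed the names back into Touch, but it still dodges the question of doing it natively in a DAT.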

Achim

Yes, we love all things maple! :mrgreen: