Migrating from Isadora: Layer-based MIDI Control & Parameter Routing Architecture

I’ve used TD over the years for very specific operations, but never for a larger project. I would say that I am an intermediate noob.

I recently did a Leap Motion network in TD that is gloriously useful and has inspired me to consider migrating a control system from Isadora to TouchDesigner.

I'm particularly excited by the multichannel CHOP workflow, the audio processing capabilities, the Leap Motion CHOP and the data manipulation capabilities of TD - the Sort CHOP is super cool. I am, however, struggling to find a clean way to manage routing MIDI control parameters in my layer-based control system.

Current Isadora System

Core Components

  • 4 video layers controlled by MIDI
  • Parameter processing and modulation
  • OSC output to Resolume

Data Flow

  • MIDI input → Two-key JSON (controllerPage:controllerName) → video layer user actor → target-parameter processor actors (the two-key lookup idea is sketched after this list).
  • Parameter processors access JSON values using a table lookup via layer number + target-name
  • Processors use constants from table
  • Processed parameters → OSC to Resolume
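For anyone who thinks better in code, here is a minimal, purely illustrative sketch of that two-key routing idea in plain Python. The keys, the layer/target names and the handle_parameter() function are hypothetical stand-ins, not anything from the actual Isadora patch.

    # Illustrative only: resolve "controllerPage:controllerName" keys to a layer
    # number and a target-parameter name, then hand the value to a (hypothetical)
    # per-layer parameter processor.
    ROUTING = {
        "xlp1:xlb2": {"layer": 1, "target": "opacity"},
        "xlp1:xlb3": {"layer": 1, "target": "scale"},
    }

    def route_midi(page, control, value):
        entry = ROUTING.get(f"{page}:{control}")
        if entry is None:
            return                      # unmapped control, ignore it
        handle_parameter(entry["layer"], entry["target"], value)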

Proposed TD Architecture

Core Structure

  • 4 layer-specific Select CHOPs for parameter storage
  • 4 layer-subnets containing many target-parameter processor subnets
  • Target-parameters processed based on layer+target-parameter inputs and lookup table
  • TauCeti preset manager for state handling

Parameter Access

  • Dynamic table-driven absolute references: layer number + parameter name (a possible expression for this is sketched after this list)
  • Access reference paths and processing constants stored in the table
  • Table data selection example: “layer 1, xlp1:xlb2” for Streamdeck XL page 1, button 2
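To make that concrete, here is a minimal sketch of what such an expression could look like, assuming a Table DAT named 'paramTable' with rows labelled by target-parameter name and columns 'layer1' to 'layer4', plus custom parameters 'Target' and 'Layer' on the processing COMP - all of these names are assumptions, not anything that exists yet.

    # Parameter expression on e.g. a Constant CHOP inside a processing COMP.
    # 'paramTable', 'Target' and 'Layer' are assumed names, not TD built-ins.
    float(op('paramTable')[parent().par.Target.eval(),
                           'layer' + str(parent().par.Layer.eval())])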

My Key Questions

Absolute References for op parameter values:

  • Is this viable for 100s of parameters per layer - albeit mostly momentary parameter changes from MIDI controllers?
  • Alternative approaches seem less intuitive (exports, indexed single-channel streams)

Dynamic Updates:

  • How do I update CHOP absolute references from a table?
  • Table contains: layer name, parameter name, reference access paths, processing constants (one possible approach is sketched below)
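To show what I mean, here is a sketch only - every path, column name and custom parameter ('Layer', 'Target', 'selectIn') is an assumption - of a script that walks the processing COMPs and re-points each one's Select CHOP from a routing table:

    # Sketch: re-point each processor's Select CHOP from a routing Table DAT.
    # Assumed table columns: layer | target | sourcePath | sourceChannel.
    routing = op('routing')

    for comp in op('/project1/layers').findChildren(type=baseCOMP, depth=1):
        layer  = str(comp.par.Layer.eval())       # assumed custom parameters
        target = comp.par.Target.eval()
        for r in range(1, routing.numRows):       # row 0 is the header
            if routing[r, 'layer'] == layer and routing[r, 'target'] == target:
                sel = comp.op('selectIn')         # assumed Select CHOP name
                sel.par.chop = routing[r, 'sourcePath'].val           # CHOP to select from
                sel.par.channames = routing[r, 'sourceChannel'].val   # Channel Names parameter
                break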

Layer Switching:

  • Best practice for updating MIDI controllers on layer change?
  • Previous solution: reload state with a delay (one way this might translate to TD is sketched below)
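Not claiming this is best practice, but one way the layer-change feedback might look: push the newly active layer's stored values back out through a MIDI Out CHOP so motorised faders and LED rings follow. The operator names, the table layout and the delayed run() at the end are all assumptions.

    # Sketch: send a layer's stored CC values back to the controller.
    # 'layerState1'..'layerState4' and 'midiout1' are assumed names.
    def refresh_controller(layer):
        state = op('layerState' + str(layer))
        mout  = op('midiout1')
        for r in range(1, state.numRows):         # row 0 is the header
            mout.sendControl(1, int(state[r, 'cc']), int(state[r, 'value']))

    # Rough equivalent of the old "reload state with delay" trick:
    # run("args[0](args[1])", refresh_controller, 2, delayFrames=30)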

My wishlist

  • Avoid complex muxing/demuxing
  • Python for reference assignment, not processing any streaming data.
  • Easy maintainability and simplicity

Any guidance would be greatly appreciated. Thanks!

_J

Hello,
I came from the same background, 15 years using and teaching Isadora, mainly in dance and theater.
I decided to switch to TouchDesigner when the macOS version arrived, following many frustrations with Isadora:
– lack of 3D
– lack of code (the Javascript actor was a joke)
– lack of reusability for interfaces (numbering was a nightmare)
– lack of proper funding for development
– lack of stability
To name a few.
I had to relearn, rebuild, rethink all my processes, and it took me 3 months at 6 hours a day, but I never regretted it.
My first piece of advice is not to try to reproduce what you were accustomed to in Isadora, mainly scenes and layers.
– learn the software with all its peculiarities
– analyse what you really need and how you can do it in TD (not reproduce it)
In my practice, each project is a prototype. I build it in a very modular and procedural way, separating UI, inputs, image creation and processing, sound creation and processing, output preparation (mainly 3D mapping) and outputs.
– With time (8 years now), I have a collection of prepared toxes for every need: MIDI consoles, GLSL operators to cut and map films, etc.
Concerning layers, the easiest way for me is to separate the different processes into Base COMPs with custom parameters and sound and image outputs, and I assign a nanoKONTROL MIDI slot to each one (fader for image level, pot for sound level, S for start, M for pause, R for rewind), and more for more complicated processes.
All image outputs go into a Composite TOP (Add), and all sound outputs go to a Math CHOP (Combine CHOPs: Add).
For the parameters, I normally use a set of tables backed by external .csv files (easier to edit), storing the parameters in rows and the cues in columns. The main table references the other tables, allowing multi-dimensional settings. If necessary, I introduce time into it, but I prefer to do it manually with the fader, according to the type of performance.
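Something like this sketch, if it helps (the operator and helper names are only examples): the .csv is loaded into a Table DAT called 'cues', with parameter names in the first column and one column per cue.

    # Sketch: recall one cue column from a parameters-in-rows table.
    def apply_cue(cue_label):
        cues = op('cues')                       # Table DAT fed by the external .csv
        for r in range(1, cues.numRows):        # row 0 holds the cue labels
            name  = cues[r, 0].val              # parameter name (first column)
            value = float(cues[r, cue_label])   # value stored under this cue
            apply_value(name, value)            # hypothetical helper that sets the parameter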
For some projects I used JSON, mainly for the possibility of storing and editing huge lists of cues and parameters. I mainly use Visual Studio Code to edit it, and I made special tools in Python to edit, save, insert and delete cues.
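Purely as an illustration of that kind of external tool (the file layout, with a top-level 'cues' list, is an assumption):

    # Sketch: insert a cue into a JSON cue list edited outside TouchDesigner.
    import json

    def insert_cue(path, index, cue):
        with open(path) as f:
            data = json.load(f)
        data['cues'].insert(index, cue)         # assumes a top-level "cues" list
        with open(path, 'w') as f:
            json.dump(data, f, indent=2)

    # insert_cue('show.json', 3, {'name': 'blackout', 'layer1_opacity': 0.0})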
Unfortunately each experience is different and I lack information on your work to help you more deeply.
Don't hesitate to come back with questions.
I gave a workshop about it at the 2019 Montreal meeting, and I published some tools for theatrical performances (though I think I need to update them).
Good luck for your projects.
Jacques


Thanks, Jacques, for your detailed and considered response. I particularly appreciate your modular workflow; I'm working on doing something like this, too. I love Isadora but understand its limitations in working with large sets of parameters and state management. They do have the new Python actor, which I am sure will make for some exciting work.

For this specific project I think the audio analysis tools and CHOP channel workflow in Touch Designer will make the whole project lighter and easier to work with.

I am not doing any video processing in the Isadora/Touch project - it receives MIDI data from controllers, maps the data to specific target parameters, modulates and processes the parameters, and then sends it out over OSC to Resolume, which handles all the video playback and effects in 4 layers. This is for an audio-reactive project.
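For context, the last hop of that chain might look something like this sketch, assuming an OSC Out DAT named 'oscout1' pointed at Resolume; the address pattern follows Resolume's layer-opacity OSC scheme, but check it against the version in use.

    # Sketch: push a processed value to a Resolume layer over OSC.
    def send_layer_opacity(layer, value):
        op('oscout1').sendOSC('/composition/layers/%d/video/opacity' % layer,
                              [float(value)])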

I worked on a few mermaid diagrams to visualize the workflow.

ROOT LEVEL

MODULATOR COMP (one of many)

TARGET PARAMETER PROCESSING COMP (one of many in a layer)

I think in all of this my main question is about routing hundreds of CHOP channels:

  • Is it viable to use absolute references in CHOPs to access 100s of parameters - mostly momentary parameter changes from MIDI controllers and 4-10 60 fps audio/noise/wave modulation streams?
  • Alternative approaches seem less intuitive (exports, indexed single-channel streams, parameter packaging in JSON like in my current system, etc.).
  • How can I dynamically assign absolute references in a CHOP using a table and a key?

Thanks again for any input you might have.
_J

  • Is it viable to use absolute references in CHOPs to access 100s of parameters - mostly momentary parameter changes from MIDI controllers and 4-10 60 fps audio/noise/wave modulation streams?
    If you have your data in a table, you can reference your CHOP value using:
    op('myTable')['row', 'col']
  • Alternative approaches seem less intuitive (exports, indexed single-channel streams, parameter packaging in JSON like in my current system, etc.).
    Tables are perfect for that; it depends on many factors (speed, complexity…).
  • How can I dynamically assign absolute references in a CHOP using a table and a key?
    I don't exactly understand the question, but you can create a CHOP with one channel per table row using a Select DAT (to isolate the cue) and a DAT to CHOP (to build the CHOP), each channel labeled with the first column (see the sketch below). You can also use Python to create a CHOP from a DAT and to add/delete rows or columns.
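For example (the names here are only assumptions): if the DAT to CHOP at the end of that chain is called 'cueVals' and its channels take their names from the table's first column, any downstream operator can read a value like this:

    # In a script:
    val = op('cueVals')['layer1_opacity'].eval()   # current value of one channel

    # Or directly as a parameter expression on the target operator:
    # op('cueVals')['layer1_opacity']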

This sounds like the right direction: If you have your data in a table, you can reference your CHOP value using:
op('myTable')['row', 'col']

I will test this out. Does the destination CHOP cook when the table is updated, or only when the selected 'row','col' cell is updated?

“I don’t exactly understand the question:”

I mean that the parameter processing COMPs are all instances of the same COMP and will have two constant inputs that determine how each one processes parameter data: layer number and target parameter name. These two constants provide the paths to access parameter- and layer-specific constants in a table.
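As a sketch of how those two constants could drive the lookup (every name here is an assumption): each instance composes a row key from its 'Layer' and 'Target' custom parameters and pulls its processing constants from a shared table whose rows are keyed 'layer1:opacity', 'layer2:scale', and so on.

    # Sketch: per-instance constants lookup keyed by layer + target name.
    # Assumed table 'constants' columns: key | min | max | curve (illustrative).
    def my_constants():
        key = 'layer%d:%s' % (parent().par.Layer.eval(), parent().par.Target.eval())
        t = op('/project1/constants')              # assumed shared table
        return {t[0, c].val: t[key, c].val for c in range(1, t.numCols)}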

I will play with the idea of working with DATs to move data around instead of CHOP channel streams and see what I can come up with.

Thanks for your input, Jacques.

Best,

_J

I did some experimenting with different ways of referencing the midi controller data in comps and subcomps.

Exporting DAT Table by Name is very powerful in this spot. It will allow me to route all the MIDI parameters to the target parameter processing COMPs with tables, which is really great. I just need to name all the instances of the parameter processing COMPs after the target parameter name, and I can keep the routing in the root composition, which is very tidy.
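In case it helps anyone following the same path, here is a sketch of that naming convention (the paths, template COMP and target list are all assumptions): each target parameter gets its own processing COMP, cloned from a template and named after the parameter, so a root-level table can address it by name.

    # Sketch: create one processing COMP per target parameter, named after it.
    targets   = ['opacity', 'scale', 'rotation', 'feedback']   # illustrative
    template  = op('/project1/procTemplate')                   # assumed template COMP
    layerComp = op('/project1/layer1')

    for t in targets:
        if layerComp.op(t) is None:                # skip targets that already exist
            layerComp.copy(template, name=t)       # clone under the target's name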

This answers my question about “dynamic references”, too. I can assign any MIDI CHOP channel to any parameter in any instance of the processing COMPs using tables from the root COMP. Excellent. Thanks again for your help.

_J