I’ve recently watched a few NODE 2017 conference videos, interesting stuff. One thing got my attention: a library for GPU geometry transformations called Instance Noodles. It seems that in VVVV, people have packed geometry shaders into nodes so they can be composed one after another, e.g. fisheye → wobble → extrude.

I wonder if composing geometry operations like this is possible in TD, or whether the only option here is to combine all the operations into a single GLSL file?

I didn’t get a chance to look at the link, but you can attach a geometry shader and write your own geometry stage before the result gets passed to the pixel shader. By default the GLSL TOP just gives you a quad that fills the frame and generates UV coordinates for you. Would that be similar to those VVVV nodes you linked, if those shaders were inside a container with custom parameters?
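
To make the idea concrete, here is what a minimal pass-through geometry stage looks like in plain GLSL. This is a generic sketch, not TouchDesigner-specific code: the `uvVert`/`uvGeo` interface names are my own placeholders, and TD declares its own inputs and outputs for you.

```glsl
#version 330
// Minimal pass-through geometry shader (generic GLSL sketch; interface
// block names like uvVert/uvGeo are hypothetical, not TD's declarations).
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

in vec2 uvVert[];   // UVs received from the vertex shader
out vec2 uvGeo;     // UVs forwarded to the pixel shader

void main() {
    // Emit each incoming vertex unchanged; a real "fisheye" or "wobble"
    // stage would modify gl_Position here before emitting.
    for (int i = 0; i < 3; i++) {
        gl_Position = gl_in[i].gl_Position;
        uvGeo = uvVert[i];
        EmitVertex();
    }
    EndPrimitive();
}
```

A chainable node would essentially wrap one such stage, with its displacement amount exposed as a custom parameter.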

I’ve checked the GLSL TOP: it can run a vertex & pixel shader pair or a compute shader, and the output is a 2D/3D texture.

To chain GPU geometry transformations, the GLSL node would need to run a geometry shader and accept a collection of vertices (or a geometry?) on its input/output.

So chaining GPU geometry transformations as nodes is currently not possible (if my understanding is correct).
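
Since the GLSL TOP's input and output are textures, the closest thing to a chainable stage would be to pack vertex positions into a texture and let each GLSL TOP transform them with a compute shader, one TOP per operation. A rough sketch of one such stage follows; note that TouchDesigner predeclares its compute bindings for you, so the `sTD2DInputs` / `mTDComputeOutputs` declarations below are illustrative, and the names are my assumption about TD's conventions.

```glsl
#version 430
// One "geometry transform" stage as a compute shader: positions are
// packed into an RGBA32 float texture (xyz = position), and each stage
// reads the previous stage's texture and writes transformed positions.
// Binding declarations are illustrative; TD normally declares these itself.
layout(local_size_x = 8, local_size_y = 8) in;
layout(rgba32f) uniform image2D mTDComputeOutputs[1]; // output texture
uniform sampler2D sTD2DInputs[1];                     // input texture

void main() {
    ivec2 p = ivec2(gl_GlobalInvocationID.xy);
    vec3 pos = texelFetch(sTD2DInputs[0], p, 0).xyz;
    pos.y += 0.1 * sin(pos.x * 6.2831);   // example "wobble" displacement
    imageStore(mTDComputeOutputs[0], p, vec4(pos, 1.0));
}
```

The final texture could then drive rendering, e.g. as instance positions, which (as far as I understand) is similar in spirit to how Instance Noodles represents geometry between nodes.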