Hello everyone,
I am a French art student currently working on an interactive installation project.
I am using a camera as my main input, and I am now looking to transform body-movement data into sound and interactive visuals in TouchDesigner. I chose TouchDesigner because it seems to be one of the most effective tools for translating physical movement into real-time audiovisual interaction.
The core idea of my project is to create a system where the presence, movement, direction, and intensity of the body in space directly influence:
- sound generation and sound textures,
- visual elements such as shapes, colors, or motion-based graphics.
At this stage, I am able to capture video input from the camera, but I am still exploring the best ways to:
- extract meaningful movement data from the image,
- convert this data into stable and expressive control signals,
- map these signals to audio parameters (volume, texture, spatialization),
- and connect the same data to visual behaviors in a coherent way.
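To make the middle steps concrete, here is a rough sketch of the logic I have in mind, written as plain Python (TouchDesigner's scripting language) rather than as actual TD operators. Everything here is an assumption on my part: `motion_intensity`, `EMASmoother`, and `to_control` are hypothetical names, and in a real patch the frame differencing, smoothing, and range mapping would more likely be done with TOPs/CHOPs than in a script.

```python
# Hypothetical sketch, not TouchDesigner API: derive a motion-intensity
# value from two grayscale frames, smooth it, and normalize it to a
# 0-1 control signal that could drive audio or visual parameters.

def motion_intensity(prev_frame, frame):
    """Mean absolute pixel difference between two grayscale frames (0-255)."""
    total = sum(abs(a - b) for a, b in zip(prev_frame, frame))
    return total / len(frame)

class EMASmoother:
    """Exponential moving average: higher alpha reacts faster, lower alpha is smoother."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None

    def update(self, x):
        if self.value is None:
            self.value = x
        else:
            self.value = self.alpha * x + (1 - self.alpha) * self.value
        return self.value

def to_control(value, lo=0.0, hi=30.0):
    """Clamp and map a raw intensity into a stable 0-1 control signal."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

# Toy example with two 4-pixel "frames":
prev, cur = [10, 10, 10, 10], [10, 30, 10, 50]
raw = motion_intensity(prev, cur)      # (0 + 20 + 0 + 40) / 4 = 15.0
smoother = EMASmoother(alpha=0.5)
smoother.update(0.0)                   # seed with a quiet frame
ctrl = to_control(smoother.update(raw))  # smoothed 7.5 -> 0.25
print(raw, ctrl)
```

In an actual network I imagine the equivalent would be something like a frame difference between TOPs analyzed down to a single channel, a Lag or Filter CHOP for the smoothing, and a Math CHOP for the range mapping, but I would welcome corrections on that.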
As an art student, I see this project as both technical and artistic, and I am still learning TouchDesigner. I would greatly appreciate practical advice, recommended workflows, or patch-building strategies for transforming camera-based body movement into sound and interactive visuals.
Any help, examples, or references would be extremely valuable.
Thank you very much for your time and support.
Best regards,
A French art student exploring body movement, sound, and interactive media