Hardware for TD interactive projections

Say one is creating a TouchDesigner project for an interactive projection: something is projected, the audience touches or otherwise interacts with the projection, and the content of the projection changes in response.

Any tips or advice on portable, good-quality hardware for these purposes at a reasonable cost:

  • cameras or other sensors to detect people’s behaviours
  • 4k projectors

thank you for any tips :slight_smile:

Such big questions, and so little information about what you want to do!
– camera tracking, with problems of lighting, occlusion, etc.
– depth tracking (Kinect…), with problems of distance, field of view, etc.
– Leap Motion, touch screens
– motion capture (BlackTrax, Rokoko…)
For projectors:
– size of the projection surface
– distance
– ambient light
So I think you first have to do some research about what you want to build, how much time and money you have, and which friends you have.
Perhaps search for videos of the desired result and ask how it was done.
Jacques


Yes, all great points, a great summary.
I think for my current purpose the Kinect may be the way to go.
Thank you again :slight_smile:

@jacqueshoepffner
So for a next experiment I will use a short-throw projector and a Kinect 2.
I will project visuals on a wall with the projector, and when a person approaches or touches an area of the visuals, an animation will be triggered around that point on the wall.

Is there any .toe/.tox example or tutorial where I can see how people typically interconnect these different parts in TouchDesigner? The input, the output, and the transformations in between?

I found this one for example

which is great; it makes very clear how to read from the Kinect and, for example, drive a particle system.

But now suppose I want to detect the specific point where a person touches a wall or the ground, and from that point have an animation of a flower growing emerge. How would you go about that?

thank you :slight_smile:

Hello,
What you want to do is not so easy with the Kinect, because the person's body will hide the touch point… In the video you showed, it works because one person is making large gestures in front of the Kinect.
If you want to detect touches, there are infrared frames that you can install on the wall to give you the position of the finger touching it. There is also a device made by Epson for digital whiteboards that uses an infrared laser to detect the hand position.
I used that for two installations I made with a firm in the west of France. It works well, but the surface is not very big.


Interesting, @jacqueshoepffner. What you are describing sounds similar to how touch screens work on some laptops: a grid of infrared rays that your finger interrupts, which gives the position.

But yes, that implies a more complex setup indeed.

I am currently in the same position as you, learning and deciding on a hardware setup and composition. My advice would be to work with what you have. It might be tempting to start with huge, complicated art installations, but it will be easier to set up a screen and a single Kinect and gain some experience for larger works. Limiting yourself this way will also teach you more about your possibilities, I think.

We've had great success using the Kinect as a touch device. You just need to orient the Kinect to look down the wall, then slice the point-cloud data with a simple GLSL script to isolate the touch interaction. You can then process the result with blob detection. Here are two examples, one on a wall and one on a table, both using projection.
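To make the slab-slicing idea concrete, here is a minimal NumPy sketch, assuming the Kinect point cloud has already been transformed into wall coordinates with z as the distance from the wall. The function name, the thresholds, and the greedy clustering are all illustrative assumptions, not how any particular .tox does it; inside TouchDesigner the slice would typically live in a GLSL TOP, with a Blob Track TOP downstream.

```python
import numpy as np

def find_touch_points(points, slab_min=0.02, slab_max=0.10, cluster_radius=0.08):
    """Slice a point cloud into a thin slab just in front of the wall and
    return one (x, y) centroid per touch 'blob'.

    points: (N, 3) array in wall coordinates, z = distance from the wall (metres).
    """
    # Keep only points hovering in the slab near the wall surface.
    in_slab = (points[:, 2] > slab_min) & (points[:, 2] < slab_max)
    xy = points[in_slab][:, :2]

    # Naive greedy clustering: join the nearest existing cluster,
    # or start a new one. Fine for a handful of touch points.
    clusters = []  # each entry is [running_xy_sum, point_count]
    for p in xy:
        for c in clusters:
            centre = c[0] / c[1]
            if np.linalg.norm(p - centre) < cluster_radius:
                c[0] = c[0] + p
                c[1] += 1
                break
        else:
            clusters.append([p.astype(float), 1])
    return [tuple(c[0] / c[1]) for c in clusters]
```

Each returned centroid can then be mapped into the projected image's coordinate space to spawn an animation (the growing flower, say) at the touch point.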

In our experience it can be good to use two Kinects: one behind, for body tracking and larger-scale interactions, and one looking down the surface for the technique above.

Bear in mind that if you want to touch a projection surface, you'll need an ultra-short-throw projector with a mirror lens, to avoid users blocking the projection with their bodies.

Nice!

How do you run two Kinects in TouchDesigner at the same time? Do you just plug them both into your computer?

Yes, exactly that. If you want to, you can also split them across multiple PCs and stream the data via OSC or the Touch In/Touch Out operators.
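For the multi-PC route, TouchDesigner has OSC In/Out CHOPs and DATs built in, and from plain Python one would normally reach for a library such as python-osc. But just to make the wire format concrete, here is a stdlib-only sketch of packing an OSC message. The `/touch` address, the IP, and the port are made-up examples.

```python
import socket
import struct

def osc_message(address, *floats):
    """Build a minimal OSC packet: address string, type-tag string, then
    big-endian 32-bit floats. Strings are NUL-terminated and padded to a
    4-byte boundary, per the OSC 1.0 spec."""
    def pad(b):
        # Always at least one NUL, total length a multiple of 4.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode())
    msg += pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Send a (hypothetical) normalized touch position to a second machine.
packet = osc_message("/touch", 0.5, 0.25)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(packet, ("192.168.1.20", 7000))  # uncomment on a real network
```

On the receiving machine, an OSC In CHOP listening on the same port would expose the two floats as channels you can wire into the rest of the network.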