I’ve been doing some research on TouchDesigner for the purposes of live event visuals. Specifically, I’m looking into using a RealSense camera for tracking performers’ movements and feeding that data into my TD effects.
From what I’ve seen, this is effective for things like video art installations, but how does it fare in an auditorium-like setting, or even outdoors for, say, a small music production? Have users had success with TD and RealSense in this context? For example, are there issues with the camera not picking up movement from 20+ feet away?
Apologies if this is more of a question for Intel. I may reach out to them too, but I wanted to know how this technique works out for live shows, and whether it’s a viable option.
RealSense likely isn’t the correct solution for large-scale tracking like that. You’ll want to look at something like BlackTrax or OptiTrack, which use markers/trackers. Or wrnchAI for video-based person tracking.
Another option is something like the Vive Trackers, if your space is medium-sized.
Tracking in a large venue typically has more specialized requirements: you need a tracking solution that works at larger distances, holds up under stage lighting conditions, and includes a way to get data from your sensor to your media server across that distance.
+1 to @malcolm’s suggestions for BlackTrax and Optitrack.
Depending on what you’re doing, you may consider a few different solutions. If you’re more interested in capturing the audience as a point cloud, then lidar tools like the Ouster make good sense. If you need high-fidelity tracking of a performer, you may end up needing a solution like BlackTrax, which uses markers to help you keep track of that performer.
For smaller budgets, or for proof-of-concept work, you could certainly experiment with tools like the Kinect or RealSense, but those typically don’t work well in larger venues.
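On the sensor-to-media-server communication point: OSC over UDP is the usual way to get tracking coordinates into TouchDesigner (via an OSC In CHOP or DAT). As a minimal sketch, here's a pure-stdlib Python sender that hand-builds an OSC packet; the address `/performer/1`, the port `7000`, and the host are placeholder assumptions you'd match to your own OSC In settings.

```python
# Minimal sketch: send a performer's (x, y, z) position to TouchDesigner's
# OSC In CHOP over UDP. Stdlib only -- the OSC packet is built by hand.
# "/performer/1", port 7000, and the host below are placeholders; set them
# to whatever your OSC In CHOP is configured to listen for.
import socket
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message with float32 args (big-endian, 4-byte aligned)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32 arguments
    return msg

def send_position(sock: socket.socket, host: str, port: int,
                  x: float, y: float, z: float) -> None:
    """Fire one position update at the media server (fire-and-forget UDP)."""
    sock.sendto(osc_message("/performer/1", x, y, z), (host, port))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # In practice you'd call this in your camera/tracker read loop.
    send_position(sock, "127.0.0.1", 7000, 1.0, 0.5, 3.2)
```

Because UDP is connectionless, this works the same whether the sensor machine and media server are one box or on opposite ends of the venue's network; for a production rig you'd add a sequence number or switch to a tracking protocol like PSN, but for proof-of-concept work plain OSC is usually enough.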
Thank you both so much for your replies. Going off what @raganmd said, this would be for a proof of concept/prototype, so it’s not important that it works without a hitch. If it works ‘somewhat’, I’m good with that for now, until I can get a grant to fund some of the bigger-budget tracking tools. So at this point I will likely go with either Kinect or RealSense; BlackTrax is out of the question for now. I might also try the Vive method, but from what I understand, I’d need to buy the whole Vive system for this to work, not just the tracker (correct me if I’m wrong). Also looking into wrnchAI, because I’ve never heard of it and it sounds interesting. Thanks a lot, guys!