THE GOAL:
To dynamically illuminate a predetermined path in front of people based on their direction of movement. If a person changes direction, the lit pathway should adjust and follow their new heading, and the system should handle people moving in both directions simultaneously. I have the projection mapping side under control; it’s the directional detection and dynamic path lighting that I’m focused on.
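For context, I think the path math itself is manageable once I have a position and a heading per person: sample points ahead of the walker along the heading vector and light those. A minimal sketch in plain Python (all names are mine, nothing TouchDesigner-specific yet):

```python
import math

def path_points(pos, heading, length=3.0, steps=10):
    """Sample points along a straight path ahead of a walker.

    pos     -- (x, y) current position in world units (e.g. metres)
    heading -- (dx, dy) direction of travel (normalized below)
    length  -- how far ahead the lit path should extend
    steps   -- number of sample points (e.g. one per projected light pool)
    """
    mag = math.hypot(heading[0], heading[1])
    if mag < 1e-6:                 # standing still: no path to light
        return []
    ux, uy = heading[0] / mag, heading[1] / mag
    return [(pos[0] + ux * length * i / steps,
             pos[1] + uy * length * i / steps)
            for i in range(1, steps + 1)]

# Two walkers heading opposite ways get two independent paths:
print(path_points((0.0, 0.0), (1.0, 0.0)))    # walking +x
print(path_points((5.0, 2.0), (-1.0, 0.0)))   # walking -x
```

Each tracked person gets their own call, which is how I’m hoping two-directional traffic falls out for free.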
My Problem:
Hello TouchDesigner community,
I’ve successfully connected to an Ouster OS1-64 (note: it’s an older unit, which requires TouchDesigner 2021.1) and can access the RGBA panorama with a simple Ouster TOP, giving me a visual representation of what the Ouster perceives. I’ve also visualized this data in 3D.
However, I’m running into trouble when attempting to use this data for blob tracking. I’m not sure whether a flatter 2D conversion is essential, but my understanding is that a single slice of the 64 lasers (centered vertically) should be enough to capture people’s positions for my needs. I’m weighing whether to crop by elevation or isolate a specific laser; a rough sketch of the cropping idea is below.
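In case it helps anyone answering, my current guess at the cropping route is just a Crop TOP on the panorama to keep the middle rows. A wiring sketch in Python; the parameter and menu-token names are from memory and need checking against the Crop TOP’s parameter dialog:

```python
# Keep only the middle rows (central lasers) of the 64-row panorama.
# Parameter/menu token names are assumptions -- verify in the Crop TOP dialog.
crop = op('/project1').create(cropTOP, 'center_slice')
crop.inputConnectors[0].connect(op('ouster1'))  # 'ouster1' = my Ouster TOP

crop.par.cropbottomunit = 'pixels'   # assumed menu token
crop.par.croptopunit = 'pixels'
crop.par.cropbottom = 28             # rows 28..36 of 64 ~ the central beams
crop.par.croptop = 36
```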
I’m seeking assistance in setting up the required nodes for this functionality. I expect to handle calibration aspects such as scale and projection matching myself; my immediate need is to establish the core functionality, so I can dig into the visual side once that foundation is in place.
Moreover, I’m preparing to present this as an art installation for a local festival this Wednesday, attended by approximately 100 enthusiasts. Any guidance, insights, or suggestions would be immensely appreciated!
Thank you in advance for your help!
THIS IS WHAT I THINK WE NEED. I’m new to this, so please review it and tell me what I’m missing, or whether there’s a better approach, because I don’t know this tool that well.
- Data Acquisition and Preparation
Ouster TOP: Drag and drop the Ouster TOP into your workspace.
I have the LiDAR connected and working with the Ouster TOP. I’m not sure how to extract the right subset of data to get a 2D overhead view, or whether we even need that; a sketch of one possible conversion is below.
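Here’s my rough idea for the overhead conversion, in case it’s on the right track: read the range image into NumPy (I believe TOP.numpyArray() can get me there, though the channel layout and range units coming out of the Ouster TOP are things I still need to confirm) and splat one laser’s returns into a top-down occupancy image:

```python
# Sketch for a Script TOP: turn one horizontal slice of the range panorama
# into a top-down occupancy image for blob tracking. Assumes ranges arrive
# as metres, one value per azimuth column -- both assumptions need checking.
import numpy as np

GRID = 256        # output resolution (GRID x GRID pixels)
EXTENT = 10.0     # metres covered from the sensor outward

def overhead(ranges_m):
    """ranges_m: 1-D array of ranges (metres) for one laser, one per azimuth."""
    n = len(ranges_m)
    az = np.linspace(-np.pi, np.pi, n, endpoint=False)  # azimuth per column
    x = ranges_m * np.cos(az)
    y = ranges_m * np.sin(az)
    img = np.zeros((GRID, GRID), dtype=np.float32)
    # world (x, y) in [-EXTENT, EXTENT] -> pixel indices
    px = ((x / EXTENT + 1.0) * 0.5 * (GRID - 1)).astype(int)
    py = ((y / EXTENT + 1.0) * 0.5 * (GRID - 1)).astype(int)
    ok = (ranges_m > 0.1) & (px >= 0) & (px < GRID) & (py >= 0) & (py < GRID)
    img[py[ok], px[ok]] = 1.0   # mark returns as white pixels
    return img
```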
- Blob Tracking
Blob Track TOP (not sure if the Blob Track TOP will work with the Ouster OS1’s output).
Adjust parameters based on the environment, focusing on:
Minimum Blob Size and Maximum Blob Size to cater to human sizes.
Threshold to fine-tune detection sensitivity.
Maximum Move Distance to manage fast-moving objects (probably not necessary).
Toggle Draw Blob Bounds on to visualize the detection. (I’d love to mask out areas that shouldn’t be affected; see the masking sketch after this list.)
Adjust the Threshold parameter to get a binary image based on the difference between the background and the primary input.
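For the masking part, my plan is to multiply the overhead image by a hand-painted black/white mask before the Blob Track TOP ever sees it. A wiring sketch; node names are mine:

```python
# Zero out areas that shouldn't trigger blobs before blob tracking.
# 'overhead1' is my top-down image; 'mask1' is a black/white mask TOP.
mult = op('/project1').create(multiplyTOP, 'mask_mult')
mult.inputConnectors[0].connect(op('overhead1'))
mult.inputConnectors[1].connect(op('mask1'))   # black = ignore, white = keep

blob = op('/project1').create(blobtrackTOP, 'blobs1')
blob.inputConnectors[0].connect(mult)
```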
Cleaning up Blob Detection:
Track individual blob data such as coordinates, ID, and size.
Use that data for trajectory prediction, logic assessments, or to trigger events based on blob movement (this can be cheated if necessary; a sketch of one approach follows).
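My guess at the per-blob tracking is an Execute DAT that keeps a short position history per blob ID and derives a heading from it. This assumes the Blob Track TOP exposes a Python .blobs list with .id/.u/.v members, which I’ve seen referenced but haven’t verified for 2021.1:

```python
# Execute DAT sketch (Frame Start callback): per-ID history -> heading.
history = {}   # blob id -> list of recent (u, v) positions

def onFrameStart(frame):
    blobs = op('blobs1').blobs or []   # .blobs is an assumption -- verify
    for b in blobs:
        pts = history.setdefault(b.id, [])
        pts.append((b.u, b.v))
        if len(pts) > 10:
            pts.pop(0)                 # keep roughly the last 10 frames
        if len(pts) >= 2:
            du = pts[-1][0] - pts[0][0]
            dv = pts[-1][1] - pts[0][1]
            # (du, dv) is the heading to feed the path-lighting logic
    # drop IDs that vanished this frame
    for dead in set(history) - {b.id for b in blobs}:
        del history[dead]
    return
```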
- Providing Directions
Use the blob data to generate guidance. For visual directions:
Be able to attach visuals, geometry, or particles to someone’s location (see the instancing sketch below).
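My plan for attaching visuals is to dump blob positions into a Table DAT every frame and point a Geometry COMP’s instancing at it (tx/ty columns). Again a sketch, with the same .blobs assumption and placeholder remap math:

```python
# Execute DAT sketch: blob positions -> Table DAT for geometry instancing.
def onFrameStart(frame):
    t = op('blob_table')               # a Table DAT I created by hand
    t.clear()
    t.appendRow(['tx', 'ty'])
    for b in op('blobs1').blobs or []: # .blobs is an assumption -- verify
        # remap 0..1 UV into my projection's world range (placeholder math)
        t.appendRow([b.u * 10.0 - 5.0, b.v * 10.0 - 5.0])
    return
```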
- Output and Display
Integrate a Window COMP to project or display your combined visuals (original data + blob detection + guidance visuals).
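For the output itself I’m assuming a composited TOP feeding a Window COMP is enough; the parameter names below are from my reading of the Window COMP docs and need verifying:

```python
# Open the output window on the projector (par names are assumptions).
win = op('out_win')        # a Window COMP whose Operator points at my composite
win.par.borders = False    # borderless for projection
win.par.winopen.pulse()    # open the window
```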
- On-Site Adjustments
Once set up, you’d need to make on-site calibrations:
Adjust the Threshold on the Blob Track TOP.
Modify the Minimum Blob Size and Maximum Blob Size parameters, especially if the environment is different from the initial setup.
Blob mask refinement.
Projection-map and tracking alignment.
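To make those on-site tweaks survivable mid-festival, I was thinking of exposing the key knobs as custom parameters on one container and referencing them from the Blob Track TOP. Sketch (appendCustomPage/appendFloat are standard COMP methods; the parameter names are mine):

```python
# Gather the on-site calibration knobs on one custom parameter page.
c = op('/project1')
page = c.appendCustomPage('Calib')
page.appendFloat('Blobthresh', label='Blob Threshold')
page.appendFloat('Minsize', label='Min Blob Size')
page.appendFloat('Maxsize', label='Max Blob Size')
# Then reference them from the Blob Track TOP, e.g. the expression
#   parent().par.Blobthresh
# in its Threshold parameter.
```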