Triggering sounds with hand position

Hello!

I am trying to create a system where sounds are played through the Audio Play CHOP according to the position of the hands at certain places.


I have created a box and instanced it across a grid. I'm trying to figure out a way to link a tone to each box (I've used metaballs for the left and right hands to show their positions for reference), but have been unable to do so. So far I am only able to get 4 tones (right-hand x/y and left-hand x/y).
I was wondering whether this is possible with instancing, or whether I should create multiple individual boxes instead of one instanced box.
Would you have any suggestions as to how I can achieve my goal?
Big thanks!
Rhea :slight_smile:

Take a look at the Render Pick DAT if you want to do it this way, feeding it normalized x/y data from your sensor. Alternatively, you could start with that x/y data and use it to drive your instances directly. You could even use that position data to drive a single pixel across a 10x10 grid of pixels in TOP space, shuffle that into CHOP space, and get your instancing data that way.
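
If you go the x/y-driven route, here's a rough, untested sketch of the cell-index math in a CHOP Execute DAT. All the names ('hand_positions', the 'rx'/'ry' channels, 'audiofilein1', and the 10x10 grid size) are placeholders for whatever your network actually uses, and I've reached for an Audio File In CHOP because its speed/cue parameters are simple to drive; adapt it to the Audio Play CHOP if you prefer:

```python
# CHOP Execute DAT sketch: map a normalized (0-1) hand position to a cell
# on the instancing grid and retrigger a sample pitched per cell.
# 'hand_positions', 'rx'/'ry', and 'audiofilein1' are placeholder names.

GRID = 10       # cells per side, matching the 10x10 instancing grid
last_cell = -1  # module-level state persists between callback calls

def onValueChange(channel, sampleIndex, val, prev):
    global last_cell
    hands = op('hand_positions')                # CHOP with normalized x/y
    col = min(int(hands['rx'].eval() * GRID), GRID - 1)
    row = min(int(hands['ry'].eval() * GRID), GRID - 1)
    cell = row * GRID + col                     # index of the box under the hand
    if cell == last_cell:                       # only retrigger on a new box
        return
    last_cell = cell

    # One tone per box: fold the 100 cells into two octaves of semitones.
    semitone = cell % 24
    sampler = op('audiofilein1')                # Audio File In CHOP
    sampler.par.speed = 2 ** (semitone / 12.0)  # repitch in equal-tempered steps
    sampler.par.cue.pulse()                     # restart playback at the new pitch
    return
```

The same cell index can also feed whatever drives your instance colors, so the sounding box and the highlighted box stay in sync.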

Thank you for your feedback! I will give this method a shot! :slight_smile:

While trying to work out solutions, I also ended up working with the Bullet Solver to resolve the issue. I was able to link the notes to the collision data of the instances! :smiley:
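
For anyone finding this later, here's roughly how that note-triggering step can look, assuming the collision data ends up as a CHOP with one channel per box that spikes above 0 on contact. The 'collisions' and 'midiout1' names are placeholders, MIDI is just one way to play the notes, and I'm going from memory on the midioutCHOP method names, so double-check them against the docs:

```python
# CHOP Execute DAT sketch (untested): watches a CHOP with one channel per
# box that rises above 0 when that box is hit. 'collisions' and 'midiout1'
# are placeholder names for the collision CHOP and a MIDI Out CHOP.

def onValueChange(channel, sampleIndex, val, prev):
    note = 36 + channel.index                   # hypothetical: one note per box
    if prev <= 0 and val > 0:                   # rising edge: contact started
        op('midiout1').sendNoteOn(1, note, 100)
    elif prev > 0 and val <= 0:                 # falling edge: contact ended
        op('midiout1').sendNoteOff(1, note)
    return
```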