Azure Kinect Touch Wall help

Hi, I’m quite new to TouchDesigner but am learning quickly and loving every minute of it. The community is amazing and generous. Happy to be here.

I’m building a project that combines an Azure Kinect and a projector, and I need a reliable way of detecting ‘touches’. Basically a room-scale touch screen.

I’ve got some prototypes working, but I feel like this must be a common requirement and somebody is bound to have done a better job of it than I can.

I’m mainly looking for reliable hand tracking and touch detection: things like collision detection, touch detection based on depth, debounce, etc. If there’s support for swipes and gestures too, even better.

Can anybody point me in the direction of a tutorial or sample project file to get me started?



I worked on something similar to this in the past:

1. Select the xyz of, say, the right hand and export it to the xyz translate parameters of a Null COMP.
2. Duplicate that Null COMP. One will track the hand; the other will lock down the position where you want the touch to be activated. Move your hand to that position and lock the second node.
3. Create an Object CHOP and enable distance measurement only. On the Target tab, link the two Null COMPs as the target and reference objects.
4. Connect the output of the Object CHOP to a Logic CHOP set to outside bounds, with the high bound set to the range where you want the touch to be activated. In the attached file, the touch activates when the hand gets within 30 cm of the touch area.
5. The Logic CHOP is now your trigger for interaction.
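If it helps to see the same idea outside the node network, here’s a minimal plain-Python sketch of the distance-threshold logic with a simple frame-count debounce (the class name, the 5-frame debounce value, and the `update()` method are my own illustrative choices, not from the attached file; the 0.3 m radius matches the 30 cm threshold above):

```python
import math

class TouchDetector:
    """Fires a touch when the hand stays within `radius` of a locked
    target position for `debounce` consecutive frames."""

    def __init__(self, target, radius=0.30, debounce=5):
        self.target = target          # locked touch position (x, y, z) in meters
        self.radius = radius          # activation distance, like the Logic CHOP bound
        self.debounce = debounce      # frames the hand must stay inside before firing
        self._inside_frames = 0
        self.touching = False

    def update(self, hand):
        """Feed one frame of hand xyz; returns True while the touch is active."""
        dist = math.dist(hand, self.target)   # what the Object CHOP measures
        if dist <= self.radius:
            self._inside_frames += 1
        else:
            self._inside_frames = 0           # left the zone: reset the debounce
        self.touching = self._inside_frames >= self.debounce
        return self.touching
```

You would call `update()` once per frame with the hand position from the Kinect CHOP. The debounce counter is one way to avoid flicker when the hand hovers right at the boundary; a Lag or Filter CHOP can serve a similar purpose in the network itself.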

Hope this helps.

KinectHandTrigger.toe (4.8 KB)