Hello,
I am using an Azure Kinect; however, the AI model (dnn_model_2_0_op11.onnx) does not output labels for whether the hand is open or closed, unlike the Kinect v2.
This makes clicking on objects relatively difficult. The solution I am planning to use is something like this:
- Each frame, get the hand's UV position (translate the hand position into 0-1 UV space)
- Check if the hand point is over a panel object (button)
- If it is over a panel object, a small progress wheel starts to fill up
- When it fills completely, send a click event to the panel object (button)
- If the user moves their hand off the object, reset the fill count
To achieve something like this, I need a way to check, each frame, whether there are any panels under the handUV position. Which function should I use to do this?
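To illustrate, here is a rough sketch of the dwell-click idea, assuming a standard Unity UI setup (Canvas with a GraphicRaycaster and an EventSystem in the scene). GetHandUV() is just a hypothetical placeholder for however the hand position ends up being exposed, and I have used EventSystem.RaycastAll for the UI hit test only as a guess, since that is exactly the part I am unsure about:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class HandDwellClick : MonoBehaviour
{
    public float dwellSeconds = 1.5f;   // how long the hand must hover before a click fires

    float fill;                          // dwell progress, 0..1 (drives the progress wheel)
    GameObject hoverTarget;              // UI object currently under the hand

    void Update()
    {
        // Hypothetical helper: hand position in 0-1 UV space from the body-tracking data.
        Vector2 handUV = GetHandUV();

        // Convert UV to screen pixels and raycast against the UI under that point.
        Vector2 screenPos = new Vector2(handUV.x * Screen.width, handUV.y * Screen.height);
        var pointer = new PointerEventData(EventSystem.current) { position = screenPos };
        var hits = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointer, hits);
        GameObject target = hits.Count > 0 ? hits[0].gameObject : null;

        if (target != null && target == hoverTarget)
        {
            // Still over the same panel: advance the progress wheel.
            fill += Time.deltaTime / dwellSeconds;
            if (fill >= 1f)
            {
                // Dwell complete: send a click up the hierarchy (so a parent Button receives it).
                ExecuteEvents.ExecuteHierarchy(target, pointer, ExecuteEvents.pointerClickHandler);
                fill = 0f;
            }
        }
        else
        {
            // Hand moved to a different object or off the UI: reset the fill count.
            hoverTarget = target;
            fill = 0f;
        }
    }

    // Placeholder only: would be replaced by the actual hand position from the SDK/asset.
    Vector2 GetHandUV()
    {
        return Vector2.zero;
    }
}
```

If there is a built-in function for this kind of "is this point over a panel" test, that is really what I am looking for.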
Alternatively, is there an AI model for the Azure Kinect that outputs gesture labels?