Using an older firmware (v3.2.1), the Leap Motion can auto-orient and detect hands from both sides, which is great, but it also auto-orients the coordinate system, so I have no way of knowing whether the user is approaching from the front or from the back.
So, looking at the Leap Motion from the front: if a user approaches it and moves their hand left to right, tx goes from negative to positive, which is what I want. But if a user approaches from the other side, the device flips the coordinate system, so from their perspective they still get that negative-to-positive move when they move their hand left to right. From the original perspective, though, they're moving their hand right to left yet tx still goes negative to positive, whereas I want to keep/lock the initial perspective, so that move would instead read positive to negative.
Because the Leap Motion auto-orients the coordinate system based on which side it detects the hand from, I don't think I can ever lock the perspective/orientation while still detecting hands from both sides. SO, I'm hoping to detect when the orientation flips so I can invert the coordinates it's giving me and maintain the same tx/ty/tz moves from any position. Any ideas?
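For what it's worth, if you can get a reliable "flipped" signal from somewhere, the inversion itself is simple: the auto-orientation is effectively a 180° rotation about the vertical axis, so undoing it negates x and z while leaving y (height) alone. A minimal sketch (plain Python, could live in a Script CHOP; the `flipped` flag is assumed to come from whatever flip detection you end up with):

```python
def to_fixed_frame(tx, ty, tz, flipped):
    """Map Leap coordinates back into one locked world frame.

    When the device auto-orients for a hand approaching from the far
    side, its frame is rotated 180 degrees about the vertical (y) axis,
    so undoing it negates x and z; y (height) is unchanged.
    """
    if flipped:
        return -tx, ty, -tz
    return tx, ty, tz
```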
Thank you. Currently I'm just using the Leap Motion camera and feeding it into MediaPipe's hand detection, but because of the fisheye distortion it gives incorrect coordinate values toward the edge of the camera/video, since in the video it looks like the hand is rotating. I've been trying to figure out the correct values to put into the Lens Distort TOP to fix it (the Image Correction toggle on the Leap Motion TOP doesn't do enough), but that's pretty tough and I'm not even sure it would do the trick. I really need the actual coordinates from the Leap Motion, but with the ability to detect hands 360°.
Just to confirm, when you say you’re using the old firmware, are you using the ‘Version 2 Tracking’ option in the Leap Motion CHOP? A lot of options have been added and removed from the SDK over the years, so it depends a lot on what version you are using.
Out of curiosity, does the camera image flip when it detects hands from a particular side? There is a reference in the code to the distortion matrix updating when the device is flipped (Group Structs - Ultraleap documentation), and I'm wondering if that could be useful. If the camera image flips, you could put a visual marker in your scene (on the ceiling or something) and detect which side it is on in the camera image.
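Detecting which side a bright marker lands on could be as simple as thresholding the grayscale frame and checking the marker's centroid against the image midline. A toy sketch of that idea (assumes the marker is the brightest thing in the IR image, which you'd need to verify):

```python
import numpy as np

def marker_side(gray, thresh=200):
    """Report which half of the image a bright marker sits in.

    gray: 2D uint8 grayscale frame; returns "left", "right", or None
    if no pixel exceeds the brightness threshold.
    """
    ys, xs = np.where(gray > thresh)
    if xs.size == 0:
        return None
    cx = xs.mean()  # centroid x of all bright pixels
    return "left" if cx < gray.shape[1] / 2 else "right"
```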
As far as lens distortion is concerned, there are OpenCV Python algorithms that can calculate the coefficients the Lens Distort TOP needs to correct the image. They take a little work to set up with a calibration board, but there are people on the forums here who have tried it.
Yup, using Version 2 Tracking, and yup, it does auto-flip. The visual marker isn't a bad idea. I'm working outside so there's no ceiling, but maybe I could adhere something very tiny to the corner of the camera to obstruct it a tiny bit and just determine in the TOP where that obstruction is.
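If the obstruction sits at a fixed corner, the check could just be "is that corner patch darker than expected?", which sidesteps blob detection entirely. A toy sketch of that fixed-ROI idea (patch size and threshold are guesses you'd tune against the real feed):

```python
import numpy as np

def corner_occluded(gray, roi=32, thresh=40):
    """True if the top-left corner patch is dark, i.e. the adhered
    obstruction is visible there (image in its original orientation).

    gray: 2D uint8 grayscale frame; roi: patch size in pixels.
    """
    patch = gray[:roi, :roi]
    return float(patch.mean()) < thresh
```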
Yup aware of the opencv method, just trying to save a little time if someone had already done it.