When I run a spatial-detection pipeline on persons, the tracklets output by the ObjectTracker don't include any spatial coordinates, only the regular xmin, ymin … bounding box.
The SpatialImgDetections message from the MobileNetSpatialDetectionNetwork does include those coordinates (x, y, z), and it is what I pass to ObjectTracker.inputDetections:
ColorCamera.preview >> MobileNetSpatialDetectionNetwork.detections >> ObjectTracker.tracklets >> XLinkOut
Am I missing something?
I have posted on the Luxonis forum: Spatial Person Trackers - Luxonis Forum
Each tracklet should include a spatialCoordinates property carrying all three values: x, y, and z.
Is there any option to retrieve those coordinates from the Tracklets message?
I moved my question from Bugs to Hardware. Maybe there is a workaround.
It doesn't seem that I can do anything with the Tracklets or ImgDetections messages in the oakselectCHOP callback.
onReceiveBuffer doesn't seem to parse those messages.
Maybe a Derivative OAK team guru can confirm: @DavidBraun @MarkusHeckmann
The sample files in app.samplesFolder, under OAK, include a mobilenetDeviceSpatialDetections example that outputs the z position of the detected object. Can you see what is done differently there compared to your pipeline?
The mobilenetDeviceSpatialDetections example is a little different from what I'm trying to accomplish. I just want to track people.
That example detects all objects and streams the SpatialImgDetections directly.
To achieve what I want, I need to pass the detections from the MobileNetSpatialDetectionNetwork to the ObjectTracker, set it to track people only, and stream the tracklets to TD.
You need the ObjectTracker to track a particular object.
In my setup I stream both for debugging:
- the SpatialImgDetections, which include the x, y, z
- the Tracklets, which don't pass the x, y, z, only the bounding box as in 2D
It is more like the mobileNetTracker example, but using the MobileNetSpatialDetectionNetwork instead of the MobileNetDetectionNetwork.
Similar to this example on Luxonis:
I hope my explanation is clear enough.
ah - thx for reiterating this. Checking…
@snaut great, thanks.
BTW, I don't really understand the purpose of the ObjectTracker node.
Trying to understand this issue, I've streamed both messages (SpatialImgDetections and Tracklets) and I haven't noticed any improvement. Side by side they are almost identical.
From what I've read, the ObjectTracker should improve the tracking data coming from the MobileNet NN (or YOLO).
I need to play with the different types of trackers.
FYI, I have posted on the Luxonis forum:
Any news about the x, y, z data not being output by the Tracklets messages in SpatialDetectionNetwork mode?
Sorry for the late response. I'm currently looking for the best solution here, as I'm not sure whether all data is always transported with a tracklet or whether this can change depending on how the tracker is used.
Never mind, all good.
If you can't find a solution, I can work with the SpatialImgDetections message that carries the x, y, z data,
but then I lose the tracking feature of the ObjectTracker.
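One possible host-side workaround that keeps both: stream the two messages (as described earlier in the thread) and attach the x, y, z of whichever SpatialImgDetection best overlaps each tracklet's bounding box. Below is a minimal plain-Python sketch; the `Det` class is a hypothetical stand-in for the fields of a SpatialImgDetection (normalized bbox plus millimetre coordinates), and the IoU threshold of 0.3 is an arbitrary choice.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Box = Tuple[float, float, float, float]  # (xmin, ymin, xmax, ymax), normalized

@dataclass
class Det:
    """Stand-in for one SpatialImgDetection: bbox + (x, y, z) in mm."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float
    xyz: Tuple[float, float, float]

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def attach_xyz(tracklet_box: Box, detections: List[Det],
               min_iou: float = 0.3) -> Optional[Tuple[float, float, float]]:
    """Return the x, y, z of the detection that best overlaps the tracklet box,
    or None if nothing overlaps well enough."""
    best = max(detections, default=None,
               key=lambda d: iou(tracklet_box, (d.xmin, d.ymin, d.xmax, d.ymax)))
    if best and iou(tracklet_box, (best.xmin, best.ymin, best.xmax, best.ymax)) >= min_iou:
        return best.xyz
    return None
```

On each frame you would call `attach_xyz` once per tracklet, passing that frame's detections; if the tracklet is a stale track with no matching detection, it simply gets `None`.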
I don't have the DepthAI package installed, so I can't test the spatial_object_tracker.py example to verify its output.
I’ll definitely install it, as I’ll be working with OAK quite a bit.
I think it won't interfere with the DepthAI SDK in TD.
For the next release, tracklets will include the spatial information.