With Kinect Azure now dead, what are we moving to?

Microsoft announced that the Kinect Azure is discontinued and will only ship until October.

What’s the plan going forward? The Orbbec products? I saw discussions elsewhere saying it “wasn’t on the roadmap,” but I’m hoping the latest announcement causes Derivative to reconsider.


OAK-D support is currently in the Experimental build:

It looks like Windows and macOS are supported. I’m waiting for an OAK-D to be delivered so I can test it out.

Here is the announcement, which is incredibly hard to find -.-

Old Post:
Do you have a link for that information?
Cannot find any info on AK being discontinued.


You can use the Nuitrack plugin for Orbbec support in TD:

You would hope that, as central as Kinect is to the TD universe, we’d eventually get first-class support for the anointed successor. And skeletal tracking is just the surface: having depth, color, and skeleton data in one nicely synced package is pretty amazing.

As an aside, I actually think this is potentially fantastic news. Kinect always felt too niche for a behemoth like Microsoft, especially once it was divorced from the Xbox. Orbbec feels like a better fit for pushing the use cases TD designers care about… Just the addition of PoE cabling on the Femto Mega seems like a huge win. Hello, 100-meter runs without hacky converters/extenders! (Anyone know what the latency is like on that thing?)

The beauty of Kinect* was the front end giving RGB, a 3D skeleton, depth and other features that TD’s Kinect(2) operators could access with various COMPs. The downside is that you were at the mercy of Microsoft’s update and marketing cycles: Kinect 2 invalidated Kinect 1, and if the state of the art moved on, it might take a while for Kinect to offer anything comparable.
Today there are a number of cameras/sensors that give you point clouds, skeletons, etc.: Orbbec, ZED, RealSense (no, don’t go there), NuiTrack’s SDK, and so on. Then again, the state of the art with AlphaPose, MediaPipe, etc. has been moving faster than just about anything. You can download Python demos from research projects that do (partial) 3D from a single RGB camera, track the skeletons of dozens of people in crowds, etc. And there are new papers out just about monthly.
I’ve been experimenting with these libraries with the aim of making FOSS smart cameras that are open enough that talented coders can update the processing pipeline to meet new requirements (train an ML system for your own poses, hand positions, face, etc.). You can fairly easily use a standard USB camera with Python, NDI and OSC to get just about all the features you want… if you can afford to DIY the code.
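For example, here is a minimal sketch of that pipeline, assuming the mediapipe, opencv-python and python-osc packages are installed; the port (7000) and the /pose/landmark address pattern are placeholders you would match to an OSC In CHOP/DAT in TouchDesigner:

```python
# Webcam -> MediaPipe Pose -> OSC messages that TouchDesigner can pick up.
import cv2
import mediapipe as mp
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 7000)      # host/port of the TD machine (placeholder)
pose = mp.solutions.pose.Pose(model_complexity=1)
cap = cv2.VideoCapture(0)                     # standard USB webcam

try:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            for i, lm in enumerate(results.pose_landmarks.landmark):
                # One message per landmark: normalized x, y plus relative depth z.
                osc.send_message(f"/pose/landmark/{i}", [lm.x, lm.y, lm.z])
except KeyboardInterrupt:
    pass
finally:
    cap.release()
```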
Hardware is still a bit of an issue. I tried to run some of these on a Raspberry Pi 4 but only got 8 fps, which is useless for interactivity. I have a Google Coral dev board, but I got distracted from building it out and instead got some basic TD demos working with the PC Python app. I haven’t kept the documentation or even the GitHub repo up well, but it’s over at GitHub - MauiJerry/Pose2Art: Pose2Art creates an AI Camera run on small 'edge' device that extract Pose and sends tracked points via OSC to TouchDesigner and Unity/UnReal for ART!
Caveat: I live on Maui and am further distracted from TD/art at present by the fires that have wiped out a significant part of our community, along with venues where we were planning to try some installations.


It seems Orbbec has a wrapper SDK that emulates the Microsoft SDK, making their cameras a drop-in replacement.

The “Femto Mega” and “Mega I” are shipping now, and the “Femto Bolt” (the most direct replacement) is shipping in October (about when they anticipate Microsoft’s supply of Kinects to run out).

All have identical optics and capabilities.

  • Bolt is basically a Kinect Azure clone.
  • Mega adds PoE and edge processing.
  • Mega I is an IP65 rated version of the Mega.

Tagline on the new marketing is “develop with one, deploy with any”.

I’m doubling down on my “this is a good thing” take. Exciting stuff

Some more details on Orbbec & Kinect: Microsoft Collaboration - ORBBEC - 3D Vision for a 3D World


Reached out to Orbbec… The Azure wrapper is not supported in network mode… So we’re still stuck on USB unless someone figures out a direct plugin.

As I understand it from their sales rep, their SDK is not a drop-in replacement, but a wrapper of the K4A SDK. So the TD Kinect Azure library would have to be changed to use this new SDK. Are there any plans for TouchDesigner to support the Orbbec SDK?

We are working on a couple of solutions for Orbbec, but I’m not sure on the exact release dates yet.

The Orbbec Kinect wrapper does largely work as a drop-in replacement for the original Microsoft Kinect Azure. It allows the Orbbec to work with the Kinect Azure nodes for color, depth and body tracking. You can actually get it working right now with the current TouchDesigner release if you copy Orbbec’s version of k4a.dll and OrbbecSDK.dll into the TouchDesigner bin folder. It will block Microsoft hardware from working and there are some differences in the color controls, but it should work in the short term.
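If it helps anyone, here is a rough sketch of that file swap. The paths are examples only; adjust them to your TouchDesigner install and to wherever the Orbbec wrapper DLLs live, and run it with admin rights since Program Files is normally write-protected:

```python
# Copy Orbbec's k4a.dll / OrbbecSDK.dll into the TouchDesigner bin folder,
# backing up any existing DLL first so the Microsoft version can be restored.
import shutil
from pathlib import Path

td_bin = Path(r"C:\Program Files\Derivative\TouchDesigner\bin")   # example install path
orbbec = Path(r"C:\OrbbecSDK_K4A_Wrapper\bin")                    # example wrapper location

for name in ("k4a.dll", "OrbbecSDK.dll"):
    src, dst = orbbec / name, td_bin / name
    backup = td_bin / (name + ".bak")
    if dst.exists() and not backup.exists():
        shutil.copy2(dst, backup)        # keep the original Kinect Azure DLL
    shutil.copy2(src, dst)
    print(f"copied {src} -> {dst}")
```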

We are also working on a direct Orbbec SDK solution that will support more Orbbec cameras directly for depth and color, but does not provide the Kinect body tracking since that relies on Microsoft’s tech.

The Kinect wrapper does not support ethernet devices since it has no mechanism for assigning an IP address or decoding the H264 images that the camera returns in that mode.

Ethernet will be supported for the new Orbbec SDK-based nodes.


Thanks for the quick reply robmc! Great to hear that it is in active development. I will try out the Femto Bolt and see how it works with the Orbbec wrapper.

We have tried a bunch of stereo camera solutions, from the OAK-D to the discontinued Intel RealSense, and have never been impressed with their performance compared to the ToF sensor in the Kinects, so we were a bit worried when MS discontinued it.

In the latest TD release I see that the Orbbec TOP is capable of receiving streaming data coming from an Orbbec Femto Mega. Is it possible to receive the data from the Mega inside of the Orbbec TOP and use that TOP in conjunction with the Kinect Azure CHOP for skeletal tracking? My potential use case is to send the streaming data from an Orbbec Femto Mega to a computer over ethernet, where I need to use both the point cloud image and the skeletal data.

Unfortunately the Orbbec TOP will not work with Kinect Azure CHOP. To get the skeleton tracking to work, the Orbbec needs to pretend it is a Kinect Azure and this requires logic that is only in the Kinect Azure TOP.

I’ve been told that you might be able to get the Femto Mega to work over ethernet with the Kinect Azure TOP/CHOP by using an external configuration file (OrbbecSDKConfig_v1.0.xml), but I’m not sure if anybody has gotten this to work yet. I think the config file is included with the Orbbec viewer app, and in that file EnumerateNetDevices needs to be set to true to discover ethernet cameras.
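For anyone who wants to try it, here is a small sketch of that config edit. The EnumerateNetDevices name comes from the description above, but its exact position in the XML tree and where the edited file needs to live relative to TouchDesigner are assumptions:

```python
# Flip EnumerateNetDevices to "true" in Orbbec's SDK config file.
import xml.etree.ElementTree as ET
from pathlib import Path

cfg = Path("OrbbecSDKConfig_v1.0.xml")    # copied from the Orbbec viewer app
tree = ET.parse(cfg)
node = tree.getroot().find(".//EnumerateNetDevices")
if node is None:
    raise SystemExit("EnumerateNetDevices element not found; check the config layout")
node.text = "true"
tree.write(cfg, xml_declaration=True, encoding="utf-8")
```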

I hope that is useful. We do hope to look into this further ourselves, but don’t have the compatible hardware at the moment.

Thank you for the information. I also don’t have a Femto Mega for testing this idea out on. I’m considering purchasing one though and if I do I will report back.

You mentioned that you do hope to look into this in the future. Is this potentially a feature that will be included in future TD releases?

I can’t really say whether that feature will become available or not. We know people are interested in it, but getting the Kinect and Orbbec to work together is all a bit of a hack, so it depends more on what Orbbec and Microsoft make possible.