iPhone X TrueDepth camera support

I would love to see support for the iPhone TrueDepth camera on a level similar to the Kinect. It would give us macOS users some depth-camera fun!

Interesting app! While I imagine you'd want to stream the data over USB from that application, I noticed it says it can also output PLY files. The good news is that the 2020 versions of TouchDesigner now support PLY files, so you can bring those point clouds directly into TouchDesigner.


Thank you! I didn’t know I could do that, can’t wait to try it!

I wonder if I could record a .PLY with the Kinect as well :thinking:

Looks like there is a streaming mode, with Python libraries:

There is indeed a streaming feature! I actually took a bit of a winding road to TouchDesigner. I spent two months learning Blender, then saw some cool stuff people were doing with moving point clouds, which led me to this app, which has a plugin for Unity. The Unity plugin didn't work, though, and I spent a few days going back and forth with the developer trying to get it running. When I told him what I was trying to do with point clouds, he recommended I try Houdini, and while searching for Houdini I came across TouchDesigner and decided it suited my needs better, given the Kinect functionality.

How hard would it be for somebody relatively noobish (me) to get the Python libraries working in TouchDesigner?

I just gave it a shot, and I'm hitting a snag. Maybe I'm just doing something wrong, but the answer is not very straightforward. I recorded a short video with the app of me petting my cat and sent it to my laptop (5 GB for 20 seconds! :exploding_head:), and it arrived as 700 .PLY files. I can load them individually just fine, and I see there is a "play" tab, which presumably is for animated point clouds, but perhaps it expects a more convenient format (.EXR, maybe?). I would REALLY rather not convert every single frame by hand if I can avoid it, but I'd love to figure out how to get such an animation into TouchDesigner. Is there some external software I can use to convert the files?

Is it possible for TouchDesigner to play all the PLYs in sequence?

I would try batch-converting the PLYs into EXRs by building a custom automation around the import feature. After that, play the EXR sequence back in a regular Movie File In TOP.
Great app find btw!


It's not the same as streaming directly from the app, I get that, but I'm sure there's a way to do that too. TD can definitely handle video streams, even WebRTC.

@priam I DID actually get this working (I should have followed up!) by selecting the folder the PLY files are in, instead of an individual PLY file. I recognize the benefits of an EXR file, though. How would I go about converting a PLY to an EXR? What do you mean by "I would try to batch-import the plys Into EXRs by making my custom automation of the import feature"?
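One gotcha with folder-based sequences, in case anyone else hits it: if the exporter numbers frames without zero-padding (a guess on my part; your files may already be padded), plain alphabetical sorting plays `frame_10.ply` before `frame_2.ply`. A small standalone Python sketch (run outside TD, filenames hypothetical) that computes zero-padded renames without touching the disk:

```python
import re

def zero_pad_names(filenames, width=4):
    """Map e.g. 'frame_7.ply' -> 'frame_0007.ply' so that alphabetical
    order matches frame order. Returns {old_name: new_name} as a dry
    run; actually renaming is left to the caller (e.g. os.rename)."""
    mapping = {}
    for name in filenames:
        m = re.match(r"(.*?)(\d+)\.ply$", name)
        if m:
            prefix, num = m.groups()
            mapping[name] = f"{prefix}{int(num):0{width}d}.ply"
    return mapping
```

After renaming, TD's folder-based import should walk the frames in the right order.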

I heard there is a streaming mode for it.

There is a streaming mode for it, for Unity. I'd rather avoid learning Unity, as I'd rather accomplish this in TouchDesigner.


I haven't had time to figure out the RGBD video.
I was curious how the data is arranged (channels). In my uninformed mind, there is a .ply-to-EXR converter in the point-cloud palette. I would mod that and have it process all your .ply files automatically, one after another. That would give you an EXR sequence that plays like a 3D point-cloud video if you use the palette's premade point-cloud tools.
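As a rough sketch of the per-file half of that automation, here is a minimal standalone parser for the vertex rows of one ASCII .ply file (my own hypothetical helper, not the palette tool, which also handles binary PLY and the actual EXR writing). Each frame's XYZ (and RGB) rows would then be packed into the pixels of a float EXR, one frame per file:

```python
def read_ascii_ply_vertices(text):
    """Parse an ASCII PLY file's text and return its vertex rows as
    lists of floats. Minimal sketch: assumes 'format ascii' and that
    the vertex properties (x, y, z, optionally colors) are scalars."""
    lines = text.splitlines()
    n_verts = 0
    header_end = 0
    for i, line in enumerate(lines):
        if line.startswith("element vertex"):
            n_verts = int(line.split()[-1])  # vertex count from header
        if line.strip() == "end_header":
            header_end = i
            break
    body = lines[header_end + 1 : header_end + 1 + n_verts]
    return [[float(v) for v in row.split()] for row in body]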


The other way might be to use the regular Kinect vertex-shader method and stream the RGBD video: use the depth info as a 3D point-position map and the RGB channels to color each point. Straightforward, I would think.
I just wonder how the RGBD video is arranged; it might just be replacing the alpha channel with depth values. The video would have to be 16- or 32-bit, with all values normalized.
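Assuming that guess is right (depth replaces alpha, normalized into a near/far range; the range values here are made up), the round trip would look something like this standalone sketch:

```python
def pack_rgbd(rgb, depth, near=0.1, far=5.0):
    """Pack per-pixel depth (meters) into the alpha channel as a
    normalized 0-1 value. rgb: list of (r, g, b) tuples; depth: list
    of floats, same length. near/far are assumed clip distances."""
    return [(r, g, b, (d - near) / (far - near))
            for (r, g, b), d in zip(rgb, depth)]

def unpack_depth(rgba, near=0.1, far=5.0):
    """Recover depth in meters from the alpha channel."""
    return [a * (far - near) + near for (_, _, _, a) in rgba]
```

In TD, the unpack step would be the shader's job: read alpha, rescale to meters, and displace each point along the camera ray.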


Splitting the channels of the video feed with TOPs should be OK; streaming the video over USB from the iPhone to TD is where it might take some work. Tried VLC? It does streams; maybe it can take it live and push it back to TD somehow. Or (not my field) WebRTC on a local server? Talking crap here. Must look at streaming video in TD.


VERY clever ideas! Thank you for the suggestions! I think much of the streaming would require more technical know-how than I'm willing to commit. I have it working "good enough": I can record it on my phone and then transfer the PLY sequence to my laptop.

I was thinking of something similar and came across ZIG SIM PRO which sends depth data via NDI: https://twitter.com/amagitakayosi/status/1203236098522042368?lang=en


Whoa excellent! Have you tried it yet?

I haven’t yet but I plan to once NDI input is included in the next Non-Commercial version [1].

[1] https://derivative.ca/community-post/broadcasting-social-media-touchdesigner/62737

ZIG SIM Pro works awesomely, can vouch for that. :+1:

As for the .ply-to-.EXR conversion, you can now technically do all of that in TouchDesigner. We added extensive .EXR writing capabilities to the Movie File Out TOP, especially for point clouds, as it allows you to save .EXRs with an arbitrary number of channels (i.e. XYZ and RGB and Intensity and UV etc. all in one .EXR).
I have not tried using the 'Image Sequence' mode in the Movie File Out TOP with EXR like this, but it should work well if you turn the real-time flag off.

I could not really find an appropriate way to get access to the live point-cloud data, but after compiling and building the Python version of record3d, I used OBS to forward the depth and RGB data into TD through NDI.

Did anyone find a way of accessing the live point cloud feed?

And did anyone find a way to access the depth and RGB camera feeds directly inside TD via a Script TOP?