Hi all,
I’m quite new to both Blender and TouchDesigner, but I’ve been diving into both to challenge myself through a project that pushes beyond the usual workflows.
Specifically, I’m trying to build a custom point cloud pipeline starting from a hand-modeled 3D object — not a scan or LiDAR capture.
The goal:
- Start from a textured 3D mesh created in Blender
- Convert that mesh into a point cloud
- Sample the texture color per point
- Export the result as a `.ply` file containing both XYZ coordinates and RGB color data
- Load that point cloud into TouchDesigner (using `File SOP` or `Point File In TOP`) for further animation & visualization
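To make the target concrete, here's a minimal hand-rolled sketch (written for this post, not Blender's exporter) of the kind of file I'm trying to end up with: a plain ASCII `.ply` with positions as floats and colors as 8-bit `uchar`s, which I believe is the conventional point-cloud layout:

```python
# Minimal ASCII .ply writer: XYZ as floats, RGB as 8-bit unsigned chars.
# Hand-rolled sketch for illustration only -- not Blender's exporter.

def write_ply(path, points):
    """points: list of (x, y, z, r, g, b) tuples, with r/g/b in 0-255."""
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "property uchar red",
        "property uchar green",
        "property uchar blue",
        "end_header",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        for x, y, z, r, g, b in points:
            f.write(f"{x} {y} {z} {r} {g} {b}\n")

# Example: three colored points.
write_ply("cloud.ply", [
    (0.0, 0.0, 0.0, 255, 0, 0),
    (1.0, 0.0, 0.0, 0, 255, 0),
    (0.0, 1.0, 0.0, 0, 0, 255),
])
```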
I know this is quite different from the standard photogrammetry or scan-based pipelines, but I wanted to approach it as both a creative and technical learning challenge.
So far, in Blender, I’ve tried using Geometry Nodes to:
- Distribute points across the mesh surface
- Use `Sample UV Surface` to sample the texture color
- Store RGB values as per-point attributes (`red`, `green`, `blue`)
- Export as `.ply` with vertex attributes enabled
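To sanity-check what Blender actually wrote, I've been dumping the file's header with a tiny dependency-free script (hand-written; my guess is the attribute names or types in the export don't match what TD looks for, and this makes that visible):

```python
# Dump a .ply header so you can see exactly which property names and
# types the exporter produced (e.g. "red" vs. something else, uchar
# vs. float). The header is ASCII text even in binary .ply files.

def read_ply_header(path):
    """Return the header lines (up to and including 'end_header')."""
    lines = []
    with open(path, "rb") as f:  # binary mode: header is ASCII either way
        for raw in f:
            line = raw.decode("ascii", errors="replace").strip()
            lines.append(line)
            if line == "end_header":
                break
    return lines

# Demo with a tiny hand-written file; point it at the Blender export instead.
with open("demo.ply", "w") as f:
    f.write("ply\nformat ascii 1.0\nelement vertex 1\n"
            "property float x\nproperty float y\nproperty float z\n"
            "property uchar red\nproperty uchar green\nproperty uchar blue\n"
            "end_header\n0 0 0 255 255 255\n")

for line in read_ply_header("demo.ply"):
    print(line)
```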
But when importing into TouchDesigner:
- I get `"No X, Y, Z"` errors from `Point File In TOP`
- And no visible color when using `File SOP` either
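For what it's worth, my reading of the `"No X, Y, Z"` error is that the loader can't find vertex properties literally named `x`, `y`, and `z`. The header layout I'm *assuming* TD (like most point-cloud tools) can read is the conventional one below — if someone can confirm or correct this, that alone would help a lot:

```
ply
format ascii 1.0
element vertex 1000
property float x
property float y
property float z
property uchar red
property uchar green
property uchar blue
end_header
```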
What I’m looking for:
- Has anyone tried creating a point cloud from a polygonal model (not a scan)?
- Is there a proper format or `.ply` structure that TouchDesigner expects?
- Any tips for mapping and reading custom vertex attributes inside TD?
I’m fully aware this is an edge-case use, but that’s exactly why I wanted to tackle it. I’d love to get this working and share the workflow with others who, like me, are bridging creative 3D modeling and real-time point-cloud visualization.
Thanks in advance for any advice, and for all the knowledge this community already provides.
(Using Blender 4.4 and TouchDesigner 2023.1+)