Sampling a texture at arbitrary 3D surface points and rendering the result

Hello. This seems fairly basic so I am confident there is an efficient way to do this:

I have an arbitrary array of points, each corresponding to a (u,v) location on a 3D surface. I want to sample a 2D texture at each of these points and render the result mapped onto that 3D surface: each point from the array will get assigned the texture color at that location. All other locations will not be rendered.

What I’m struggling to figure out is how to create the updated texture (color = input texture sampled at the array points, transparent everywhere else). Then I imagine I’d use usual SOP instancing in a Geometry COMP (position = all points from the surface, color = updated texture) to render the result.
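The "updated texture" step described above can be sketched in plain Python (hypothetical data layout, no TouchDesigner API; real networks would do this on the GPU):

```python
# Sketch: sample the input texture at each (u, v) point and write that
# color into an otherwise fully transparent output image.
# Hypothetical representation: a texture is a 2D list of RGBA tuples.

def build_masked_texture(texture, uv_points, width, height):
    # texture: 2D list of (r, g, b, a) tuples, indexed [y][x]
    # uv_points: list of normalized (u, v) pairs in [0, 1]
    # Start fully transparent everywhere.
    out = [[(0.0, 0.0, 0.0, 0.0) for _ in range(width)]
           for _ in range(height)]
    for u, v in uv_points:
        # Map each normalized UV to the nearest texel.
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        r, g, b, _ = texture[y][x]
        out[y][x] = (r, g, b, 1.0)  # opaque only at the sampled points
    return out
```

In other words, the result keeps the original texture's color at the array points and is transparent at every other texel, which is what the instancing setup would then render onto the surface.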

I have a feeling GLSL would be suited for this, but it goes beyond my very limited GLSL abilities (from what I’ve read so far, it seems I’d need a vertex shader?).

I hope this makes sense. Could someone point me in the right direction? If there’s a simpler way via native TD objects I’m open to that too of course (CHOPtoTOP?).

Thank you!

Below an example of the surface I’m working with, with “regular” then “arbitrary” (u,v) sampling (ignore the warning):


PS: This old post looks like it has elements of an answer, but it’s >9 years old and its example uses an old version of GLSL, so I decided to start a new post from scratch.

Hi @leonard.roussel,

Do the points carry a uv attribute with them and is the desired texture basically the original texture minus the “missing” points?
I might be completely misunderstanding what you are looking for.

Best
Markus

Yes, and yes! You understood perfectly.

The points do carry (u,v) coordinates, stored in a table (received from Grasshopper via OSC, along with the point positions and the surface normals at those points).

Thanks.

Hi @leonard.roussel,

great! Can you share a file with the data locked so I can look for a solution with actual data?

cheers
Markus

Here you go! Thanks so much.

Grid_OSC.toe (45.1 KB)

Hi @leonard.roussel,

sorry this took a bit longer.
You can use the UVs of the points (scaled by the aspect ratio of the Grasshopper subdivision) as the positions in your instancing network to create the mask you are looking for.
The camera is set to orthographic, anchored at the bottom left.

The attached file hopefully makes this process understandable.
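The UV-to-instance-position step can be sketched in plain Python (assumed convention: non-normalized Grasshopper UVs spanning [0, u_max] × [0, v_max], with the orthographic camera's width matched to the surface and its view anchored at the bottom left; function and parameter names are illustrative, not from the attached file):

```python
# Sketch: turn per-point UVs into 2D instance positions for the mask
# render. Scaling v by the subdivision's aspect ratio keeps the mask's
# proportions consistent with the surface when viewed through an
# orthographic camera.

def uv_to_instance_positions(uv_points, u_max, v_max):
    aspect = v_max / u_max  # aspect ratio of the Grasshopper subdivision
    return [(u / u_max, (v / v_max) * aspect, 0.0) for u, v in uv_points]
```

Rendering one small instance at each of these positions through the orthographic camera produces the mask, which can then be multiplied with the input texture.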

Best
Markus

Grid_OSC.40.toe (47.0 KB)

Wow you’re amazing, that does exactly what I need AND MORE. Thank you!

The key is in the “Texture Coord OP” + Replace mode.
I’d never have thought of the orthographic camera with Ortho Width set exactly to the size of my surface, plus the multiply, but it’s great to know it works. I’ll have to pass the surface dimensions from Grasshopper to reference them dynamically in the Math and Camera OPs; that will actually be easy, since by default GH does not normalize UV coordinates.
Thanks again so much.
