Hi everyone,
I’m trying to understand a conceptual difference between how point index–based deformations work in TouchDesigner compared to Blender, and I’d really appreciate some clarification.
Setup
- I create a Line SOP with 100 points
- I derive a looping parameter from the point index using floor + modulo
Specifically:
- Start from _PointI
- Apply floor and modulo so that every 32 points the value loops
- Normalize the result so that it goes from 0 to 1 within each 32-point segment
Conceptually:
loopIndex = _PointI % 32
u = loopIndex / 31
theta = u * 2π
- I then apply:
P.z = cos(theta)
So in practice, every group of 32 points should describe one full cosine cycle from 0 → 2π.
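For reference, here is a minimal Python sketch of the per-point math described above (the constant names and the loop are mine, standing in for the per-point expression evaluated in TouchDesigner):

```python
import math

NUM_POINTS = 100
PERIOD = 32  # points per cosine cycle

# Mirror the expressions:
#   loopIndex = _PointI % 32
#   u = loopIndex / 31
#   theta = u * 2*pi
#   P.z = cos(theta)
z_values = []
for point_index in range(NUM_POINTS):
    loop_index = point_index % PERIOD      # 0..31, repeating every 32 points
    u = loop_index / (PERIOD - 1)          # normalized 0..1 within each segment
    theta = u * 2 * math.pi                # 0..2*pi within each segment
    z_values.append(math.cos(theta))

# Each 32-point block starts and ends at z = 1 (cos 0 = cos 2*pi = 1),
# dipping to -1 near the middle of the block.
```

If the Z values in TouchDesigner don't follow this pattern, the discrepancy is in how the expression is evaluated per point, not in the math itself.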
Expected result
- A clear sinusoidal deformation along the line
- Every 32 points forming one full cosine wave
- I expect the cosine to act as a spatial deformation
What actually happens
- Instead of a clear sinusoidal shape, I get a big shift on the Z axis and what looks like a small offset per modulo cycle
- The deformation feels more like a numerical modulation than a visible wave in space
- Even though u correctly goes from 0 to 1 every 32 points, the result doesn’t resemble a spatial sinusoid
Why this is confusing
In Blender Geometry Nodes, using a normalized index (even with modulo) produces the expected sinusoidal deformation along existing geometry, without explicitly using a spatial coordinate.
I’m attaching:
- the TouchDesigner file
- a Blender image with the nodes and the final result
so the exact same logic can be compared.
Thanks a lot for any insight — this feels like a conceptual difference rather than a math error.
knit.toe (4.1 KB)