hello,
I am having trouble understanding the geometry editor's dependencies … when I have a simple chain of geometry/camera/renderer from an imported model, I can edit the base node, and when I drag vertices or edges around I see the mesh update in the renderer in real time … when I exit the editor that node is locked and all is well.
but in my patches I always have quite complex networks of SOPs modifying the geometry, and I often use In/Out SOPs to pass geometry from one container to another so I can split up the complexity of my patches.
in these complex networks I somehow can't figure out where to edit the source geometry so that a change at the base is carried through the whole chain of SOPs after it …
currently I am working on a projection mapping patch. I have modeled the geometry I want to project onto and imported it into Touch, but the 3D model is not 100% precise, and with the inaccuracies of the projection lens and the virtual camera setup it is necessary to drag some vertices into place manually; being able to see the result in the renderer in real time is vital.
I'm completely stuck here; I have tried various approaches but can't figure this out. for example, I am combining several meshes like this:
Mesh SOP → Transform SOP → Merge SOP → Out SOP → (another container) → In SOP
now when I model in any of these SOPs, the downstream SOPs don't pick up the change … the only way to update the final renderer is either to force a cook (the right-click Force Cook option) or to insert something into the stream that forces a cook (we used a Transform SOP with an animated $AT expression multiplied by 0).
if someone could demystify this for us, that would be great.