This problem stems from the fact that perspective projections are non-linear, while vertex attributes like UVs are interpolated linearly between vertices.
Since a quad is made up of two triangles, and the projection distorts the UVs in a non-planar way, those two sub-triangles no longer lie in the same “plane” you would expect when looking at the quad, or when laying out UVs in something like Blender.
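To make that concrete, here's a sketch of the math. Take one edge of a triangle, with `t` the interpolation parameter in screen space and `w_0`, `w_1` the clip-space w values at the two endpoints. Plain linear (affine) interpolation ignores w, while perspective-correct interpolation divides through by it:

```latex
u_{\text{linear}}(t) = (1 - t)\,u_0 + t\,u_1

u_{\text{correct}}(t) = \frac{(1 - t)\,u_0 / w_0 \;+\; t\,u_1 / w_1}{(1 - t) / w_0 \;+\; t / w_1}
```

The two only agree when `w_0 = w_1`, i.e. when the edge is at constant depth — which is exactly why flat, screen-parallel quads look fine and tilted ones don't.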
You’ll notice the distortion happens along these polygon edges:
If you visualize the geometry's UV map alongside the texture, you'll notice there is nothing square about the shape of the projected UVs (even though the texture now looks fine).
For this reason, subdividing the geometry does not fix the issue; it just makes it less extreme the more you subdivide. If you can subdivide enough and don't have any animated geometry or complex surfaces, that might be a viable workaround.
There is one step in the shader that specifically breaks linearity, though it is necessary for projection: the perspective divide.
Usually, for speed, this is done in the vertex shader, and that is normally fine since it's assumed UVs won't be stretched this way. When you need a perspective projection like this, though, the problem can be fixed by doing the perspective divide in the fragment shader, uniquely for each pixel, rather than in the vertex shader for just each vertex!
This means there's fundamentally no way to provide this info through SOPs alone; the shader, and thus the Render TOP downstream, has to know that it needs to do this step differently when it renders the actual pixels.
Though you can still make this work by using a Material SOP the way you had.
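For reference, the core of the fix looks roughly like this in plain GLSL. This is a generic sketch, not the exact code in the attached file; `uProjectorMatrix`, `sProjectedTex`, and the in/out names are all illustrative:

```glsl
// --- vertex shader ---
// Instead of dividing here, per vertex:
//   vUV = (projPos.xy / projPos.w) * 0.5 + 0.5;  // broken: result gets interpolated linearly
// pass the full homogeneous coordinate through to the fragment shader:
uniform mat4 uProjectorMatrix;   // projecting camera's view-projection matrix
in vec3 P;                       // vertex position
out vec4 vProjPos;

void main() {
    vProjPos = uProjectorMatrix * vec4(P, 1.0);  // no divide yet
    // ... regular gl_Position setup for the rendering camera goes here ...
}

// --- fragment shader ---
uniform sampler2D sProjectedTex;
in vec4 vProjPos;
out vec4 fragColor;

void main() {
    // perspective divide per pixel, then remap from NDC [-1, 1] to UV [0, 1]
    vec2 uv = (vProjPos.xy / vProjPos.w) * 0.5 + 0.5;
    fragColor = texture(sProjectedTex, uv);
    // GLSL's built-in textureProj() can also do the divide for you,
    // if you bake the 0.5 scale/offset into the matrix instead.
}
```

The key design point is that the divide now happens after rasterization, so every pixel gets its own correctly interpolated `vProjPos.w` instead of a linearly blended, already-divided UV.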
I'm not sure if anyone wanted the long, drawn-out technical answer, but there it is. Here's a modified version of the above file with a simple GLSL material that does this perspective divide in the fragment shader:
ProjectionFromCamPerspective.1.toe (2.5 MB)