I’m looking for a very quick pixel displacement shader: given a plane with UVs and a texture, I want to displace pixels according to the rgb of the texture (in various ways).
anyone has a quick starting point shader to do that?
thanks!
d
It depends on what you mean by pixel displacement. You can’t generate new pixels or move the pixel you are drawing. All you can do is choose a color and a depth. Any pixel that is drawn must originally come from geometry (that gets rasterized).
You can change how you sample your color texture based on a heightmap and the view angle to create the illusion of depth, as in this:
download.nvidia.com/developer/GP … 2_ch08.pdf
But the plane will still be flat when you start to look at it from oblique angles. So you can’t create bumps that are visible from the side (or occlude other geometry) in the pixel shader.
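For reference, the texture-coordinate-shifting idea described above looks roughly like this as a fragment shader. This is only a sketch of basic parallax (offset) mapping; the uniform/varying names, the height stored in the alpha channel, and the scale value are all assumptions, not code from the linked chapter:

```glsl
// Parallax offset mapping sketch -- shifts the texture lookup,
// it does not actually move any pixels.
uniform sampler2D colorMap;   // assumed: rgb = color, a = height
uniform float heightScale;    // assumed: something small, e.g. 0.04

varying vec2 uv;
varying vec3 viewTS;          // view direction in tangent space

void main()
{
    // Read height and re-center it around zero.
    float h = texture2D(colorMap, uv).a * heightScale - 0.5 * heightScale;
    vec3 v = normalize(viewTS);
    // Shift the UV along the view direction in proportion to the height.
    vec2 shiftedUV = uv + (v.xy / v.z) * h;
    gl_FragColor = vec4(texture2D(colorMap, shiftedUV).rgb, 1.0);
}
```

The key point is the last two lines: the geometry is untouched, only where you read the color from changes.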
You can displace vertices based on a texture, though, so if you have a very finely subdivided mesh you can do the displacement in the vertex shader instead. Keep in mind that you need some extra work to get correct normals, though.
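A minimal vertex-shader sketch of that idea, assuming hardware with vertex texture fetch (SM 3.0+), a finely tessellated plane, and a single-channel height map (the names are illustrative):

```glsl
// Vertex displacement from a texture -- moves real geometry,
// so silhouettes and occlusion are correct, unlike parallax tricks.
uniform sampler2D heightMap;     // assumed: height in the red channel
uniform float displaceScale;     // assumed: world-space displacement amount

void main()
{
    // Vertex texture fetch must specify an LOD explicitly.
    float h = texture2DLod(heightMap, gl_MultiTexCoord0.xy, 0.0).r;
    // Push the vertex along its normal by the sampled height.
    vec4 displaced = gl_Vertex + vec4(gl_Normal * h * displaceScale, 0.0);
    gl_Position = gl_ModelViewProjectionMatrix * displaced;
}
```

Note this leaves the original normals unchanged, which is exactly the "be tricky to get correct normals" problem: you would have to recompute them, e.g. by sampling neighboring heights.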
thanks - that’s weird - I thought cards from the last 2-3 years were able to do pixel displacement (slowly).
d
If you have an example where you saw real-time images that you thought were pixel displacement you could post a link here and I could try to tell you how they do it.
mmmm… I think I see:
http.developer.nvidia.com/GPUGem … ter08.html
does pixel displacement in the sense that it produces occlusion, but not real xyz displacement, right?
It occludes stuff within its own texture by sampling the color intelligently (essentially skipping parts of the texture and stretching others), but it doesn’t occlude any other geometry (not even geometry that was part of the same object).
In the end it’s just shifting around texture coordinates.
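To make the "skipping and stretching" concrete, the core of a parallax-occlusion-style lookup is a short ray march through the height field in the fragment shader. This is a hedged sketch, not the chapter's actual code; the step count, variable names, and height-in-alpha convention are assumptions:

```glsl
// Parallax occlusion sketch: step along the view ray until it dips
// below the height field, then sample the color at that UV.
uniform sampler2D colorMap;   // assumed: rgb = color, a = height
uniform float heightScale;

varying vec2 uv;
varying vec3 viewTS;          // view direction in tangent space

void main()
{
    vec3 v = normalize(viewTS);
    const int steps = 16;                         // assumed step count
    vec2 duv = (v.xy / v.z) * heightScale / float(steps);
    vec2 p = uv;
    float rayH = 1.0;                             // ray starts at the top
    for (int i = 0; i < steps; ++i) {
        if (texture2D(colorMap, p).a >= rayH)     // surface reached
            break;
        p -= duv;                                 // march the UV...
        rayH -= 1.0 / float(steps);               // ...and lower the ray
    }
    gl_FragColor = vec4(texture2D(colorMap, p).rgb, 1.0);
}
```

All the loop ever does is move `p`, the texture coordinate, which is why the surface still looks flat from oblique angles: no geometry is ever displaced.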