If you’ve ever set up a render network and wondered why the lighting and realism just feel off, or don’t match the reference renders in other software or Substance thumbnails, etc., you might have something to gain from this post!
I’ve done a quick search through the Facebook group, Discord, and the forums here, and found surprisingly little on the subject of linear vs gamma workflow, especially as it applies to 3D rendering pipelines.
I figured it would be worthwhile to start a conversation about this topic to clarify some things, ask some questions, and hopefully iron out a workflow by the end of it that can help with PBR renders, and even renders and image effects in general.
I’m writing the first part of this post with several assumptions about how Touch expects / handles data, so if anyone knows something I don’t please chime in! I’ll update anything that’s wrong in relation to how TouchDesigner works.
So, most of us know what gamma does functionally: you throw a Level TOP after an image, and gamma makes it brighter and more washed out, or darker and more contrasty.
I think most of us use gamma as a form of color / image adjustment, or maybe in more technical ways to adjust the distribution or shape of a ramp, etc. This makes sense because Touch is such a visual powerhouse; we can move sliders until we get a result we like and not worry so much about the underlying formulas.
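For reference, here’s a minimal sketch of what that slider is doing per channel, assuming the common out = in ** (1 / gamma) convention (which matches the brighter / darker behavior described above):

```python
# Per-channel gamma adjustment, assuming the usual out = in^(1/gamma)
# convention: gamma > 1 brightens, gamma < 1 darkens.
def apply_gamma(value, gamma):
    """value is a 0-1 channel value, gamma > 0."""
    return value ** (1.0 / gamma)

print(apply_gamma(0.5, 2.2))    # ~0.73 -- brighter, "washed out"
print(apply_gamma(0.5, 0.455))  # ~0.22 -- darker, more contrasty
```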
However, when it comes to 3D rendering, especially the PBR workflow - we plug a bunch of things into a Render TOP and a bunch of textures into a shader - some complicated magic happens - then we get a rendered image that we color correct, tweak, and finesse some more.
Since the PBR shading formula is such a complex one, this is where I’m concerned that there may not be enough information out there about the proper workflow for setting up these types of renders.
The short version of the problem is that PBR shaders expect to do their calculations on numbers in LINEAR space, not sRGB space.
This means all of our texture inputs that go into the PBR equation / shader need to be in linear space as well. What you get out of that shader is generally still in linear space and will look “wrong” even though it’s technically correct. More on this in a bit.
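To make that concrete, here are the exact sRGB transfer functions per the sRGB spec (plain Python) - the familiar gamma 2.2 is a close approximation of these curves:

```python
# Exact sRGB transfer functions, applied per channel on 0-1 values.
def srgb_to_linear(s):
    # decode: gamma-encoded sRGB -> linear, what the PBR math wants
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    # encode: linear render output -> display-ready sRGB
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

print(srgb_to_linear(0.5))    # ~0.214 -- sRGB mid-gray is only ~21% linear light
print(linear_to_srgb(0.214))  # ~0.500 -- and the round trip back
```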
Let’s say we’re using Substance materials, and for clarity, let’s just say we’re using exported textures as inputs into the PBR shader. Our export defaults in Substance look like this:
Basecolor is typically exported as sRGB and everything else as linear. What this means is that, to our eye, basecolor will “look” correct when we view the image, but it’s actually not in the correct space for the PBR shader (unless, of course, the shader is programmed to expect this). I expect that we need to toggle on the sRGB toggle in the Movie File In TOP, or apply a gamma of 0.455, for base color:
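A quick arithmetic check on that 0.455 number (assuming the Level TOP raises each channel to 1 / gamma): 1 / 0.455 ≈ 2.2, so a gamma of 0.455 approximately undoes the sRGB encode:

```python
c = 0.5                               # an sRGB-encoded channel value
approx = c ** (1 / 0.455)             # what a gamma of 0.455 does: ~0.218
exact = ((c + 0.055) / 1.055) ** 2.4  # exact sRGB decode: ~0.214
print(approx, exact)                  # close enough for most texture work
```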
Wes McDermott at Substance said it nicely:
A good way to look at it is like this…
If it’s a color that you see then it’s sRGB. Everything else is linear. Basecolor contains diffuse reflected color and metal reflectance values. Both represent light rays bouncing back and picked up by our eyes. Emissive is the same.
Maps like AO, metallic and roughness represent data. All of the grayscale maps need to be interpreted as linear. When we choose colors on our monitor, it’s in a gamma-encoded space (sRGB); this is why we need to have sRGB checked, so the shader knows that this map needs to have the gamma-encoded value removed.
It’s also worth noting here that, to represent linear space with high quality, 16-bit is recommended. This is especially true of normal maps. If you find a texture set online where things like normals are 8-bit, you’re likely going to get subpar results because of the information lost wherever that texture was exported from.
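Here’s a rough, back-of-the-envelope illustration (hypothetical numbers, not TouchDesigner-specific) of why 8 bits hurts normal maps - quantizing each component of a unit normal introduces an angular error that 16-bit all but eliminates:

```python
import math

def quantize(v, bits):
    # store a -1..1 component as an unsigned integer texel and read it back
    levels = (1 << bits) - 1
    q = round((v * 0.5 + 0.5) * levels) / levels
    return q * 2.0 - 1.0

n = (0.3, 0.4, math.sqrt(1 - 0.3**2 - 0.4**2))  # a unit-length normal
for bits in (8, 16):
    qn = [quantize(c, bits) for c in n]
    length = math.sqrt(sum(c * c for c in qn))
    cos_err = min(1.0, sum(a * b for a, b in zip(n, qn)) / length)
    print(f"{bits}-bit angular error: {math.degrees(math.acos(cos_err)):.4f} degrees")
```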
We also have environment light HDR files to think about. HDR files are generally saved in LINEAR space, which is why, when we drag one into TouchDesigner, it looks like this:
This “looks” incorrect, but the data itself is actually in linear space and technically correct. This is what PBR shaders expect for the HDR environment.
If I put a gamma of 2.2 on this, in Touch, the image now “looks correct”, but would be technically incorrect for the shader:
The trick here is to leave the HDR in linear space and pass that into the Environment Light COMP (again, assuming it’s not programmed to expect sRGB, which it probably isn’t).
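In code terms, the idea is that the gamma 2.2 is strictly a view transform - something like this sketch (the helper name is mine, not a TouchDesigner API):

```python
def view_transform(linear_value):
    # for your eyes only -- never feed this back into the PBR shader
    return min(linear_value, 1.0) ** (1 / 2.2)

hdr_texel = 4.7                      # linear HDR values can exceed 1.0
shader_input = hdr_texel             # pass through to the light untouched
preview = view_transform(hdr_texel)  # clipped + gamma'd just for display
```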
Sometimes an additional step called tone mapping happens before the final gamma correction to sRGB space. This is useful when lighting units are in real-world values and may exceed the 0-1 range that values must eventually fall into for display on a monitor. Tone mapping is the name for a curve-mapping function that takes a value and remaps it into 0-1 space while also redistributing luminance somewhat.
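For example, the simple Reinhard operator - one of the most common tone-mapping curves - is just x / (1 + x):

```python
# Simple Reinhard tone mapping: remaps unbounded linear values into 0-1.
def reinhard(x):
    return x / (1.0 + x)

for x in (0.18, 1.0, 4.0, 100.0):
    print(f"linear {x:>6} -> {reinhard(x):.3f}")
```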
All this is to say that, typically, the workflow with PBR is to make sure all of our assets are coming in as LINEAR, and the shader will render things out in LINEAR as well… then we tone map / gamma correct this result to its final, screen-ready state.
So, in graph summary form - ideally you are working like this:
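To make that graph concrete, here’s a toy per-channel version of the whole chain (using the 2.2 approximation for brevity - the exact sRGB curves are earlier in the post - and a made-up stand-in for the actual PBR math):

```python
def decode_srgb(s):             # sRGB-authored maps (e.g. basecolor) -> linear
    return s ** 2.2

def shade(albedo_linear):       # placeholder for the PBR equation
    return albedo_linear * 3.0  # pretend lighting pushed values past 1.0

def tonemap(x):                 # Reinhard: remap HDR linear into 0-1
    return x / (1.0 + x)

def encode_srgb(l):             # final gamma correction for display
    return l ** (1 / 2.2)

basecolor_srgb = 0.5
pixel = encode_srgb(tonemap(shade(decode_srgb(basecolor_srgb))))
print(pixel)  # ~0.66, the display-ready value
```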
All this said - I am not 100% certain about how Touch handles some of these things, only fairly certain from my testing. Here are some questions I still have:
- Do Touch’s PBR shaders expect all input maps to already be in linear space? Or is base color treated any differently?
- When using the Substance TOP with .sbsar files in Touch, we have the ability to plug it into a PBR shader to supply all textures in one node. Is base color being internally gamma corrected, or are all the maps passed in under the assumption that they’re already linear? Or does the Substance TOP have a way to know whether a map is sRGB or linear and correct it without user input?
- When we change the pixel format for a Substance TOP from the default of 8-bit to 16-bit, do we actually get higher-quality 16-bit output, or is the format simply 16-bit with no change to the data quality? When using a Substance Select TOP, I see that it properly extracts the 16-bit version from an .sbsar, so that method at least works.
- Does Touch’s Environment Light COMP expect the HDR to be in linear space?
- Does the Render TOP output linear space as well? Or does it gamma correct anything on the way out?
Would love to hear other people’s thoughts and experiences on this topic as well. If the devs have any input on how some of the backend stuff works, I’d be grateful for more info there too!
Resources and further reading:
GpuGems - Importance of Being Linear
ToneMapping
Gamma in HDR images
Colorspaces in Substance