Linear vs Gamma PBR workflows in Touch

If you’ve ever set up a render network and wondered why the lighting and realism just feels off, or doesn’t match the reference renders in other software or Substance thumbnails, you might have something to gain from this post!

I’ve done a quick search through the Facebook group, Discord, and the forums here, and found very little on the subject of linear vs gamma workflows, especially as they apply to 3D rendering pipelines.

I figured it would be worthwhile to start a conversation about this topic to clarify some things, ask some questions, and hopefully by the end of it iron out a workflow that can help PBR renders, and even renders and image effects in general.

I’m writing the first part of this post with several assumptions about how Touch expects / handles data, so if anyone knows something I don’t please chime in! I’ll update anything that’s wrong in relation to how TouchDesigner works.


So, most of us know what gamma is functionally: you throw a Level TOP after an image, and gamma makes it brighter and more washed out, or darker and more contrasty.


I think most of us use gamma as a form of color / image adjustment, or maybe in more technical ways to adjust the distribution or shape of a ramp. This makes sense because Touch is such a visual powerhouse: we can move sliders until we get a result we like, and not worry so much about the underlying formulas.
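For anyone curious what that slider is actually doing, a gamma adjustment is just a per-channel power curve. Here’s a minimal sketch of the standard formula in plain Python (not Touch’s actual implementation):

```python
def apply_gamma(value, gamma):
    """Apply a gamma curve to a normalized 0-1 channel value.

    gamma > 1 lifts the midtones (brighter, more washed out);
    gamma < 1 pushes them down (darker, more contrasty).
    """
    return value ** (1.0 / gamma)

mid_gray = 0.5
lifted = apply_gamma(mid_gray, 2.2)     # ~0.73, noticeably brighter
crushed = apply_gamma(mid_gray, 0.455)  # ~0.22, noticeably darker
```

Note that 0 and 1 map to themselves; gamma only reshapes the curve between the endpoints, which is why it reads as a midtone adjustment.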

However, when it comes to 3d rendering, especially the PBR workflow - we plug a bunch of things into a render top, and a bunch of textures into a shader - some complicated magic happens - then we get a rendered image that we then color correct, tweak, and finesse some more.

Since the PBR shading formula is such a complex one, I’m concerned that there isn’t enough information out there about the proper workflow for setting up these types of renders.


The short version of the problem is that PBR shaders expect to do their calculations on numbers in LINEAR space, not sRGB space.

This means all of our texture inputs that go into the PBR equation / shader need to be in linear space as well. What you get out of that shader is generally still in linear space and will look “wrong” even though it’s technically correct. More on this in a bit.


Let’s say we’re using Substance materials, and for clarity let’s say we’re using exported textures as inputs into the PBR shader. Our export defaults in Substance look like this:

Base Color is typically exported as sRGB and everything else as linear. What this means is that to our eye, Base Color will “look” correct when we view the image, but it’s actually not in the correct space for the PBR shader (unless, of course, the shader is programmed to expect this). I expect that we need to toggle on the sRGB flag in the Movie File In TOP, or apply a gamma of 0.455 (roughly 1/2.2) to Base Color:
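For the record, a gamma of 0.455 is the cheap approximation of the sRGB decode; the exact sRGB transfer function is piecewise, with a small linear toe near black. A sketch of both in plain Python (this is standard color math, not a Touch API):

```python
def srgb_to_linear(c):
    """Exact sRGB -> linear decode for a 0-1 channel value.
    This is what the GPU does when a texture is flagged as sRGB."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def srgb_to_linear_approx(c):
    """The quick version: a plain gamma-2.2 curve,
    equivalent to a Level TOP gamma of ~0.455."""
    return c ** 2.2

# The two agree closely through the midtones:
exact = srgb_to_linear(0.5)          # ~0.214
approx = srgb_to_linear_approx(0.5)  # ~0.218
```

They only diverge meaningfully near black, which is why the gamma-0.455 trick is usually good enough for eyeballing, while the GPU’s sRGB flag does the exact version.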

Wes McDermott at Substance said it nicely:
A good way to look at it is like this…

If it’s a color that you see then it’s sRGB. Everything else is linear. Basecolor contains diffuse reflected color and metal reflectance values. Both represent light rays bouncing back and picked up by our eyes. Emissive is the same.

Maps like AO, metallic and roughness represent data. All of the grayscale maps need to be interpreted as linear. When we choose colors on our monitor, it’s in a gamma-encoded space (sRGB); this is why we need to have sRGB checked, so the shader knows these maps need to have the gamma encoding removed.

It’s also worth noting here that to represent linear space with high quality, 16-bit is recommended. This is especially true of normal maps. If you find a texture set online where things like normals are 8-bit, you’re likely going to get subpar results because of the loss of information from wherever that texture was exported.
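The precision argument is easy to demonstrate: linear data quantized to 8 bits gets only 256 evenly spaced steps across the whole range, which is especially rough on normal maps, where tiny differences around 0.5 matter. A quick illustration of the quantization error in plain Python:

```python
def quantize(value, bits):
    """Round a 0-1 value to the nearest level representable
    at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A normal-map component just off flat (0.5 encodes a zero component):
n = 0.501
err_8bit = abs(quantize(n, 8) - n)    # step size ~1/255
err_16bit = abs(quantize(n, 16) - n)  # step size ~1/65535
```

The 8-bit error is orders of magnitude larger, which shows up as banding or faceting on smooth surfaces lit by the normal map.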


We also have Environment Light HDR files to think about. HDR files are generally saved in LINEAR space, which is why when we drag one into TouchDesigner, it looks like this:


This “looks” incorrect, but actually the data itself is in linear space, and technically correct. This is what PBR shaders expect for the HDR environment.

If I put a gamma of 2.2 on this in Touch, the image now “looks correct”, but it would be technically incorrect for the shader:

The trick here is to leave the HDR in linear space and pass it into the Environment Light COMP (again, assuming it’s not programmed to expect sRGB, which it probably isn’t).


Sometimes an additional step called tone mapping happens before the final gamma correction to sRGB space. This is useful when lighting units are in real-world values and may exceed the 0-1 range that values must eventually fall into for display on a monitor. Tone mapping is a curve-mapping function that takes a value and remaps it into 0-1 while also redistributing luminance somewhat.
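The simplest widely used curve of this kind is Reinhard, sketched below for illustration (any given renderer may use a different operator; this is just the textbook version in plain Python):

```python
def reinhard(luminance):
    """Reinhard tone mapping: compresses [0, inf) linear values
    into [0, 1). Dark values pass through almost unchanged;
    highlights roll off smoothly instead of clipping."""
    return luminance / (1.0 + luminance)

def linear_to_srgb(c):
    """The final display encode applied after tone mapping (exact sRGB)."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055

# An HDR value of 4.0 (well beyond displayable white) lands back in range:
mapped = reinhard(4.0)            # 0.8
display = linear_to_srgb(mapped)  # ~0.91, ready for the monitor
```

Without the tone map, that 4.0 would simply clip to white and all highlight detail above 1.0 would be lost.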



All this is to say that the typical PBR workflow is: make sure all of our assets come in as LINEAR, and the shader will render things out in LINEAR as well… then we tone map / gamma correct that result to its final screen-ready state.

So, in graph summary form - ideally you are working like this:


All this said, I am not 100% certain about how Touch handles some of these things, only fairly certain from my testing. Here are some questions I still have:

  1. Do Touch’s PBR shaders expect all input maps to already be in Linear space? Or is Base Color treated any differently?

2. When using the Substance TOP with SBSAR files in Touch, we can plug it into a PBR shader to supply all textures from one node. Is Base Color being internally gamma corrected, or are all maps passed in on the assumption that they’re already linear? Or does the Substance TOP have a way to know whether a map is sRGB or linear and correct it without user input?

3. When we change the Pixel Format for a Substance TOP from the default of 8-bit to 16-bit, do we actually get higher-quality 16-bit output? Or is the format simply 16-bit with no change to the data quality? When using a Substance Select, I see that it properly extracts the 16-bit version of an sbsar, so that method at least works.

  4. Does Touch’s Environment light COMP expect the hdr to be in linear space?

  5. Does the render TOP put out linear space as well? Or does it gamma anything on the way out?

Would love to hear other people’s thoughts and experiences on this topic as well. If the devs have any input on how some of the backend stuff works, I’d be grateful for more info there too!


Resources and further reading:
GpuGems - Importance of Being Linear
ToneMapping
Gamma in HDR images
Colorspaces in Substance


I’ve also uploaded a TOE file illustrating how this linear pipeline would look in Touch.


I actually wrote an article on sRGB recently here:
https://docs.derivative.ca/SRGB

To answer your questions directly here:

The shaders assume all the texture values they are getting are in linear space. So flagging an sRGB texture as sRGB in the Movie File In TOP lets the GPU know to convert that data to linear before giving it to the shader.

The Substance TOP should be converting sRGB textures to linear when it loads them, if it sees them tagged as such. Let me know if you don’t see this.

It won’t cause the Substance Engine to output at a higher bit depth; it just changes how the original results are stored when loaded into TD.

Yes, it expects it to be in linear. In general only 8-bit textures are worth storing in sRGB space to make better use of the limited bits (as described in the article I wrote). HDR data should always just be linear because there is no need for sRGB.

It outputs linear unless you choose the sRGB pixel format, in which case the data is ‘compressed’ as sRGB, but all downstream TOPs will get their values as linear since the GPU knows to convert back to linear when sampling from that texture. Thus, the brightness of the pixels doesn’t change, just what data is stored and what is discarded.
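To make that round trip concrete, here’s a plain-Python sketch of storing a linear value in 8-bit sRGB and decoding it again on sample (the function names are mine for illustration, not a TD API):

```python
def linear_to_srgb(c):
    """Exact linear -> sRGB encode."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055

def srgb_to_linear(c):
    """Exact sRGB -> linear decode (what the GPU does on sample)."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def store_8bit_srgb(linear):
    """Round-trip a linear value through 8-bit sRGB storage."""
    stored = round(linear_to_srgb(linear) * 255) / 255  # quantize to 8 bits
    return srgb_to_linear(stored)

# A dark value survives 8-bit sRGB storage far better than 8-bit linear,
# because the sRGB curve spends more of its 256 levels on the shadows:
dark = 0.002
srgb_err = abs(store_8bit_srgb(dark) - dark)
linear_err = abs(round(dark * 255) / 255 - dark)
```

So the stored bytes differ, but the sampled brightness is (very nearly) the same, with sRGB storage losing less in the darks, which matches the “better use of the limited bits” point above.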


@malcolm thank you! that’s an awesome write up, I totally missed that in my search.
Thanks for clarifying on these questions too.

When we change the Pixel Format for a substance top from the default of 8 bit to 16 bit, do we actually get 16 bit higher quality output? or is the format simply 16 bit with no change to the data quality? When using a substance select, I see that it properly extracts the 16 bit version of an sbsar, so that method at least works.

The pixel format parameter on the Substance TOP doesn’t do anything currently, and the sbsar textures are rendered at their default pixel format defined in the sbsar. This will be resolved in the upcoming 40k (next experimental), where we will override the textures’ default format when the pixel format parameter is changed to something other than “Use Input”.

thanks for the clarification!

@malcolm so I just checked out the question of whether or not the sbsar basecolor is gamma corrected when providing the shader with the entire substance top - and it seems to not be.

Here’s a direct comparison of the two shaders. Top is sourcing things manually, and I’m applying the gamma to Base Color by hand; bottom is a new PBR MAT where I just plug the Substance TOP directly in:

This is with my manual approach:

this is with the substance top -> pbr mat:

for reference, here’s the basecolor in sRGB space:

Perhaps the substance top is not flagging it correctly?

Thanks, can you share your files with us so we can look at this example directly?

Yeah, the above screenshots are from the file I uploaded to the community page above.

I just made a new PBR MAT and plugged the Substance TOP that was already in there into it:

Also not clear to me: does the shaded torus in the MAT viewer show an sRGB preview, or a linear-space preview?