Linear vs Gamma PBR workflows in Touch

If you’ve ever set up a render network and wondered why the lighting and realism just feel off, or don’t match the reference renders in other software or Substance thumbnails, you might have something to gain from this post!

I’ve done a quick search through the Facebook group, the Discord, and the forums here and found very little on the subject of linear vs gamma workflows, especially as it applies to 3D rendering pipelines.

I figured it would be worthwhile to start a conversation about this topic to clarify some things, ask some questions, and hopefully by the end of it iron out a workflow that can help PBR renders, and even renders and image effects in general.

I’m writing the first part of this post with several assumptions about how Touch expects / handles data, so if anyone knows something I don’t please chime in! I’ll update anything that’s wrong in relation to how TouchDesigner works.


So, most of us know what gamma does functionally: you throw a Level TOP after an image, and gamma makes it brighter and more washed out, or darker and more contrasty.
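Under the hood a gamma adjustment is just a per-channel power curve. Here’s a minimal numpy sketch (my own illustration, assuming normalized 0-1 values and the common convention where the gamma parameter maps to an exponent of 1/gamma):

```python
import numpy as np

def apply_gamma(pixels, gamma):
    """Raise each 0-1 value to the power 1/gamma:
    gamma > 1 lifts mid-tones (brighter, washed out),
    gamma < 1 pushes them down (darker, more contrasty)."""
    return np.clip(pixels, 0.0, 1.0) ** (1.0 / gamma)

mid_gray = np.array([0.5])
print(apply_gamma(mid_gray, 2.2))    # ~0.73, brighter
print(apply_gamma(mid_gray, 0.455))  # ~0.22, darker
```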


I think most of us use gamma as a form of color / image adjustment, or maybe in more technical ways to adjust the distribution or shape of a ramp, etc. This makes sense because Touch is such a visual powerhouse: we can move sliders until we get a result we like and not worry so much about the underlying formulas.

However, when it comes to 3D rendering, especially the PBR workflow, we plug a bunch of things into a Render TOP and a bunch of textures into a shader, some complicated magic happens, and then we get a rendered image that we color correct, tweak, and finesse some more.

Since the PBR shading formula is such a complex one, I’m concerned there may not be enough information out there about the proper workflow for setting up these types of renders.


The short version of the problem is that PBR shaders expect to do their calculations on numbers in LINEAR space, not sRGB space.

This means all of our texture inputs that go into the PBR equation / shader need to be in linear space as well. What you get out of that shader is generally still in linear space and will look “wrong” even though it’s technically correct. More on this in a bit.
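For concreteness, this is roughly what “converting to linear” means numerically. A minimal numpy sketch (my own illustration, not TouchDesigner’s internal code) of the exact sRGB decode alongside the common 2.2-gamma approximation:

```python
import numpy as np

def srgb_to_linear(c):
    """Exact sRGB decode, per channel, for values in 0-1."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def srgb_to_linear_approx(c):
    """Cheap approximation: a plain 2.2 gamma (what a Level TOP gamma of ~0.455 gives you)."""
    return np.asarray(c, dtype=np.float64) ** 2.2

print(srgb_to_linear(0.5))         # ~0.214
print(srgb_to_linear_approx(0.5))  # ~0.218
```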


Let’s say we’re using Substance materials, and for clarity let’s say we’re using exported textures as inputs into the PBR shader. Our export defaults in Substance look like this:

Base color is typically exported as sRGB and everything else as linear. What this means is that, to our eye, base color will “look” correct when we view the image, but it’s actually not in the correct space for the PBR shader (unless of course the shader is programmed to expect this). I expect that we need to turn on the sRGB toggle in the Movie File In TOP, or apply a gamma of .455 (roughly 1/2.2), for base color:

Wes McDermott at Substance said it nicely:
A good way to look at it is like this…

If it’s a color that you see then it’s sRGB. Everything else is linear. Basecolor contains diffuse reflected color and metal reflectance values. Both represent light rays bouncing back and picked up by our eyes. Emissive is the same.

Maps like AO, metallic and roughness represent data. All of the grayscale maps need to be interpreted as linear. When we choose colors on our monitor, it’s in a gamma-encoded space (sRGB); this is why we need to have sRGB checked, so the shader knows that the map needs to have the gamma encoding removed.
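As a cheat sheet, the rule of thumb above might be summarized like this (my own summary, not an official Substance table):

```python
# "Color you see" -> sRGB; "data" -> linear
TEXTURE_COLOR_SPACE = {
    "basecolor": "sRGB",
    "emissive":  "sRGB",
    "roughness": "linear",
    "metallic":  "linear",
    "ao":        "linear",
    "normal":    "linear",
    "height":    "linear",
}
```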

It’s also worth noting here that to represent linear space with high quality, 16-bit is recommended. This is especially true of normal maps. If you find a texture set online where things like normals are 8-bit, you’re likely going to get subpar results because of the loss of information from wherever that texture was exported.
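To put a number on that, here’s a small numpy sketch (my own illustration) of the angular error 8-bit quantization introduces into a nearly flat normal compared to 16-bit:

```python
import numpy as np

def quantize(v, bits):
    """Store a 0-1 value in an unsigned integer texture and read it back."""
    levels = 2 ** bits - 1
    return np.round(v * levels) / levels

# A normal tilted ~1 degree off straight-up, encoded the usual way (n * 0.5 + 0.5)
n = np.array([np.sin(np.radians(1.0)), 0.0, np.cos(np.radians(1.0))])
for bits in (8, 16):
    decoded = quantize(n * 0.5 + 0.5, bits) * 2.0 - 1.0
    decoded /= np.linalg.norm(decoded)
    error = np.degrees(np.arccos(np.clip(np.dot(n, decoded), -1.0, 1.0)))
    print(f"{bits}-bit storage: ~{error:.3f} degrees of error")
```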


We also have Environment Light HDR files to think about. HDR files are generally saved in LINEAR space, which is why when we drag one into TouchDesigner, it looks like this:


This “looks” incorrect, but actually the data itself is in linear space, and technically correct. This is what PBR shaders expect for the HDR environment.

If I put a gamma of 2.2 on this, in Touch, the image now “looks correct”, but would be technically incorrect for the shader:

The trick here is to leave the HDR in linear space and pass that into the Environment Light COMP (again, assuming it’s not programmed to expect sRGB, which it probably isn’t).


Sometimes an additional step called tone mapping happens before the final gamma correction to sRGB space. This is useful when lighting units are in real-world values and may exceed the 0-1 range that values must eventually fall into for display on a monitor. Tone mapping is a name for a curve-mapping function that takes a value and remaps it into the 0-1 range while also redistributing luminance somewhat.
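A classic example of such a curve is the Reinhard operator. A minimal sketch of tone mapping followed by the display gamma step (my own illustration, not what any particular TOP does internally):

```python
import numpy as np

def reinhard(x):
    """Simple Reinhard tone map: compresses [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def display_encode(x):
    """Approximate display encode with a 1/2.2 gamma."""
    return np.clip(x, 0.0, 1.0) ** (1.0 / 2.2)

hdr_values = np.array([0.05, 0.5, 1.0, 4.0, 10.0])  # linear, scene-referred
print(display_encode(reinhard(hdr_values)))
```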



All this is to say that typically the workflow with PBR is: we make sure all of our assets are coming in as LINEAR, and the shader will render things out in LINEAR as well… then we tone map / gamma correct this result to its final, screen-ready state.
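Stitched together, the order of operations might look something like this (a self-contained sketch of the idea; the “shading” here is just a made-up multiply by a light intensity, standing in for the real PBR math):

```python
import numpy as np

def srgb_to_linear(c):
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    c = np.clip(c, 0.0, 1.0)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1.0 / 2.4) - 0.055)

def tone_map(x):
    return x / (1.0 + x)  # Reinhard, just as an example curve

basecolor_srgb = np.array([0.8, 0.4, 0.2])  # authored in sRGB
light_intensity = 3.5                        # linear, free to exceed 1.0

lit_linear = srgb_to_linear(basecolor_srgb) * light_intensity  # decode, then shade in linear
final_pixel = linear_to_srgb(tone_map(lit_linear))             # tone map, then display encode last
print(final_pixel)
```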

So, in graph summary form - ideally you are working like this:


All this said, I am not 100% certain about how Touch handles some of these things, only fairly certain from my testing. Here are some questions I still have:

  1. Do Touch’s PBR shaders expect all input maps to already be in Linear space? Or is Base Color treated any differently?

  2. When using the Substance TOP with SBSAR files in Touch, we have the ability to plug it into a PBR shader to supply all textures in one node. Is Base Color being internally gamma corrected, or are all the maps getting passed in with the assumption that they are already in linear space? Or does the Substance TOP have a way to know whether something is sRGB or linear and correct it without user input?

  3. When we change the Pixel Format for a Substance TOP from the default of 8-bit to 16-bit, do we actually get higher-quality 16-bit output? Or is the format simply 16-bit with no change to the data quality? When using a Substance Select TOP, I see that it properly extracts the 16-bit version of an sbsar, so that method at least works.

  4. Does Touch’s Environment Light COMP expect the HDR to be in linear space?

  5. Does the Render TOP put out linear space as well? Or does it gamma-correct anything on the way out?

Would love to hear other people’s thoughts and experiences on this topic as well. If the devs have any input on how some of the backend stuff works, I’d be grateful for more info there too!


Resources and further reading:
GPU Gems: The Importance of Being Linear
ToneMapping
Gamma in HDR images
Colorspaces in Substance


I’ve also uploaded a TOE file illustrating how this linear pipeline would look in Touch.


I actually wrote an article on sRGB recently here:

To answer your questions directly here:

1. The shaders assume all the texture values they are getting are in linear space. So flagging an sRGB texture as sRGB in the Movie File In TOP lets the GPU know to convert that data to linear before giving it to the shader.

2. The Substance TOP should be converting sRGB textures to linear when it loads them if it sees them tagged as such. Let me know if you don’t see this.

3. It won’t cause the Substance Engine to output at a higher bit depth; it just changes how the original results are stored when loaded into TD.

4. Yes, it expects it to be in linear. In general, only 8-bit textures are worth storing in sRGB space, to make better use of the limited bits (as described in the article I wrote). HDR data should always just be linear because there is no need for sRGB.

5. It outputs linear unless you choose the sRGB pixel format, in which case the data is ‘compressed’ as sRGB, but all downstream TOPs will get their values as linear since the GPU knows to convert back to linear when sampling from that texture. Thus, the brightness of the pixels doesn’t change, just what data is stored and what is discarded. (There’s a small numeric sketch of points 4 and 5 below.)
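To illustrate points 4 and 5 numerically, here’s a small numpy sketch (my own illustration) showing that storing dark linear values directly in 8 bits collapses them onto a few codes, while an sRGB encode/decode round trip through the same 8 bits preserves them much better, without changing their brightness:

```python
import numpy as np

def srgb_encode(c):
    c = np.clip(np.asarray(c, dtype=np.float64), 0.0, 1.0)
    return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1.0 / 2.4) - 0.055)

def srgb_decode(c):
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def store_8bit(v):
    return np.round(np.clip(v, 0.0, 1.0) * 255.0) / 255.0

dark_linear = np.array([0.001, 0.002, 0.005, 0.01])

print("original:    ", dark_linear)
print("8-bit linear:", store_8bit(dark_linear))                            # darks merge
print("8-bit sRGB:  ", srgb_decode(store_8bit(srgb_encode(dark_linear))))  # darks survive
```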


@malcolm thank you! That’s an awesome write-up, I totally missed that in my search.
Thanks for clarifying these questions too.

When we change the Pixel Format for a Substance TOP from the default of 8-bit to 16-bit, do we actually get higher-quality 16-bit output? Or is the format simply 16-bit with no change to the data quality? When using a Substance Select TOP, I see that it properly extracts the 16-bit version of an sbsar, so that method at least works.

The pixel format parameter on the Substance TOP doesn’t do anything currently, and the sbsar textures are rendered at their default pixel format defined in the sbsar. This will be resolved in the upcoming 40k (next experimental), where we will override the textures’ default format when the pixel format parameter is changed to something other than “Use Input”.

Thanks for the clarification!

@malcolm so I just checked the question of whether or not the sbsar base color is gamma corrected when providing the shader with the entire Substance TOP, and it seems not to be.

Here’s a direct comparison of the two shaders: the top is sourcing things manually, with the gamma applied to base color by hand, and the bottom is a new PBR MAT where I just plug the Substance TOP directly in:

This is with my manual approach:

This is with the Substance TOP → PBR MAT:

For reference, here’s the base color in sRGB space:

Perhaps the Substance TOP is not flagging it correctly?

Thanks, can you share your files with us so we can look at this example directly?

Ya, the above screenshots are from the file I uploaded to the community page above.

I just made a new PBR MAT and plugged the Substance TOP that was already in there into it:

Also not clear to me: does the shader torus in the viewer show an sRGB preview, or a linear-space preview?

Currently everything in our UI shows the data as-is, so if the source content is linear, it’ll be darker than it should be on an sRGB monitor.


We looked into it and did find something amiss; in the next build (2020.24220+) the Substance TOP will properly flag that texture as sRGB.

@lucasm Maybe your asset should be updated to remove the extra conversion on the ThroneRoom_Stone’s baseColor once we post the new build?

Contact us if you want to try an internal build in the meantime to test.


@ben I’d love to test out a build and update the asset accordingly when that goes live!
Exciting to hear this will be fixed in the next build.

Have there been any developments or changes surrounding the normal style RFE?

Normals in sbsars are exported as either OpenGL or DirectX style. I’m not sure if it’s possible to detect the style, or if it makes sense to give users a toggle for reversing the normal style, but I have a feeling many normal maps get applied incorrectly, making renders and lighting look wrong in a very subtle, sneaky way.
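For context, the practical difference between the two styles is just the Y (green) channel of the tangent-space map being inverted, so a map of the “wrong” style can be fixed by flipping that channel before it reaches the shader. A small sketch of the idea (my own illustration):

```python
import numpy as np

def flip_normal_green(normal_map):
    """Convert between OpenGL- and DirectX-style tangent-space normal maps
    by inverting the green (Y) channel of a 0-1 encoded texture."""
    flipped = np.array(normal_map, dtype=np.float64, copy=True)
    flipped[..., 1] = 1.0 - flipped[..., 1]
    return flipped

texel = np.array([[[0.5, 0.7, 1.0]]])  # one RGB texel of a normal map
print(flip_normal_green(texel))        # green becomes 0.3
```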

Sorry, I had not seen that RFE; we’ll look into it.


We did not find a way to auto-detect this in the SDK, so there will be a toggle in the next posted build.


Woohoo! This is great news 🙂

Hi, and thank you lucasm for these valuable insights!
I’m currently in the process of refining my PBR renders, and while tweaking with HDR files I came across something I don’t fully understand.
I think it’s related to the topic, so I’ll allow myself to share it here, hoping it’s not out of place 🙂

Depending on the situation (generally when the HDR file is very bright), I sometimes get better visual results by putting a Level TOP with default parameters before the Environment Light’s map reference.

At first it really didn’t make sense to me, since a default Level TOP should act as if it were bypassed. But we can see a change in the highlights.
A TOP to CHOP clearly shows that the Level TOP crops the luminance values between 0 and 1, discarding the original high dynamic range (which goes above 10).
I’m surprised, because the Level TOP is in 32-bit float yet behaves like 8-bit.
I never read about anything like that in the wiki; are TOPs unable to work with high dynamic ranges?
I’m also wondering why this range crop actually makes some materials look better (a marble could appear totally blown out with those highlights).

In your Level TOP you have the clamp input to 0-1 option enabled, which will automatically clamp the incoming values regardless of your pixel format (8 vs 16 vs 32 bit).

Also, the latest stable build has some greatly improved HDR prefiltering for specular reflections; this especially helps with bright spots reflected at rougher roughness levels.

Clamping to 0-1 really takes away from what makes HDR environment lights so beautiful; the range, contrast, and nuance go away to a large degree. You’re right, though, that you definitely get a “cleaner” result at the expense of all that.

I’d clamp it to 10, or 16, or 20: something that’s roughly 5x the rough baseline. It really depends on the map, though; there’s no perfect value.
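In other words, keep some headroom instead of flattening everything to 1. A tiny sketch of the difference (the 10.0 cap is just an arbitrary example):

```python
import numpy as np

hdr_env = np.array([0.2, 1.5, 8.0, 250.0])  # linear environment values, sun spike at 250

print(np.clip(hdr_env, 0.0, 1.0))   # hard clamp: kills everything that made it HDR
print(np.clip(hdr_env, 0.0, 10.0))  # higher cap: tames the spike but keeps the range
```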

Oh, this little clamp switch is so obvious that I did not pay attention to it!
Thank you for your input.
