Original Author: Jonas-Norberg
Do we really need to care about Gamma?
Many of you might have read a lot about gamma correction already, and many of you might have figured out a way of wrangling the related problems that leaves no room for improvement. Still, I know that different studios, bound by different limitations, handle gamma in different ways, and I hope an enlightening discussion will follow the post.
We recently made our workflow gamma-aware, and the below points were important to us.
- Improve visual quality (through math)
- Don’t introduce artifacts
- Do not hurt performance
- Do not impact our artist workflows
- Maintain small compressed textures
- Maintain 32-bit back-buffers
Gamma is a part of your color workflow whether you like it or not, so you might as well embrace it. Let’s dive deeper.
Origins of Gamma
Gamma originates from the days of CRT monitors, and refers to the non-linear relationship between the voltage you present to your monitor and the light intensity it produces. Most of the time this can be expressed with a simple power function.
Light = Voltage ^ Gamma
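In code this decode step is just a power function. A quick Python sketch, using 2.2 as a representative gamma value:

```python
def crt_decode(voltage, gamma=2.2):
    """Light = Voltage ^ Gamma: map a normalized signal (0..1) to linear light."""
    return voltage ** gamma

# A mid-range signal of 0.5 yields only about 22% of full light output.
print(round(crt_decode(0.5), 3))
```

Note how strongly the curve compresses the low end: half the signal range gives you nowhere near half the light.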
The engineer’s knee-jerk reaction to hearing this might be to blame the monitors and suggest that we fix this problem in the monitor or close to it (the graphics card). Not so fast, Sherlock; listen to what Charles Poynton says:
The non-linearity of a CRT is very nearly the inverse of the lightness sensitivity of human vision. The non-linearity causes a CRT’s response to be roughly perceptually uniform. Far from being a defect, this feature is highly desirable.
Turns out this non-linear color encoding is actually useful. And the world of computer color is slowly converging on a standard color space called “sRGB”.
Gamma vs. sRGB
Most likely the computer you’re using to read this post is calibrated to sRGB. Images you see on the Internet are encoded in sRGB too.
Technically, sRGB is a color space that cannot be expressed exactly with a simple power function, since it has a linear piece close to black. In practice though, you’re fine using gamma 2.2 for most applications.
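For the curious, here is what that difference looks like in code; a small sketch of the exact piecewise sRGB decode next to the plain power approximation:

```python
def srgb_to_linear(c):
    """Exact sRGB decode: a linear segment near black, a power curve elsewhere."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    """The common approximation: a plain gamma-2.2 power function."""
    return c ** 2.2

# The two agree closely over most of the range...
print(round(srgb_to_linear(0.5), 4), round(gamma22_to_linear(0.5), 4))
# ...but diverge near black, where sRGB is linear and the power curve is not.
print(srgb_to_linear(0.01), gamma22_to_linear(0.01))
```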
The need for Linear
Even though sRGB is great for storing and presenting colors, there are things it’s not good for. Whenever we need to blend two colors together, or do any mathematical operation on a color, we should do it in linear color space.
Examples of operations that need to be done in linear space include:
- filtering (i.e. blur)
- light calculations
- fade in / out
I know there are some overlapping concepts in the list above, but still, most things you do with colors should be done in linear space.
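To make it concrete, here is a small Python sketch (using the gamma-2.2 approximation of sRGB) that averages two colors both ways:

```python
def to_linear(c):
    return c ** 2.2          # sRGB-ish decode (gamma-2.2 approximation)

def to_srgb(c):
    return c ** (1 / 2.2)    # matching encode

a, b = 0.1, 0.9  # two sRGB-encoded intensities

naive = (a + b) / 2                                   # average the encoded values
correct = to_srgb((to_linear(a) + to_linear(b)) / 2)  # average the actual light

print(round(naive, 3), round(correct, 3))  # the naive blend comes out darker
```

The naive average lands at 0.5 while the gamma-correct one lands around 0.66; that gap is exactly the kind of error that creeps into blurs, blends and fades.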
This is all well known in film, and many film houses use linear image formats (like OpenEXR) to keep their whole workflow linear.
Attack the Core
Bear with me for a second as I give in to the engineering knee-jerk reaction I mentioned earlier. Let us try to attack the Gamma problem at the root.
All graphics-cards now have a gamma-ramp function. Let’s use it to undo the perceived weakness of the monitor.
(As a note: if you follow the instructions of the “calibrate display color” wizard you do not undo the monitor gamma, but calibrate to sRGB.)
Ok, so we tweak the gamma ramp to undo the monitor gamma, and now we have a great linear view of our linear assets. If I author a texture in my favourite graphics package I will end up with a linear asset. All the math in my tools and shaders can be linear, and all should be good.
This might seem like a decent idea, but we can already recognize some obvious flaws. Let’s see:
1. The surrounding world is not linear. This means most ways of acquiring colors (cameras, scanners, images from texture CDs, images from the internet) will give you colors encoded in sRGB.
2. Every tool in your workstation (including your internet browser) will look bad because they are all tuned for sRGB.
3. We cannot force the consumer to apply the same gamma curve we just did.
This doesn’t feel 100% legit, but let’s see if we can muscle through these issues. A few minor problems haven’t stopped us in the past.
Keeping it Linear
The problem of the consumer having his OS calibrated to sRGB can actually be handled by using a present flag in DirectX 9 (D3DPRESENT_LINEAR_CONTENT).
The downside now is that the developer (you) has double gamma correction going on while testing the title: we gamma correct once with the present flag, and a second time with the hardware gamma curve. This doesn’t look good. Still, let’s move on.
If you keep textures in linear space, you quickly notice that lossy-compression schemes start breaking down. The compression becomes unexpectedly harsh in dark areas.
Both DXT and our own DCT-based solution (see my last post) rely on the color space being closely related to our perception. So most texture compression tools really need sRGB textures.
The light areas are fine and the dark areas suffer.
It’s even worse with DXT compression, which uses only 5 and 6 bits per channel for its endpoint colors. This low bit depth is borderline acceptable in sRGB but becomes unusable in linear.
Ok, so compressed textures need to be in sRGB. What if we convert them to linear after de-compressing them?
Even if the compression now works fine, we quickly realize that 8 bits per channel is not good enough for linear. Banding galore. The same is true for the frame-buffers: if we have 8 bits per linear channel in our frame-buffer, we will have banding. The present flag I mentioned earlier is thus only useful if your back-buffer has higher precision than 8 bits per channel.
This is true for both textures and frame-buffers, and it is why film houses go to 16 bits per channel. You don’t need HDR to appreciate 16 bits per channel; you just need linear.
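The banding problem is easy to quantify: just count how many of the 256 code values land in the darkest tones. A small illustrative count, using the gamma-2.2 approximation for the sRGB curve:

```python
def codes_below(light_level, decode):
    """Count 8-bit code values that decode to at most the given linear light."""
    return sum(1 for i in range(256) if decode(i / 255) <= light_level)

linear_codes = codes_below(0.01, lambda c: c)       # straight linear encoding
srgb_codes = codes_below(0.01, lambda c: c ** 2.2)  # gamma-2.2 encoding

# sRGB spends roughly ten times as many codes on the darkest 1% of light.
print(linear_codes, srgb_codes)
```

With linear 8-bit encoding only a handful of codes cover the darkest shades, which is where the eye is most sensitive; hence the banding.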
This looks bad… we can’t store textures in linear, and we don’t want to take the plunge going to 16 bits per channel because of the memory and performance cost.
Mixed Color Spaces
To get around this we have to keep textures and frame-buffers in sRGB and convert to linear in our shaders, do our math, and then convert back before writing.
This conversion business would add some code to every texture fetch and some code to every final color. Always looking for corners to cut, we realize that using gamma 2.0 instead of sRGB allows some optimization: we can use a square (x*x) when reading a color and a square root before writing. This has been proven to work, and you might need to resort to it on some platforms. Close enough for rock and roll… or video games.
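The gamma-2.0 shortcut in sketch form; the point is that a multiply and a square root are much cheaper than a full pow():

```python
import math

def read_gamma2(encoded):
    """Cheap decode on texture read: one multiply instead of a pow()."""
    return encoded * encoded

def write_gamma2(linear):
    """Cheap encode on color write: one square root."""
    return math.sqrt(linear)

# The round trip is exact, and the curve stays in the same ballpark as sRGB.
c = 0.5
print(write_gamma2(read_gamma2(c)))
```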
The color connoisseurs among you will notice one limitation: the texture fetch often uses bi- or tri-linear interpolation to get you a color, and if you do the gamma correction in the shader, the interpolation is still not aware of gamma.
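You can see the size of that error with a tiny example: interpolating halfway between a black and a white texel in encoded space and linearizing afterwards, versus linearizing first (gamma-2.2 approximation):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def to_linear(c):
    return c ** 2.2  # gamma-2.2 approximation of the sRGB decode

black, white = 0.0, 1.0  # sRGB-encoded texel values

# Hardware interpolates the raw sRGB texels, shader linearizes afterwards:
wrong = to_linear(lerp(black, white, 0.5))

# Correct order: linearize the texels first, then interpolate:
right = lerp(to_linear(black), to_linear(white), 0.5)

print(round(wrong, 3), round(right, 3))
```

Halfway between black and white should be 50% light, but the wrong order gives you only about 22%.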
Your hardware does know about sRGB
Luckily we stand on the shoulders of some pretty smart giants here. Both OpenGL and DirectX provide us with some pretty nifty functionality. If you tell your API of choice that your texture is in sRGB format (for DirectX 9, D3DSAMP_SRGBTEXTURE), the texel-sampler hardware can do the conversion for you. In the shader you will receive a fancy all-linear color.
Not only can our hardware do the linearization of sRGB textures, it can also write to sRGB frame-buffers (for DirectX 9, D3DRS_SRGBWRITEENABLE).
Even the alpha blending with the destination buffer (i.e. multi-pass rendering) can be gamma correct, if the sRGB conversion happens after the blend (for DirectX 9, D3DPMISCCAPS_POSTBLENDSRGBCONVERT).
Technically the card is allowed to implement the linearization after the interpolation, but most DX10-compatible cards and above do this correctly and interpolate in linear space.
Testing your Gamma Correctness
A neat test case is to compare a black-white checkerboard texture with a solid grey representing half luminance. The different parts of the image should appear very similar in lightness. If you use an LCD monitor, beware that the lightness will greatly depend on the viewing angle. View from straight on before judging.
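The numbers behind the test, for the record (gamma-2.2 approximation; exact sRGB shifts the result a couple of steps):

```python
def to_srgb_byte(linear):
    """Encode linear light to an 8-bit value, gamma-2.2 approximation."""
    return round(255 * linear ** (1 / 2.2))

# A fine black/white checkerboard blurs (in the eye, or at a distance)
# to 50% linear light...
checker_average = (0.0 + 1.0) / 2

# ...which encodes to roughly 186 (188 with exact sRGB), not the "obvious" 128.
print(to_srgb_byte(checker_average))
```

So if your half-luminance grey patch is the value 128, the comparison will look wrong even in a perfectly gamma-correct pipeline.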
Let us look at the difference between doing regular old texture reads and writes, and then reading and writing sRGB-aware.
With these two features all the puzzle pieces fall into place.
Summary of our Gamma Workflow
- Textures are sRGB
- Frame-buffers are sRGB
- Shaders read sRGB
- Shader operations done in linear space (lighting, box filters, post effects)
- Shaders write sRGB
So what happened to the “gamma adjust” you can do in your driver? It is still useful, but is now only used to go from the sRGB color space to whatever your monitor is.
Things to look out for
- You still have to look at your rendering package. Make sure it is set up in a gamma-correct way, taking your sRGB textures into account, and make sure the resulting textures are sRGB.
- Calibrate your monitors to the sRGB standard (the Windows 7 standard profile), so that when you work on your assets, what looks “right” to your eyes is saved out as a proper sRGB texture asset on disk.
- Many graphics-packages are not fully gamma aware, so resizing an image in (old versions of) Photoshop will change the colors! (try with checkerboard)
- If you generate mipmaps manually, make sure to do the filtering in linear space (this is not the standard behavior), but store out the mip chain colors in sRGB.
The checkerboard test image is a good test for mip map generation too. Below we do the sRGB reads and writes at run time, but the mip maps in the first example were created with no regard for gamma.
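A gamma-correct 2x2 box filter for mip generation is only a few lines; a sketch using the gamma-2.2 approximation:

```python
def to_linear(c):
    return c ** 2.2          # gamma-2.2 approximation of the sRGB decode

def to_srgb(c):
    return c ** (1 / 2.2)    # matching encode

def downsample_texel(texels):
    """Average a 2x2 block of sRGB-encoded texels in linear space."""
    linear_avg = sum(to_linear(t) for t in texels) / len(texels)
    return to_srgb(linear_avg)

# A black/white block averages to 50% light, which encodes to ~0.73,
# while naive sRGB-space averaging would give 0.5 (too dark on screen).
print(round(downsample_texel([0.0, 1.0, 0.0, 1.0]), 3))
```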
In the end we met the goals we started with:
- All our shaders are doing their thing in linear space
- We are able to use compressed textures
- We are using only 32-bit back-buffers
- Artists keep their machines set up the way they are used to
- We add no extra shader instructions on modern GPUs
References / Further Reading