Tool Post: Texture Compiler
All engines run off generated assets. The most advanced renderer on the planet is meaningless if all you can do is draw a programmer-defined cube with it. These assets are created with artist tools such as Maya or 3ds Max, but aren't necessarily loaded into the game in those formats. Try parsing a Wavefront .obj model every time you want to load an asset and you'll see what I mean, it's damn slow. Engines tend to run off their own optimized formats that are created from source material by a resource pipeline: a set of tools that converts artist-created models, audio clips, textures etc. into a format that is optimal for the engine to locate, load and work with. In addition, the resource pipeline may bundle engine- and game-specific data into these formats for use later on.
The first tool I created was a texture compiler. Now loading in raw .png files and using them as textures isn't the most horrible thing that could be done. But it does have problems as you'll see later on in this post. It appears trivial at first, but there's a bit of work that needs to be done with source textures before you're ready to render with them properly. Chief among the concerns is the issue of Gamma Correction.
Gamma Correction
There are TONS of resources on this subject now, but I'll include the briefest explanation.
So, from what I can gather, the story goes like this. Once upon a time we had CRT monitors, and it was noted that the physical output of the beam varied non-linearly with the input voltage applied. What this means is that if you wanted to display the middle grey between pitch black and pure white, and you input the RGB signal (0.5, 0.5, 0.5), you wouldn't get the middle grey you would expect. If you measured the output brightness, you got something along the lines of (0.22, 0.22, 0.22). Worse still, this phenomenon actually causes colour shifting(!). Observe... I enter (0.9, 0.5, 0.1) and I get roughly (0.79, 0.22, 0.006); the red becomes far more dominant in the result.
When plotted on a graph, the relationship could be viewed thus:
Note the blue line; this is the monitor's natural gamma curve. Also note that I've used 2.2 as the power exponent. The exact exponent actually varies from monitor to monitor, however 2.2 is generally close enough to correct that it can be used in most cases.
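To make those numbers concrete, here's a tiny standalone sketch (my own, not part of the tool) that models the display response as output = input ^ 2.2 and reproduces the values above:

```cpp
#include <cmath>
#include <cstdio>

// Approximate CRT-style display response: output brightness = input ^ 2.2.
float displayResponse(float input) { return std::pow(input, 2.2f); }

int main() {
    // Middle grey in, roughly 0.22 out.
    std::printf("0.5 -> %.2f\n", displayResponse(0.5f));

    // (0.9, 0.5, 0.1) in, roughly (0.79, 0.22, 0.006) out - red dominates far more.
    std::printf("(0.9, 0.5, 0.1) -> (%.2f, %.2f, %.3f)\n",
                displayResponse(0.9f), displayResponse(0.5f), displayResponse(0.1f));
}
```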
Nowadays, most displays aren't CRT. But, in the interest of backwards compatibility, modern monitors emulate this gamma curve.
But how come all the images you see aren't darker than they should be?
Well, that's because a lot of image formats these days are pre-gamma-corrected (JPEG and PNG being two of them). That means the internal image values are mapped onto the green line in the graph, basically raised to the power of 1 / 2.2. This has the effect of cancelling out the monitor's gamma when displayed to the user, so at the end you see the image values as they were intended. Which is great when all you're doing is viewing images, but it causes some serious (and subtle) issues when rendering, because all of the operations that occur during rendering assume linear relationships. Obvious examples are texture filtering, mipmap generation, alpha blending and lighting.
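To see why linearity matters, here's a small sketch (using the same 2.2 power-curve approximation rather than the exact sRGB transfer function) comparing a 50/50 blend of a black and a white texel done directly on the encoded values against one done in linear space:

```cpp
#include <cmath>
#include <cstdio>

// Simple power-curve approximations of the encode/decode steps (2.2 in place of
// the exact sRGB transfer function).
float toLinear(float encoded) { return std::pow(encoded, 2.2f); }
float toEncoded(float linear) { return std::pow(linear, 1.0f / 2.2f); }

int main() {
    float black = 0.0f, white = 1.0f; // gamma-encoded texel values

    // Wrong: average the encoded values directly, which is what naive
    // filtering/blending ends up doing.
    float naive = (black + white) * 0.5f; // 0.5 encoded

    // Right: decode to linear, average, then re-encode for display.
    float linear = toEncoded((toLinear(black) + toLinear(white)) * 0.5f); // ~0.73 encoded

    std::printf("naive blend: %.2f, linear blend: %.2f\n", naive, linear);
}
```

The naive blend lands on 0.5, which the monitor then darkens to around 22% brightness; the linear blend re-encodes to about 0.73, which displays as the half-brightness grey you actually wanted.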
Why didn't they keep the image formats linear and just pre-gamma correct before outputting to the display? What's with the inverse gamma curve on the images? The answer is another history lesson: it turns out, by lucky coincidence (which was actually purposeful engineering), that raising the image values to the reciprocal gamma exponent has the side effect of allocating more bits of precision to darker values. This makes sense, as humans are more adept at seeing differences between dark tones than differences between light tones. In a way, it makes sense to still have it around.
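As a rough illustration of that bit-allocation argument (again using the 2.2 approximation, and picking the darkest tenth of the linear range as an arbitrary cutoff), you can count how many of the 256 possible 8-bit codes land in that dark region under each storage scheme:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    int linearCodes = 0, encodedCodes = 0;
    for (int i = 0; i < 256; ++i) {
        float v = i / 255.0f;
        // Storing linear values: code i represents linear intensity v.
        if (v < 0.1f) ++linearCodes;
        // Storing gamma-encoded values: code i represents linear intensity v^2.2.
        if (std::pow(v, 2.2f) < 0.1f) ++encodedCodes;
    }
    std::printf("codes below 0.1 linear - stored linearly: %d, gamma-encoded: %d\n",
                linearCodes, encodedCodes); // roughly 26 vs 90
}
```

With gamma-encoded storage roughly 90 codes cover those dark tones, versus only about 26 if the values were stored linearly.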
What this all comes down to is that we have to deal with these non-linear inputs and outputs on either side of our linear rendering pipeline.
The Texture Compiler
Wow, a seriously complicated tool yeah? It's about as basic an interface as you can get for working with textures. Most of what's interesting happens in the background. It basically works as follows.
The tool maintains a list of textures in an intermediate format, which can be saved out to a texture asset file (*.taf). This enables you to load up an existing asset file, add images, remove images, rename, change a parameter and so on, then save again. Then, when you want to export to the format the engine consumes, you select which platforms you want to publish to (right now it's just PC OpenGL) and hit the Publish button. This generates a very simple database; it's basically a two-part file: the index part and the data part.
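I haven't described the published format in detail, but conceptually it's something like the following; the struct names and fields here are hypothetical, just to sketch the index-plus-data idea:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical sketch of the published two-part file: a small index that can be
// loaded up-front, followed by a data section that is streamed from on demand.
struct TextureIndexEntry {
    std::string name;          // how assets refer to the texture
    uint64_t    offset;        // byte offset into the data section
    uint64_t    sizeOnDisk;    // size of the (possibly compressed) entry
    uint32_t    width, height;
    bool        wasCompressed; // did the publish step manage to compress it?
};

struct TextureDatabase {
    std::vector<TextureIndexEntry> index; // the index part, loaded at engine start
    // The data part stays on disk; entries are read via offset/sizeOnDisk as needed.
};
```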
When the engine loads up, it only loads the texture database's index. Then, when an asset is encountered that requests a texture, the following process occurs. The engine queries the resident texture set; if the texture has already been loaded onto the GPU, it's returned immediately. If it hasn't been loaded yet, the texture database index is queried for the offset of the desired texture within the texture database file. The raw database entry is loaded and, if it was successfully compressed by the publishing tool, it's decompressed into raw image data. That raw image data is then compressed to whatever GPU-friendly compression format is supported and sent off to the GPU. If a texture is requested that isn't inside the texture database, a blank white texture is returned instead.
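In code, that lookup flow might look roughly like this; everything here (names, containers, the stubbed-out compression and upload steps) is a simplified stand-in rather than the engine's real API:

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical types for the sketch.
struct GpuTexture { unsigned handle; };

struct DatabaseEntry {
    uint64_t offset = 0, sizeOnDisk = 0;
    bool     wasCompressed = false;
};

static std::unordered_map<std::string, GpuTexture>    residentSet;   // already on the GPU
static std::unordered_map<std::string, DatabaseEntry> databaseIndex; // loaded at startup
static GpuTexture blankWhiteTexture{1};

// Stubs standing in for the real work: streaming the entry from the data section,
// undoing the publish-time compression, and GPU-format compression plus upload.
std::vector<uint8_t> readBlob(const DatabaseEntry&)                 { return {}; }
std::vector<uint8_t> decompress(const std::vector<uint8_t>& in)     { return in; }
GpuTexture           compressAndUpload(const std::vector<uint8_t>&) { return GpuTexture{2}; }

GpuTexture& requestTexture(const std::string& name)
{
    // 1. Already resident? Return it immediately.
    auto resident = residentSet.find(name);
    if (resident != residentSet.end())
        return resident->second;

    // 2. Not in the database at all? Fall back to the blank white texture.
    auto entry = databaseIndex.find(name);
    if (entry == databaseIndex.end())
        return blankWhiteTexture;

    // 3. Load the raw entry, decompress it if the publish step compressed it,
    //    then re-compress to a GPU-friendly format and upload.
    std::vector<uint8_t> raw    = readBlob(entry->second);
    std::vector<uint8_t> pixels = entry->second.wasCompressed ? decompress(raw) : raw;
    return residentSet[name] = compressAndUpload(pixels);
}
```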
It should be noted that the textures inside the texture database file are already in linear space. If you look at the tool's screenshot, you'll see that there's a "Gamma correct on publish" option. That simply tells the tool to raise the texture values to the desired power (in this case 2.2) on publish, bringing the values back into linear space. Then all of the automatic mipmap generation and texture filtering in the API and on the GPU will be correct from the get-go. It's an option specifically because for some textures you don't want to gamma correct; normal maps, for instance, tend not to be pre-corrected, i.e. they are already linear. Because our inputs are now linear, and our internal operations are linear, all that's required at the end of the rendering pipeline is to apply the inverse gamma correction to the framebuffer and... that's a bingo!
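As a minimal sketch of what that amounts to (assuming a plain 2.2 power curve rather than the exact sRGB transfer function), the publish-time correction and the end-of-frame correction boil down to this:

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Publish-time linearization: decode the artist's gamma-corrected texels back into
// linear space, so mipmap generation, filtering and lighting operate on linear values.
// Textures that are already linear (normal maps, for example) skip this step.
void linearizeOnPublish(std::vector<uint8_t>& rgba, float gamma = 2.2f)
{
    for (std::size_t i = 0; i < rgba.size(); ++i) {
        if (i % 4 == 3) continue; // alpha is stored linearly already, leave it alone
        float v = rgba[i] / 255.0f;
        rgba[i] = static_cast<uint8_t>(std::pow(v, gamma) * 255.0f + 0.5f);
    }
}

// End of the pipeline: apply the inverse gamma to the final linear value before display
// (in a real renderer this lives in a shader, or is handled by an sRGB framebuffer).
float encodeForDisplay(float linear, float gamma = 2.2f)
{
    return std::pow(linear, 1.0f / gamma);
}
```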
Just as an addendum on the whole linear pipeline topic, note that the alpha channel of gamma-corrected (sRGB) textures is already linear and therefore needs no correction. Aaaaand also that while storing linear textures has its advantages, you won't be allocating as many bits of precision to the lower intensity light values. There are a few ways to go about fixing this (such as moving from 8 bits per channel to 16). Having said that, I haven't really noticed any glaring artifacts, as the textures we're using for our game are all bright and colorful, so it's alright :)