I was surprised by how easy it was to interpolate over the texture coordinates, given the barycentric coordinate space. I have more boilerplate code to convert the mesh vertices into cgmath vectors than there is code to interpolate the triangles! I'll refactor it all away after the renderer's features begin to stabilize. With a little gamma and luminance love, I now have nearest-neighbor texture mapping:
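For the curious, the interpolation itself boils down to a weighted sum. Here's a minimal sketch, assuming the rasterizer has already produced the barycentric weights for the pixel; the function name and signature are mine, not the renderer's actual code:

```rust
use cgmath::{Vector2, Vector3};

/// Interpolate per-vertex texture coordinates across a triangle.
/// `bary` holds the (λ0, λ1, λ2) barycentric weights for the current pixel,
/// and `uv` holds the texture coordinates at each of the three vertices.
fn interpolate_uv(bary: Vector3<f32>, uv: [Vector2<f32>; 3]) -> Vector2<f32> {
    uv[0] * bary.x + uv[1] * bary.y + uv[2] * bary.z
}
```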
The gamma correction was crucial, since this texture went through two separate processing passes: first, I dropped all chrominance information from the RGB, leaving just the relative brightness; and second, the global illumination was applied to the texture-mapped geometry, as you might imagine. To get it working right, the RGB components are transformed from Gamma Space to Linear Space prior to the luminance transformation. Then the texture stays in Linear Space until after the final illumination pass. The pixels are transformed back to Gamma Space as they are written to the frame buffer.
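In code, the round trip looks something like this. It's only a sketch; the plain 2.2 power curve and the Rec. 709 luminance weights are assumptions on my part, since the exact curve isn't spelled out above:

```rust
/// Gamma decode/encode using a simple 2.2 power curve (an assumption;
/// a true sRGB transfer function has a small linear segment near black).
fn to_linear(c: f32) -> f32 {
    c.powf(2.2)
}

fn to_gamma(c: f32) -> f32 {
    c.powf(1.0 / 2.2)
}

/// Relative luminance from linear RGB, using Rec. 709 weights.
fn luminance(r: f32, g: f32, b: f32) -> f32 {
    0.2126 * r + 0.7152 * g + 0.0722 * b
}

/// Gamma Space texel -> Linear Space luminance, ready for the lighting pass.
/// The shaded result goes back through `to_gamma` when it's written to the
/// frame buffer.
fn texel_to_linear_luma(r: u8, g: u8, b: u8) -> f32 {
    luminance(
        to_linear(r as f32 / 255.0),
        to_linear(g as f32 / 255.0),
        to_linear(b as f32 / 255.0),
    )
}
```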
You'll also notice I haven't removed the gradients yet. They'll remind me to wrap up the post processing later on. Unfortunately, the evenly spaced gradient stops that I've been using so far are TERRIBLE for this particular frame buffer. The pixels are all too dark, so they fall into the lower half of the spectrum, where there are only two shades: black and mid-gray. I had always planned to make the gradient code flexible enough to work with the stops placed anywhere in the normalized range; I just haven't tried it yet. This kind of image is exactly why that flexibility is needed: pushing the stops closer to black will give much better dithering results.
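A flexible-stops gradient might look something like the following. This is a hypothetical sketch of the idea, not my current code; `sample_gradient` and the example stop placement are made up to illustrate it:

```rust
/// Map a normalized value through gradient stops placed at arbitrary
/// positions. Each stop is (position, shade); stops are assumed sorted by
/// position, spanning 0.0..=1.0, with no duplicate positions.
fn sample_gradient(stops: &[(f32, f32)], t: f32) -> f32 {
    let t = t.clamp(0.0, 1.0);
    for pair in stops.windows(2) {
        let (p0, s0) = pair[0];
        let (p1, s1) = pair[1];
        if t <= p1 {
            let f = (t - p0) / (p1 - p0);
            return s0 + (s1 - s0) * f;
        }
    }
    stops.last().map(|&(_, s)| s).unwrap_or(t)
}

// For a dark frame buffer, bunching the stops toward black leaves more
// usable shades in the range where the pixels actually live, e.g.:
// let stops = [(0.0, 0.0), (0.15, 0.25), (0.35, 0.5), (0.6, 0.75), (1.0, 1.0)];
```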
Also worth mentioning: the texture is a massive 1024x1024. That's 16x larger than my frame buffer! I don't have to do any kind of fancy texture filtering, though. Nearest neighbor is perfect for my use case.
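Nearest neighbor really is as simple as it sounds; a sketch, assuming normalized UVs and a square, row-major grayscale texture:

```rust
/// Nearest-neighbor texture lookup: scale the interpolated UV straight to
/// texel indices and round. No filtering at all.
fn sample_nearest(texture: &[f32], size: usize, u: f32, v: f32) -> f32 {
    let x = ((u * (size - 1) as f32).round() as usize).min(size - 1);
    let y = ((v * (size - 1) as f32).round() as usize).min(size - 1);
    texture[y * size + x]
}
```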
The next thing I did was implement viewport and projection matrices. There's not a whole lot to talk about though. So here's a picture:
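For reference, a viewport matrix built with cgmath looks roughly like this. It's a sketch of the common formulation (column-major, as cgmath expects), with the depth remap left out for brevity; it may not match my renderer line-for-line:

```rust
use cgmath::Matrix4;

/// Viewport transform: map NDC x/y in [-1, 1] to pixel coordinates for a
/// viewport at (x, y) with dimensions w x h. Arguments to Matrix4::new are
/// column-major, so the translation lives in the last four values.
fn viewport(x: f32, y: f32, w: f32, h: f32) -> Matrix4<f32> {
    Matrix4::new(
        w / 2.0, 0.0, 0.0, 0.0,
        0.0, h / 2.0, 0.0, 0.0,
        0.0, 0.0, 1.0, 0.0,
        x + w / 2.0, y + h / 2.0, 0.0, 1.0,
    )
}
```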
And that's it! At this point, it doesn't even look like it needs any per-pixel lighting. But I'm fairly certain that better lighting will be noticeable. That also means better lighting will affect the dithering. As luck would have it, the next lesson in the tutorial covers Gouraud Shading! I don't expect any surprises. After this, the ugly dark spots on the neck should be hidden by linear gradients. :)
So far, it's looking great. Everything is coming together with exceptional results. It looks identical to the screenshots in the tutorial, except that I'm doing low-resolution black-and-white renderings!
After the lighting improvements, the next item on my TODO list is making my gradient functions more flexible, so I can start dithering the frame buffer. 😁