I was surprised by how easy it was to interpolate over the texture coordinates, given the barycentric coordinate space. I have more boilerplate code to convert the mesh vertices into cgmath vectors than there is code to interpolate the triangles! I'll refactor it all away after the renderer's features begin to stabilize. With a little gamma and luminance love, I now have nearest-neighbor texture mapping:
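As a sketch of what that interpolation plus a nearest-neighbor lookup boils down to (the names and array-based types here are illustrative, not my actual cgmath-based code):

```rust
/// Interpolate a per-vertex attribute (here, a UV pair) with
/// barycentric weights; the weights are assumed to sum to 1.
fn interp_uv(bary: [f32; 3], uv: [[f32; 2]; 3]) -> [f32; 2] {
    [
        bary[0] * uv[0][0] + bary[1] * uv[1][0] + bary[2] * uv[2][0],
        bary[0] * uv[0][1] + bary[1] * uv[1][1] + bary[2] * uv[2][1],
    ]
}

/// Nearest-neighbor sample: scale the normalized UV into texel space,
/// truncate, and clamp to the texture edge.
fn sample_nearest(tex: &[u8], width: usize, height: usize, uv: [f32; 2]) -> u8 {
    let x = ((uv[0] * width as f32) as usize).min(width - 1);
    let y = ((uv[1] * height as f32) as usize).min(height - 1);
    tex[y * width + x]
}

fn main() {
    // A tiny 2x2 grayscale "texture" standing in for the real one.
    let tex = [0u8, 85, 170, 255];
    // A point inside a triangle whose UVs sit at three corners.
    let uv = interp_uv([0.25, 0.25, 0.5], [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]);
    println!("uv = {:?}, texel = {}", uv, sample_nearest(&tex, 2, 2, uv));
}
```

The interpolation really is just a weighted sum, which is why it ends up shorter than the vertex-conversion boilerplate.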
You'll also notice I haven't removed the gradients yet. They'll remind me to wrap up the post-processing later on. Unfortunately, the evenly spaced gradient stops that I've been using so far are TERRIBLE for this particular frame buffer. The pixels are all too dark, so they fall into the lower half of the spectrum, where there are only two shades: black and mid-gray. I had always planned to make the gradient code flexible enough to work with stops placed anywhere in the normalized range; I just haven't tried it yet. This kind of image is exactly why that flexibility is needed: pushing the stops closer to black will give much better dithering results.
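Here's roughly what arbitrarily placed stops could look like; `gradient_shade` is a hypothetical name, and the stepped (non-interpolating) lookup is just one possible design for a small fixed palette:

```rust
/// Map a normalized luminance t in [0, 1] to a shade by picking the
/// last stop whose position is <= t. Stops are (position, shade)
/// pairs, assumed sorted by position with the first at 0.0.
fn gradient_shade(stops: &[(f32, u8)], t: f32) -> u8 {
    let mut shade = stops[0].1;
    for &(pos, s) in stops {
        if t >= pos {
            shade = s;
        }
    }
    shade
}

fn main() {
    // Evenly spaced stops: everything below 0.25 collapses to black.
    let even = [(0.0, 0u8), (0.25, 85), (0.5, 170), (0.75, 255)];
    // Stops pushed toward black: dark pixels keep distinct shades.
    let skewed = [(0.0, 0u8), (0.1, 85), (0.25, 170), (0.5, 255)];
    println!("{} vs {}", gradient_shade(&even, 0.2), gradient_shade(&skewed, 0.2));
}
```

With a dark pixel like t = 0.2, the evenly spaced stops return black while the skewed stops return a distinct dark gray, which is exactly the extra resolution the dithering needs.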
It's also worth mentioning that the texture is a massive 1024x1024. That's 16x larger than my frame buffer! Even so, I don't do any fancy texture filtering; nearest neighbor is perfect for my use case.
The next thing I did was implement viewport and projection matrices. There's not a whole lot to talk about though. So here's a picture:
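For the record, a viewport matrix along these lines maps NDC coordinates in [-1, 1] onto a pixel region (the row-major layout and helper names here are my own illustration, not the post's cgmath code):

```rust
/// Row-major 4x4 viewport matrix: maps NDC [-1, 1] onto a w x h pixel
/// region with its corner at (x, y), and depth into [0, d].
fn viewport(x: f32, y: f32, w: f32, h: f32, d: f32) -> [[f32; 4]; 4] {
    [
        [w / 2.0, 0.0, 0.0, x + w / 2.0],
        [0.0, h / 2.0, 0.0, y + h / 2.0],
        [0.0, 0.0, d / 2.0, d / 2.0],
        [0.0, 0.0, 0.0, 1.0],
    ]
}

/// Multiply a row-major matrix by a column vector.
fn transform(m: &[[f32; 4]; 4], v: [f32; 4]) -> [f32; 4] {
    let mut out = [0.0; 4];
    for (i, row) in m.iter().enumerate() {
        out[i] = row[0] * v[0] + row[1] * v[1] + row[2] * v[2] + row[3] * v[3];
    }
    out
}

fn main() {
    // NDC corner (1, 1, 0) should land at the far pixel corner of a
    // 160x120 viewport, with depth mapped to the middle of [0, 255].
    let vp = viewport(0.0, 0.0, 160.0, 120.0, 255.0);
    println!("{:?}", transform(&vp, [1.0, 1.0, 0.0, 1.0]));
}
```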
The next tutorial covers Gouraud shading! I don't expect any surprises. After this, the ugly dark spots on the neck should be hidden by linear gradients. :)
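Part of why I don't expect surprises: the core of Gouraud shading is just per-vertex Lambertian intensities interpolated with the same barycentric weights I'm already using for UVs. A sketch with illustrative names (not the tutorial's code):

```rust
/// Lambertian intensity at one vertex: normal dot light direction,
/// clamped so back-facing vertices go dark rather than negative.
fn vertex_intensity(normal: [f32; 3], light: [f32; 3]) -> f32 {
    (normal[0] * light[0] + normal[1] * light[1] + normal[2] * light[2]).max(0.0)
}

/// Gouraud shading: interpolate the three vertex intensities across
/// the triangle with barycentric weights.
fn gouraud_intensity(bary: [f32; 3], i: [f32; 3]) -> f32 {
    bary[0] * i[0] + bary[1] * i[1] + bary[2] * i[2]
}

fn main() {
    let light = [0.0, 0.0, 1.0];
    // One vertex faces the light, one is edge-on, one faces away.
    let i = [
        vertex_intensity([0.0, 0.0, 1.0], light),
        vertex_intensity([1.0, 0.0, 0.0], light),
        vertex_intensity([0.0, 0.0, -1.0], light),
    ];
    println!("{}", gouraud_intensity([0.5, 0.25, 0.25], i));
}
```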
So far, it's looking great. Everything is coming together nicely. My output matches the screenshots in the tutorial, except that mine are low-resolution black-and-white renderings!
After the lighting improvements, the next item on my TODO list is making my gradient functions more flexible, so I can start dithering the frame buffer. 😁