Monday, October 9, 2017

Tiny MCU 3D Renderer Part 8: Programmable Pipeline and Asset Pipeline

Oh hey, look at that! I changed the CSS on my blog a smidge. It's worth mentioning, anyway. Say hello to the Blipjoy Invader! You can just make it out on the left side, there. (Depending on your screen resolution, it might be behind the post content, whoops!)

I also finalized the programmable pipeline on the 3D renderer, thanks to @vitalyd over at the Rust users forum for the hint that pushed me in the right direction. The API isn't exactly what I had in mind, but it's certainly reasonable.

This is what I built yesterday. The model on the left is drawn with our very familiar Gouraud shader with interleave dithering on a four-color gradient. This is what we've seen exclusively in screenshots to this point. On the right is something new! A much simplified shader that renders something like a cartoon, aka cel shading minus edge detection. The two models rotate in opposite directions for funsies.

Code from the last article will be referenced below.

Let's write a shader!

let projection = perspective(Deg(60.0_f32), SAR, 1.0, 100.0);
let light_dir = Vector3::new(-1.0, 1.0, 1.0).normalize();

let mut gouraud_shader = {
    let path = Path::new("./data/african_head_diffuse.tga");
    let gradient = Gradient::new(
        Step::new(0.0 / 255.0, 0x0d),
        Step::new(32.0 / 255.0, 0x07),
        Step::new(71.0 / 255.0, 0x17),
        Step::new(160.0 / 255.0, 0x27),
    );
    let diffuse = load_texture(path, gradient);

    let uniform = gouraud::Uniform::new(projection.clone(), light_dir.clone(), diffuse);

    ShaderProgram::new(uniform, gouraud::Vertex::new())
};

The magic load_texture function is a utility that wraps the awesome imagefmt crate. It returns a Texture struct, as mentioned briefly in the last article. Here's the structure definition:

pub struct Texture {
    pub width: f32,
    pub height: f32,
    pub texels: Vec<u8>,
    pub gradient: Gradient,
}

Pretty straightforward. The surprising thing here is probably the gradient. The shader uses that to reduce the color depth. In this example, the color depth for the texture is reduced to 4 colors.
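Neither load_texture nor the Gradient internals have been shown in these posts, so here's a rough, dither-free sketch of the idea. The imagefmt::read call is real, but the luminance-only loading, the steps field, and the lookup method are assumptions for illustration only:

use std::path::Path;

// Sketch: read the image as single-channel luminance and bundle it with the
// gradient. Error handling is reduced to a panic for brevity.
pub fn load_texture(path: &Path, gradient: Gradient) -> Texture {
    let image = imagefmt::read(path, imagefmt::ColFmt::Y)
        .expect("unable to load texture");

    Texture {
        width: image.w as f32,
        height: image.h as f32,
        texels: image.buf,
        gradient: gradient,
    }
}

impl Gradient {
    // Sketch: map an intensity in 0.0..=1.0 to the palette index of the
    // highest step at or below it. The real shader adds interleave dithering.
    pub fn lookup(&self, value: f32) -> u8 {
        self.steps
            .iter()
            .rev()
            .find(|step| value >= step.threshold)
            .map(|step| step.index)
            .unwrap_or(self.steps[0].index)
    }
}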

The perspective projection matrix and light direction vector are cloned in the above code because it's just easier to write structs that own the data in their fields. These objects will be cloned into every shader instance created. This one, for example:

let mut toon_shader = {
    let uniform = toon::Uniform::new(projection.clone(), light_dir.clone());

    ShaderProgram::new(uniform, toon::Vertex::new())
};

The toon shader is much simpler! No Texture is necessary, so we don't even need to create a gradient. You saw the implementation of the Gouraud shader in the last article. Now let's look at the new toon shader, starting with the structs:

pub struct Attribute {
    position: Vector4<f32>,
    _uv: Vector2<f32>,
    normal: Vector3<f32>,
}

pub struct Uniform {
    pub projection: Matrix4<f32>,
    pub modelview: Matrix4<f32>,
    pub normal_matrix: Matrix3<f32>,
    pub light_dir: Vector3<f32>,
}

pub struct Vertex {}

pub struct Fragment {
    intensity: f32,
}

No surprises there. The uv attribute is unused, so it's prefixed with an underscore. We'll move on to the meaty implementations:

impl VertexShader for Vertex {
    type Attribute = Attribute;
    type Uniform = Uniform;
    type Fragment = Fragment;

    fn shader(
        &self,
        attr: &Self::Attribute,
        uniform: &Self::Uniform,
    ) -> (Vector4<f32>, Self::Fragment) {
        // The transformed position
        let position = uniform.projection * uniform.modelview * attr.position;

        // Compute the diffuse light intensity and pass it to the fragment
        // shader as a varying
        let intensity = attr.normal
            .dot(uniform.normal_matrix * uniform.light_dir)
            .max(0.0);
        let varyings = Fragment::new(intensity);

        (position, varyings)
    }
}

impl FragmentShader for Fragment {
    type Uniform = Uniform;

    fn shader(&self, _: Vector3<f32>, _: &Self::Uniform) -> (u8, bool) {
        // Quantize the interpolated light intensity into four bands, from
        // darkest (0x0d) to brightest (0x31)
        let mut color: u8 = 0x0d;

        if self.intensity > 0.85 {
            color = 0x31;
        } else if self.intensity > 0.6 {
            color = 0x21;
        } else if self.intensity > 0.3 {
            color = 0x11;
        }

        (color, false)
    }
}

The vertex shader is practically the same as it was for Gouraud, but the fragment shader is quite different. It doesn't need the position or uniform inputs, so those arguments are bound to the underscore pattern to tell the compiler they are intentionally unused. The fragment shader does pretty much the simplest form of color quantization possible. In this case, it also reduces the color depth to 4 colors.

Optimization note: premultiplying projection and modelview is a good way to reduce the total number of compute operations per vertex. The normal_matrix and light_dir can also be premultiplied. These optimizations are not recommended on a GPU, but they provide a nice win on a CPU.
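For illustration, here's roughly what that might look like with the per-frame results stashed on the uniform; the mvp and model_light fields are hypothetical, since the Uniform structs above don't carry them:

// Sketch only: fold the per-vertex matrix math into per-frame work.
// mvp and model_light are hypothetical fields on the uniform struct.
uniform.mvp = uniform.projection * uniform.modelview;
uniform.model_light = uniform.normal_matrix * uniform.light_dir;

// The vertex shader body then shrinks to one matrix multiply and a dot product:
// let position = uniform.mvp * attr.position;
// let intensity = attr.normal.dot(uniform.model_light).max(0.0);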

Ok, there's just one more thing to do, now that we have instances of our two shaders. Let's draw something!

gl.clear(&ClearMask::ALL);

{
    // Update rotation
    angle += 0.004;
    rotation.s = angle.cos();
    rotation.v.y = angle.sin();

    // Animate the camera
    gouraud_shader.uniform.modelview =
        shift_left * Matrix4::look_at(rotation.rotate_point(eye), center, up);

    // Compute gl_NormalMatrix for correcting normals used in the lighting model
    gouraud_shader.uniform.normal_matrix = Matrix3::from_cols(
        gouraud_shader.uniform.modelview.x.truncate(),
        gouraud_shader.uniform.modelview.y.truncate(),
        gouraud_shader.uniform.modelview.z.truncate(),
    ).transpose();

    // Draw something interesting to the frame buffer
    gl.draw_arrays(&mut gouraud_shader);
}

Nice! In this snippet, gl is an instance of the renderer.

Start by clearing the frame buffer and depth buffer. gouraud_shader was bound as mutable so that we can update its owned uniforms each frame. This is really simple: just increase the rotation angle and apply it to a Quaternion. Then update the modelview matrix using the camera look_at method provided by cgmath. (I can't give enough shoutouts to this crate!) The final step is updating the normal matrix from the modelview and asking the renderer to draw the scene!

Ok, let's draw the next model with the toon shader.

{
    // Update rotation
    let angle = -angle;
    rotation.s = angle.cos();
    rotation.v.y = angle.sin();

    // Animate the camera
    toon_shader.uniform.modelview =
        shift_right * Matrix4::look_at(rotation.rotate_point(eye), center, up);

    // Compute gl_NormalMatrix for correcting normals used in the lighting model
    toon_shader.uniform.normal_matrix = Matrix3::from_cols(
        toon_shader.uniform.modelview.x.truncate(),
        toon_shader.uniform.modelview.y.truncate(),
        toon_shader.uniform.modelview.z.truncate(),
    ).transpose();

    // Draw something interesting to the frame buffer
    gl.draw_arrays(&mut toon_shader);
}

These snippets are both wrapped in scope blocks on purpose. It keeps things a little cleaner, since I can shadow the angle binding inside the toon shader's block without negating the original binding for the next loop iteration. Otherwise it does exactly the same thing. I could make these DRY by writing a trait that updates the modelview and normal matrices on a uniform. It could be called LookAt ;) But for a first iteration, I'm ok with WET code.
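For the curious, here's a sketch of what that LookAt trait might look like; none of this exists in the code yet, and the method signature is just a guess (assuming the same cgmath imports as the rest of the renderer):

// Hypothetical trait for updating camera-dependent uniforms in one place
trait LookAt {
    fn look_at(&mut self, shift: Matrix4<f32>, eye: Point3<f32>, center: Point3<f32>, up: Vector3<f32>);
}

impl LookAt for toon::Uniform {
    fn look_at(&mut self, shift: Matrix4<f32>, eye: Point3<f32>, center: Point3<f32>, up: Vector3<f32>) {
        // Animate the camera
        self.modelview = shift * Matrix4::look_at(eye, center, up);

        // Recompute gl_NormalMatrix from the new modelview
        self.normal_matrix = Matrix3::from_cols(
            self.modelview.x.truncate(),
            self.modelview.y.truncate(),
            self.modelview.z.truncate(),
        ).transpose();
    }
}

With an identical impl for gouraud::Uniform, each block above would collapse to a single call like toon_shader.uniform.look_at(shift_right, rotation.rotate_point(eye), center, up).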

That about wraps it up for the programmable pipeline! As you can see, it's pretty easy to write and use as many shaders as needed. This will come in very handy in the near future...

But wait, what are these magic numbers for colors?

You might have noticed some peculiar numbers, like 0x0d and 0x31, being used where you would expect colors. Those are palette indices! It's easier to use raw u8 primitives than it would be to try naming all of them with an enum. There are a lot of greens in this palette, and only one color that comes anywhere close to orange (without actually being orange). Naming them all sounds like a nightmare.

How about that asset pipeline?

Glad you asked! My asset pipeline is also starting to come together. Today I wrapped up the Blender add-on that exports scenes to CBOR files. I still don't like the Blender Python API, but I am starting to understand some of the choices the designers made. And yet, other choices are a complete mystery. The organization of the bpy.types module is truly frightening; what were they thinking?!

Well, whatever. I managed to write an export add-on using various tutorials around the internet, since their API documentation is impossible to navigate. CBOR was extremely easy to write to the filesystem thanks to cbor_py, which I shamefully copied directly into my add-on module (after annotating the file with its license). Loading the CBOR file in Rust was equally easy because the fine folks working on serde and serde_cbor did all the hard work!

I wrote a handful of structs that represent the structure of the scene in CBOR, and also implemented a serde remote derive for the cgmath Vectors, so I can feed them straight from CBOR deserialization into the renderer! Deserialization is done in just two lines of code, and a handful of annotated structs. AWESOME!
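For reference, the two lines in question are essentially just opening the file and handing it to serde_cbor. Here's a sketch with hypothetical Scene and Object structs; the real definitions mirror my exporter's output (and use the cgmath types via the remote derive), which I haven't shown here:

extern crate serde_cbor;
#[macro_use]
extern crate serde_derive;

use std::error::Error;
use std::fs::File;

// Hypothetical stand-ins for the real scene structs
#[derive(Debug, Deserialize)]
struct Object {
    name: String,
    vertices: Vec<[f32; 3]>,
}

#[derive(Debug, Deserialize)]
struct Scene {
    objects: Vec<Object>,
}

fn load_scene(path: &str) -> Result<Scene, Box<Error>> {
    // Deserialization really is about two lines: open the file, parse the CBOR
    let file = File::open(path)?;
    Ok(serde_cbor::from_reader(file)?)
}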

But that's about as far as I got: loading the scene from a CBOR file and printing it to the terminal. I'm not doing anything beyond that yet. I just had to stop and write this article for the week!

Next steps

Next week, I should have a scene renderer to write about. It will build upon the topics discussed so far: iterating over the Scene struct and drawing each object with the appropriate shader. I also get to port my GLSL "sunbeam" shader mentioned way back in the Progress Report post from last month. Following that, I'll also port the water shader. This is where things get really exciting! The renderer tech is almost entirely done now. It's all going to be about content from here on out.

Onward to LoFi 3D gaming!
