2022 November 2

CS 311: Hardware Triangle Rasterization

The second part of CS 311 deals with the same triangle rasterization algorithm as in the first part. However, instead of implementing it ourselves, we learn how to use Vulkan to run the algorithm on the graphics processing unit (GPU). Then we layer new techniques atop the basic algorithm.

You work with your newly assigned partner. Except where otherwise noted, each homework assignment is due at the start of the next class meeting. Remember our late policy: If you don't have an assignment finished on time, then don't panic, but do e-mail me about it, so that we can formulate a plan for getting back on track.

Day 14

Today we learn some big concepts about GPUs.

OpenGL History

What does it mean, to shift work from the CPU to the GPU? Why and how do contemporary graphics libraries give programmers fine control over the GPU — and require them to exercise it? To understand these issues, it is instructive to survey some of the history of OpenGL.

Depending on your computer's set-up, you might not be able to compile all of these examples (especially the last one). That's okay. They produce pretty dull imagery. The important thing is to read the code.

370mainStrings.c: Compile, execute, and skim. This tutorial has nothing to do with graphics. It illustrates the general idea that programmer-friendly, high-level code often executes more slowly than low-level code does.

380mainGLFW.c: Skim. Focus on the render function and the call to glfwSwapBuffers.

390mainOpenGL14.c: Skim, focusing on render.

400mainOpenGL15.c: Skim, focusing on render. What is the big change from OpenGL 1.4 to OpenGL 1.5, and why?

410shading.c: Skim. This file offers helper functions for making shader programs. Most of the code is error checking.

410mainOpenGL20.c: Skim. What is the big change from OpenGL 1.5 to OpenGL 2.0?

420mainOpenGL20.c: Skim, focusing on the comments. The change from the previous tutorial is small, but it epitomizes my point about requiring the programmer to exert more control.

430mainOpenGL32.c: Skim. There are several changes. How do they illustrate my point?

(In retrospect, surveying OpenGL took the whole class period, which is fine. New project work is assigned on Day 16. Until then, there is no new project work. Spend time finishing your first project, preparing for our oral exam, enjoying your midterm break, etc.)

Day 15

There is no class today because of the oral exams. There is also no assigned project work.

Day 16

Now we start learning Vulkan. You are not expected to memorize specific function calls or structure definitions. You are expected to understand the high-level concepts and remember which files do what, so that you can look details up when you need them.

Vulkan Tutorial 1: Basics

This first tutorial provides the extreme rudiments of a Vulkan setup for graphics.

440gui.c: Skim. This file provides a simple window based on the GLFW toolkit. It doesn't do any Vulkan.

440vulkan.c: Study. This file provides key machinery: Vulkan instance, physical device, logical device, extensions, validation layers.

440mainVulkan.c: Study. This file should compile and run, showing a blank window. It should print responses to certain keyboard and mouse events. If you instead get errors, then try reconfiguring the constants according to the directions.

Vulkan Tutorial 2: Swap Chain

Unfortunately, we need a lot more machinery to get Vulkan going before we can run a proper animation loop. The key concept here is the swap chain: roughly speaking, the queue of raster images to be rendered and shown to the user.
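
If the phrase "swap chain" is mysterious, the following sketch shows the shape of one animation frame built around it. It is the generic Vulkan pattern, not the literal code in 450swap.c or 450mainSwap.c; all of the parameter names are placeholders for whatever those files actually use.

#include <stdint.h>
#include <vulkan/vulkan.h>

/* Sketch of one animation frame. Parameter names are placeholders. */
void presentOneFrame(
        VkDevice device, VkSwapchainKHR swapChain, VkQueue queue,
        VkSemaphore imageAvailable, VkSemaphore renderFinished,
        const VkCommandBuffer *commandBuffers) {
    uint32_t imageIndex;
    /* Ask the swap chain which of its images we may render into next. */
    vkAcquireNextImageKHR(
        device, swapChain, UINT64_MAX, imageAvailable, VK_NULL_HANDLE,
        &imageIndex);
    /* Submit the pre-recorded command buffer that renders into that image. */
    VkPipelineStageFlags waitStage = VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT;
    VkSubmitInfo submitInfo = {
        .sType = VK_STRUCTURE_TYPE_SUBMIT_INFO,
        .waitSemaphoreCount = 1, .pWaitSemaphores = &imageAvailable,
        .pWaitDstStageMask = &waitStage,
        .commandBufferCount = 1, .pCommandBuffers = &commandBuffers[imageIndex],
        .signalSemaphoreCount = 1, .pSignalSemaphores = &renderFinished};
    vkQueueSubmit(queue, 1, &submitInfo, VK_NULL_HANDLE);
    /* Hand the finished image back to the swap chain, to be shown on screen. */
    VkPresentInfoKHR presentInfo = {
        .sType = VK_STRUCTURE_TYPE_PRESENT_INFO_KHR,
        .waitSemaphoreCount = 1, .pWaitSemaphores = &renderFinished,
        .swapchainCount = 1, .pSwapchains = &swapChain,
        .pImageIndices = &imageIndex};
    vkQueuePresentKHR(queue, &presentInfo);
}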

450buffer.c: Study. This file lets you use chunks of memory on the GPU.

450image.c: Study. This file lets you use chunks of memory on the GPU that are optimized for holding raster images.

450swap.c: Study. This file provides the swap chain machinery.

450mainSwap.c: Study. This file should compile and run, showing a black window and throwing an error on each animation frame.

Vulkan Tutorial 3: Shaders and Meshes

This next program actually shows some imagery. Hooray. Also, we glimpse our first OpenGL Shading Language (GLSL) code.

460shader.vert: Study. This is a vertex shader written in GLSL. Even though I haven't taught you GLSL, can you guess what each line does?

460shader.frag: Study. This is a simple fragment shader written in GLSL.

460shader.c: Study. This file lets you build shader programs from compiled GLSL shaders.

460mesh.c: Study. This file lets you build meshes as vertex buffers and index (triangle) buffers.

460mainMeshes.c: Study. In the instructions at the top, notice that you must compile not only this C file but also the two shader files. The running application should produce a static image in bright pinks, greens, and cyans.

Exercise A: Meshes

The work that we do now is the first work to be handed in for this second course project. The goal is to package meshes better, so that we can write larger applications more easily. Along the way, we re-introduce some of our old graphics engine.

We do everything in single-precision floats rather than double-precision doubles, because a little extra precision is not worth doubling the memory required. Similarly, we use 16-bit uint16_ts for our triangle indices, even though they limit our meshes to 65,536 vertices each.
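
To make the tradeoff concrete, here is a tiny stand-alone C program (not part of the project) that prints the sizes involved, assuming a vertex carries eight attribute numbers, like the XYZ, ST, NOP layout that we use later.

#include <stdint.h>
#include <stdio.h>

int main(void) {
    float singleVertex[8];     /* 8 * 4 = 32 bytes per vertex */
    double doubleVertex[8];    /* 8 * 8 = 64 bytes per vertex */
    uint16_t triangle[3];      /* 3 * 2 = 6 bytes per triangle */
    printf("float vertex: %zu bytes, double vertex: %zu bytes, triangle: %zu bytes\n",
        sizeof(singleVertex), sizeof(doubleVertex), sizeof(triangle));
    /* A uint16_t can index at most 65,536 vertices (indices 0 through 65,535). */
    return 0;
}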

470vector.c: Make a copy of 250vector.c. Replace all doubles with floats. (I was able to accomplish this task using a simple search-and-replace.)

470mesh.c: Download. Starting with a copy of 350mesh.c, I have replaced all doubles with floats. I've also adjusted a bunch of ints to uint16_ts. Finally, I've deleted meshRender and its helpers. I'll never forget you, meshRender!

470mesh2D.c: In a copy of 190mesh2D.c, replace all doubles with floats.

470mesh3D.c: Download. Starting with a copy of 250mesh3D.c, I have replaced all doubles with floats, and I've replaced several ints with uint16_ts.

470vesh.c: Study. "Vesh" is my personal abbreviation for "Vulkan mesh". In the public section of this file, there are two methods for you to implement. I recommend that you implement them by copying a bunch of functions from 460mesh.c into the private section of this file, renaming them to start "vesh...", and then calling them from the methods. In the end, this file should not rely on 460mesh.c, because that file is not part of our engine. Also, all identifiers in this file should start "vesh...".

470shader.vert: Make a copy of 460shader.vert. In the near future, at least, we use meshes built by the functions in 470mesh3D.c. So they have a certain attribute structure: XYZ, ST, NOP. Update the vertex shader accordingly. Make the varying color in any reasonable way that you like. Compile this new shader to 470vert.spv.

470mainMeshes.c: Make a copy of 460mainMeshes.c. Make sure that 460mesh.c is not included! Adjust the rest of the code to initialize, render, and finalize two decent meshes, as follows. Include the five "470....c" files. In initializeArtwork, make sure that 470vert.spv is loaded. There are six global variables and one global constant pertaining to mesh style. Replace them with a single veshStyle global variable. In initializeArtwork, replace the call to meshGetStyle with a call to veshInitializeStyle. Don't forget to finalize the style in finalizeArtwork. Where initializePipeline is called, you need to pass it certain members of the style. Remove all 16 global variables pertaining to meshes A and B. Replace them with two global variables holding veshVeshes. In initializeArtwork, initialize them from two temporary 3D meshMeshes of your choosing. Don't forget to finalize them in finalizeArtwork. Elsewhere in this file, there is a chunk of code that renders the two meshes. Carefully replace that chunk with two calls to veshRender. Then you should be done. Here's a screenshot of my version, which uses a box and a capsule:

470mainMeshes screenshot

You have five files to hand in: 470vector.c, 470mesh2D.c, 470vesh.c, 470shader.vert, and 470mainMeshes.c. Make sure that each C file credits both partners in a comment at the top. (In some C files, the old partners should be credited too!) Make sure that each file is working, clean, and commented appropriately. Make sure that both partners have a copy of each file. Then, just one partner submits the files to their hand-in folder on the COURSES file server.

Day 17

Vulkan Tutorial 4: Scene-Wide Uniforms

This tutorial shows how to declare and set uniforms that have a single value across the entire scene. Practical examples might include time, the camera, and lights (which we study later). An important concept here is descriptors, which specify how the uniforms connect to the shader program.

480shader.vert: Study. The camera matrix is now being passed into the shaders as a uniform.

480shader.frag: Study. We also pass a color into the shaders, just as another example.

480uniform.c: Study. This file helps us allocate the memory needed to pass data into shaders through uniforms.

480description.c: Study. This file helps us communicate the structure of the uniforms to the shaders.

480mainUniforms.c: Study. Don't forget to compile the two shader files. The running application should produce a rotating version of the previous tutorial's imagery. In setSceneUniforms, change the uniform color, to check that it produces the correct effect in the fragment shader.

Exercise B: Camera

The preceding tutorial uses a pretty crummy camera. Let's replace it with the camera machinery from our first course project. (We leave the meshes unimproved this time.)

490matrix.c: In a copy of 280matrix.c, replace all doubles with floats.

490isometry.c: In a copy of 300isometry.c, replace all doubles with floats.

490camera.c: In a copy of 300camera.c, replace all doubles with floats. Near the top of the file, declare the following constant matrix. It is needed to make our projections match Vulkan's conventions, as follows. The function camGetOrthographic should output camVulkan times what it used to output (see the sketch after the matrix below). Delete the comment just before that function, because it is no longer accurate. Delete camGetInverseOrthographic, because it is no longer accurate and we don't need it. Then repeat these steps for the perspective projection. Check that camGetProjectionInverseIsometry is using these new projection matrices.

const float camVulkan[4][4] = {
    {1.0, 0.0, 0.0, 0.0}, 
    {0.0, -1.0, 0.0, 0.0}, 
    {0.0, 0.0, 0.5, 0.5}, 
    {0.0, 0.0, 0.0, 1.0}};
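
Here is a self-contained sketch of the "camVulkan times what it used to output" step, in case the matrix order is unclear. The function name and signature are made up for illustration; your 490matrix.c may already provide a suitable product function.

/* Sketch only: overwrite proj with camVulkan * proj. */
void camFixProjectionForVulkan(float proj[4][4]) {
    float old[4][4], sum;
    int i, j, k;
    for (i = 0; i < 4; i += 1)
        for (j = 0; j < 4; j += 1)
            old[i][j] = proj[i][j];
    for (i = 0; i < 4; i += 1)
        for (j = 0; j < 4; j += 1) {
            sum = 0.0f;
            for (k = 0; k < 4; k += 1)
                sum += camVulkan[i][k] * old[k][j];
            proj[i][j] = sum;
        }
}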

490mainCamera.c: In a copy of 480mainUniforms.c, #include 470vector.c and the three 490 files above. Declare global variables for the camera's ρ (rho), φ (phi), and θ (theta), to be used in camLookAt and camSetFrustum. (Don't forget that camSetFrustum's focal length should always match camLookAt's ρ.) Configure a camCamera as part of the artwork, using camSetProjectionType and those two functions. In setSceneUniforms, set the uniform matrix using the camera and mat44Transpose (and delete lots of code that's no longer needed). Test.

490mainCamera.c: Add a keyboard handler that lets the user increase/decrease θ with L/J and lets them switch projection type with P. Test.

490mainCamera.c: Have you tried resizing the window? Try making it really tall and narrow or short and wide. The image should appear distorted, because our projection is not matched to the window size. We need to fix that. When the user resizes the window, reinitializeSwapChain is called, and inside that function swapInitialize is called. At any time after that swapInitialize — either in reinitializeSwapChain or during the next frame's rendering — you can call camSetFrustum on the camera, passing swap.extent.width and swap.extent.height for the width and height. Implement this idea now. The exact details depend on how and where you update the camera. When it's working correctly, the window size may affect the size of the rendered meshes, but it should not affect their shape. They should not be distorted.

You have four files to hand in: 490matrix.c, 490isometry.c, 490camera.c, and 490mainCamera.c. Make sure that they are credited, working, clean, and commented. Make sure that both partners have copies. Then one partner submits them to COURSES.

Day 18

Vulkan Tutorial 5: Body-Specific Uniforms

Let's use the term "body" to mean "object in the scene". A scene might contain many bodies: a landscape, a tree, a bird, etc. Let's assume that all of the bodies have the same kinds of uniforms; for example, each body might be positioned and oriented using a 4x4 modeling matrix. However, they usually don't have the same values for those uniforms; for example, they don't all use the same modeling matrix. This tutorial shows how to declare uniforms that can be set with body-specific values.
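
One detail worth noticing in the tutorial is how all of the bodies' uniforms can live in a single buffer. A common approach, sketched below, is to give each body a slice whose offset is a multiple of the device's minUniformBufferOffsetAlignment. This is only a sketch of that idea, not necessarily the tutorial's exact bookkeeping.

#include <vulkan/vulkan.h>

/* Sketch: compute the per-body stride in a shared uniform buffer. Body i
   would then live at offset i * alignedBodyStride(...). */
VkDeviceSize alignedBodyStride(
        VkPhysicalDevice physical, VkDeviceSize bodyUniformSize) {
    VkPhysicalDeviceProperties props;
    vkGetPhysicalDeviceProperties(physical, &props);
    VkDeviceSize align = props.limits.minUniformBufferOffsetAlignment;
    /* Round the struct size up to the nearest multiple of the alignment. */
    return (bodyUniformSize + align - 1) / align * align;
}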

500shader.vert: Study. The modeling matrix is now passed into the vertex shader as a uniform.

500mainUniforms.c: Study. If everything is working, then one of the two bodies rotates, the other does not, and the camera revolves around them.

Exercise C: Oscillation

This is a short exercise. It just checks that you understand a key part of the preceding tutorial: how the modeling transformations get set. (We leave the meshes and camera unimproved this time.)

510mainOscillation.c: Make a copy of 500mainUniforms.c. Introduce two isoIsometrys, and use them to set the two bodies' modeling transformations. Neither body should rotate. Instead, the two bodies should translate back and forth, toward each other and away from each other. The camera should revolve around them as always; don't edit that, please.
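
Here is one possible shape for the oscillation, sketched in plain C. It assumes that 490isometry.c offers a translation setter named isoSetTranslation and that the two isometries are globals named isoA and isoB; adjust those names to your actual code.

#include <math.h>

/* Sketch only: slide the two bodies along the x-axis, toward and away
   from each other, as a function of time. */
void setOscillatingIsometries(double time) {
    float offset = 1.5f + 0.5f * sinf((float)time);
    float transA[3] = {offset, 0.0f, 0.0f};
    float transB[3] = {-offset, 0.0f, 0.0f};
    isoSetTranslation(&isoA, transA);
    isoSetTranslation(&isoB, transB);
}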

In the usual way, make sure that 510mainOscillation.c is credited, working, clean, commented, and shared with both partners. One partner submits it to COURSES.

Day 19

Vulkan Tutorial 6: Textures

In this final Vulkan tutorial, we introduce texture mapping. Technically, textures are a kind of uniform. In some ways they are treated like other uniforms; for example, they require descriptors. In other ways they are different from other uniforms; for example, they don't live in uniform buffer objects, and the process of setting them scene-wide or per-body is a bit surprising.
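
For reference, here is the standard Vulkan descriptor binding for a texture-plus-sampler, which is the "different from other uniforms" part. The binding number and count below are placeholders, not necessarily what 480description.c and 520mainTextures.c use.

#include <vulkan/vulkan.h>

/* Sketch: how a texture-plus-sampler appears in a descriptor set layout. */
VkDescriptorSetLayoutBinding samplerBinding = {
    .binding = 1,            /* placeholder binding number */
    .descriptorCount = 3,    /* e.g. an array of three textures */
    .descriptorType = VK_DESCRIPTOR_TYPE_COMBINED_IMAGE_SAMPLER,
    .stageFlags = VK_SHADER_STAGE_FRAGMENT_BIT,
    .pImmutableSamplers = NULL};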

reddish.png, bluish.png, grayish.png: Download these three textures.

520shader.vert: Study. We share the body uniforms between the vertex shader and the fragment shader, so we have to change the vertex shader, even though only the fragment shader benefits from the new body uniforms.

520shader.frag: Study. The fragment shader has access to all three textures in the scene. It samples from two of them as chosen by the body uniforms.

520texture.c: Study. This file provides two kinds of machinery: textures, and the samplers that sample from them.

520mainTextures.c: Study. The running application texture-maps each mesh with two textures.

By the way, our helper files total 2,182 lines of code (not including the exercises or the STB image library), and our latest main.c is 962 lines (not including the shaders).

Optional: Study Question

In 480description.c, we have machinery for conveniently adding as many descriptors as we want. But why would we ever add more descriptors than we already have? Based on what you've learned in the tutorials, what's the maximum number of kinds of descriptors that you can imagine wanting? (Hint: Something is "wrong" in Vulkan Tutorial 6 above.)

Baseline

Now that we're done with tutorials, all of our work goes into building a useful graphics engine atop Vulkan. Let's start with a "baseline" program, from which we can all progress together.

530shader.vert: Study.

530shader.frag: Study. I've left some extra code in there, to give you more examples of how GLSL works. Once you have main.c running, try scaling the texture color by intensity instead of stripe. (By the way, this intensity exactly matches the one used in 340mainLandscape.c.)

530landscape.c: In a copy of 340landscape.c, replace all doubles with floats.

530mainBaseline.c: Study. The running program shows an abstract hero (a person? penguin? robot? Totoro?) walking across a landscape. Try the movement controls (W/S, A/D). Try the camera controls (J/L, I/K, U/O, P). If something's not working, then fix your code from the earlier exercises. Feel free to change the textures. With the default textures, the landscape looks Arctic:

530mainBaseline screenshot

530mainBaseline.c: By using the keyboard controls, adjust the camera in such a way that there is visible clipping at the near plane. Is the effect similar to what you achieved in 350mainClipping.c? (This is a short educational exercise, intended to solidify your understanding of clipping.)

Body

Now we start developing a simple scene graph. The first step is a body abstraction, which contains all of the data needed by a single body in the scene. We assume that these data include a modeling isometry. (Many users of our graphics engine will like this assumption and the resulting scene graph. Other users, who don't want a body isometry, will have to ignore this part of our graphics engine.)

540body.c: Study.

540mainBody.c: Start with a copy of 530mainBaseline.c. Raise the definition of BodyUniforms to the #include section, just before you include 540body.c. Just before initializeScene, declare three global variables to hold the bodies. In initializeScene, configure those bodies with all of the data that don't change frame-to-frame. Don't forget to set one texture index in each body. In setBodyUniforms, update whatever body data do change frame-to-frame, then call bodySetUniforms three times, then do the big memory copy to the GPU-side buffer. In initializeCommandBuffers, you should have three calls to bodyRender (and no veshRender or bodyUBOOffset). Test. The running program should produce exactly the same imagery as 530mainBaseline.c produces.

In the usual way, make sure that 530landscape.c and 540mainBody.c are credited, working, clean, commented, and shared with both partners. One partner submits them to COURSES.

Day 20

The body abstraction makes our code a little cleaner, which is nice. But it becomes much more useful when we allow bodies to connect to each other and form a scene graph. The crucial idea is that each body's modeling isometry orients and positions the body relative to the body's parent (if it has a parent). This idea helps us keep bodies visually connected, even as they rotate and translate in the world.

Small Scene Graph

Thus far, we have methods bodyRender and bodySetUniforms, which must agree on the index of each body in the rendering order. And now we want to make recursive versions of these functions that assign each body's index automatically. The simplest solution is to proceed by depth-first search — both in building the command buffer (once, when the scene is initialized) and in setting the uniforms (once per frame). Then the eldest root has index 0, the next node to be visited (either the eldest root's first child or its next sibling) has index 1, and so on.
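
If the indexing scheme is hard to picture, here is a sketch of depth-first index assignment in isolation. The field names index, firstChild, and nextSibling are guesses at what 550body.c calls them; the real recursive methods also record rendering commands or set uniforms as they visit each body.

/* Sketch only: assign depth-first indices starting at nextIndex; return
   the next unused index. */
int bodyAssignIndicesRecursively(bodyBody *body, int nextIndex) {
    if (body == NULL)
        return nextIndex;
    body->index = nextIndex;
    /* Depth-first: the body itself, then its whole subtree of children,
       then its next sibling (and that sibling's subtree). */
    nextIndex = bodyAssignIndicesRecursively(body->firstChild, nextIndex + 1);
    nextIndex = bodyAssignIndicesRecursively(body->nextSibling, nextIndex);
    return nextIndex;
}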

550body.c: Study. There are two un-implemented methods. Don't implement them yet. We proceed by baby steps, to avoid changing too much code at once.

550mainGraph.c: In a copy of 540mainBody.c, include 550body.c. Update the calls to bodyConfigure, passing NULL for the new arguments. Test.

550mainGraph.c: In initializeScene, make the three bodies siblings of each other. To match the indexing still being used in setBodyUniforms and initializeCommandBuffers, the eldest sibling should be the land and the youngest sibling should be the hero. Test.

550body.c: Implement the bodyRenderRecursively method. This method needs to handle the indexing delicately. To finish this step, it might help to look ahead to the next step...

550mainGraph.c: In initializeCommandBuffers, replace the three calls to bodyRender with one call to bodyRenderRecursively. Test.

550body.c: Implement the bodySetUniformsRecursively method. This method needs to handle the indexing delicately. To finish this step, it might help to look ahead to the next step...

550mainGraph.c: In setBodyUniforms, replace the three calls to bodySetUniforms with one call to bodySetUniformsRecursively. Test.

550mainGraph.c: Let's do one final test. In initializeScene, change the structure of the scene graph, so that the hero is the eldest sibling (followed by the land and water in either order). In setBodyUniforms and initializeCommandBuffers, start the recursions from the hero body. Test. The running program should still produce exactly the same imagery as 530mainBaseline.c and 540mainBody.c produce. The keyboard should still let us walk the hero across the landscape.

Larger Scene Graph

Thus far, our scene graph is so small that it doesn't even illustrate why scene graphs are useful. Nor does it thoroughly test our code. So let's design a larger, more interesting scene graph.

560body.c: In a copy of 550body.c, implement the following two methods. They're both helpful in assembling scene graphs. (Also, the first method helps you implement the second.)

/* Appends the given sibling to the body's list of siblings. */
void bodyAddSibling(bodyBody *body, bodyBody *sibling) {
    
}

/* Appends the given child to the body's list of children. */
void bodyAddChild(bodyBody *body, bodyBody *child) {
    
}
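
If you are stuck on where to begin, here is one possible shape for these methods. It assumes that bodyBody stores firstChild and nextSibling pointers; adjust the field names to match 550body.c.

/* Possible sketch only; field names are guesses. */
void bodyAddSibling(bodyBody *body, bodyBody *sibling) {
    bodyBody *last = body;
    /* Walk to the end of the sibling list, then append. */
    while (last->nextSibling != NULL)
        last = last->nextSibling;
    last->nextSibling = sibling;
}

void bodyAddChild(bodyBody *body, bodyBody *child) {
    if (body->firstChild == NULL)
        body->firstChild = child;
    else
        /* The first method helps here: append to the first child's siblings. */
        bodyAddSibling(body->firstChild, child);
}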

560mainGraph.c: Start with a copy of 550mainGraph.c. Enlarge the sub-graph rooted at the hero, to make a hero with more detail. Be creative. There are only two requirements. First, the hero needs to be asymmetric enough, and realistic enough, that a typical user can tell where its front is. (I want to be able to guess what direction it's going to walk, before it starts walking.) Second, the root node of the hero should have at least one child and at least one grandchild, and their modeling isometries should not be trivial. (The point is to test whether your relative modeling isometries are behaving correctly.)

Do you want more guidance? Here's what I did. I sketched my hero on paper. It was some kind of...duckdog? I made some new veshes, and tested whether the program still ran. I made bodies for the veshes and tested. I connected the bodies into the scene graph and tested. At this point I could finally see the new hero parts. Finally, I adjusted the vesh geometry and modeling isometries, tested, re-adjusted, tested, etc. Here it is:

560mainGraph screenshot

You have four files to hand in: 550body.c, 550mainGraph.c, 560body.c, and 560mainGraph.c. Make sure that they are credited, working, clean, commented, and shared with both partners. One partner submits them to COURSES.

Optional: Study Question

We now have an abstraction that helps us manage bodies in our scenes. What other abstractions would help us streamline our code? (Hint: Read main.c, find clumps of global Vulkan variables that are related to each other, and think about how they could be combined.)

Day 21

Today we implement Lambertian diffuse reflection from two light sources — one directional and one positional. We also add ambient light.

Optional: Artistic Light

If you want to check out the examples that I've shown in class, then follow these links. The point here is simply that artists have different goals for lighting, both realistic and unrealistic.

Judith and Her Maidservant by Artemisia Gentileschi, circa 1625

Still life with oysters, a rummer, a lemon and a silver bowl by Willem Claesz. Heda, 1634

An Experiment on a Bird in the Air Pump by Joseph Wright of Derby, 1768

Sudden Shower over Shin-Ohashi bridge and Atake by Hiroshige, 1857

Bal du moulin de la Galette by Pierre-Auguste Renoir, 1876

At the Cafe-Concert: The Song of the Dog by Edgar Degas, 1870s

The Third Man (excerpt) by Carol Reed, 1949

Diffuse Reflection with a Directional Light

570shader.vert: In a copy of 530shader.vert, remove the nop and xyzwWorld varyings. Add vectors for ulight and clight to the scene uniforms. Compute unormal and pass it through the varyings.

570shader.frag: In a copy of 530shader.frag, remove the nop and xyzwWorld varyings. Also remove intensity and stripe. Add ulight and clight to the scene uniforms. Receive dnormal through the varyings, and re-normalize it to unormal. Do the diffuse reflection calculation, using the sampled texture color for cdiffuse.
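
If it helps to see the diffuse calculation outside GLSL, here is a small sketch in plain C. The function name and signature are just for illustration; in the shader you would use dot, max, and the sampled texture color directly.

/* Sketch of Lambertian diffuse reflection. ulight and unormal are unit
   vectors; clight is the light color; cdiffuse is the sampled texel. */
void diffuseReflection(
        const float unormal[3], const float ulight[3], const float clight[3],
        const float cdiffuse[3], float rgb[3]) {
    float iDiff = unormal[0] * ulight[0] + unormal[1] * ulight[1] +
        unormal[2] * ulight[2];
    int i;
    if (iDiff < 0.0f)
        iDiff = 0.0f;
    for (i = 0; i < 3; i += 1)
        rgb[i] = iDiff * clight[i] * cdiffuse[i];
}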

570mainDiffuse.c: In a copy of 560mainGraph.c, edit the code to use 570vert.spv and 570frag.spv. (And make sure those shaders are compiled.) Add ulight and clight to the scene uniforms, and initialize them with sensible values. Test.

Diffuse Reflection with a Positional Light

580shader.vert: Make a copy of 570shader.vert. For the second light, add a plight and another clight to the scene uniforms. Compute ulight for this light, and pass it through the varyings.

580shader.frag: Make a copy of 570shader.frag. Add plight and clight to the scene uniforms. Receive dlight through the varyings, re-normalize it to ulight, and do another diffuse reflection calculation. Now you have diffuse reflections based on two lights; add them to get the final fragment color.

580mainDiffuse.c: In a copy of 570mainDiffuse.c, edit the code to use the new shaders. Add plight and clight to the scene uniforms. I want the light to be in a particular position: (0, 0, Z), where Z is landData[0] + 1 or waterData[0] + 1 — whichever is greater. So the light should be just above the surface near the origin. Give the light a color that is different from the first light's color. Test.

Ambient Light

590shader.vert: In a copy of 580shader.vert, add cambient to the scene uniforms.

590shader.frag: In a copy of 580shader.frag, add cambient to the scene uniforms. Add an ambient contribution to the final fragment color.

590mainDiffuse.c: In a copy of 580mainDiffuse.c, edit the code to use the new shaders. Add cambient to the scene uniforms, and initialize it to a sensible value. Test.

You have nine files to hand in: three 570 files, three 580 files, and three 590 files. In the usual way, clean them up and hand in one copy of each.

Optional: Study Question

In 530shader.frag, there was an intensity variable. What is the relationship between this variable and diffuse reflection?

Day 22

Today we implement Phong specular reflection. The code depends on whether our camera is perspective or orthographic. For simplicity, we handle only the perspective case. Then we implement either attenuation, spot light, or fog.

Specular Reflection with a Perspective Camera

600shader.vert: In a copy of 590shader.vert, add pcamera to the scene uniforms and cspecular to the body uniforms. Compute the unit vector ucamera pointing from the vertex toward the camera, and pass it through the varyings.

600shader.frag: In a copy of 590shader.frag, add pcamera to the scene uniforms and cspecular to the body uniforms. Receive dcamera through the varyings, and re-normalize it to ucamera. For both lights, implement specular reflection. Let's agree that the shininess is hard-coded to 64.0.
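
Here is the specular calculation sketched in plain C, again just for reference; the real code belongs in 600shader.frag and can use GLSL built-ins such as reflect, dot, and pow.

#include <math.h>

/* Sketch of Phong specular reflection. All input vectors are unit length.
   ulight points from the fragment toward the light; ucamera points from the
   fragment toward the camera. */
void specularReflection(
        const float unormal[3], const float ulight[3], const float ucamera[3],
        const float clight[3], const float cspecular[3], float rgb[3]) {
    float shininess = 64.0f, nDotL = 0.0f, rDotC = 0.0f, iSpec;
    float urefl[3];
    int i;
    for (i = 0; i < 3; i += 1)
        nDotL += unormal[i] * ulight[i];
    /* Reflect the light direction about the normal: r = 2 (n . l) n - l. */
    for (i = 0; i < 3; i += 1)
        urefl[i] = 2.0f * nDotL * unormal[i] - ulight[i];
    for (i = 0; i < 3; i += 1)
        rDotC += urefl[i] * ucamera[i];
    if (nDotL <= 0.0f || rDotC < 0.0f)
        rDotC = 0.0f;
    iSpec = powf(rDotC, shininess);
    for (i = 0; i < 3; i += 1)
        rgb[i] = iSpec * clight[i] * cspecular[i];
}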

600mainSpecular.c: In a copy of 590mainDiffuse.c, add pcamera to the scene uniforms and cspecular to the body uniforms. Initialize pcamera to match the world position of the camera. The water should be shiny, and the land should be matte. (How is this possible, if both have shininess 64?) Design the hero's specular reflection as you like. Test.

One More Thing

You have one more task for this project. You get to choose what it is, from the three options below: attenuation, spot light, or fog. They don't depend on each other. I have tried to make them approximately equal in labor. You don't get extra credit for doing more than one.

In the end, you have six files to hand in: three 600 files and three more files from below. In the usual way, clean them up and hand in one copy of each.

Attenuation

610shader.vert: Start with a copy of 600shader.vert. Add the attenuation coefficient to the scene uniforms.

610shader.frag: Start with a copy of 600shader.frag. Add the attenuation coefficient to the scene uniforms. Compute the squared distance from the positional light to the fragment. Attenuate the positional light's clight.
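
Here is one reasonable attenuation rule, sketched in plain C for reference. You are free to choose a different formula; the point is that the coefficient and the squared distance together scale down clight.

/* Sketch: pLight is the positional light, pFragment the fragment's world
   position, k the attenuation coefficient uniform. */
void attenuateLight(
        const float pLight[3], const float pFragment[3], float k,
        const float clight[3], float cAttenuated[3]) {
    float dSq = 0.0f, diff;
    int i;
    for (i = 0; i < 3; i += 1) {
        diff = pLight[i] - pFragment[i];
        dSq += diff * diff;
    }
    /* One common choice: divide by 1 + k d^2, so k = 0 means no attenuation. */
    for (i = 0; i < 3; i += 1)
        cAttenuated[i] = clight[i] / (1.0f + k * dSq);
}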

610mainAttenuation.c: Start with a copy of 600mainSpecular.c. Add the attenuation coefficient to the scene uniforms. Add a keyboard controller, so that the user can increase the coefficient by pressing X (X with shift) and decrease it by pressing x (X without shift). Change the setting of the scene uniforms, so that the positional light is at (or just above) the hero, and therefore tracks the hero as it moves around. Test. Here's mine:

610mainAttenuation screenshot

Spot Light

620shader.vert: Start with a copy of 600shader.vert. Add uspot to the scene uniforms.

620shader.frag: Start with a copy of 600shader.frag. Add uspot to the scene uniforms. Using the dot product of -ulight and uspot and some hard-coded spot half-angle (maybe π / 12, which is 15°), decide whether to zero-out the positional light's clight.
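
Here is the spot test sketched in plain C for reference; the real code belongs in 620shader.frag.

#include <math.h>

/* Sketch: returns 1 if the fragment is inside the spot cone, 0 if the
   positional light's clight should be zeroed out. ulight points from the
   fragment toward the light; uspot points in the direction the light shines. */
int insideSpotCone(const float ulight[3], const float uspot[3]) {
    float halfAngle = 3.14159265f / 12.0f;    /* hard-coded 15 degrees */
    float dot = 0.0f;
    int i;
    for (i = 0; i < 3; i += 1)
        dot += -ulight[i] * uspot[i];
    return dot >= cosf(halfAngle);
}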

620mainSpot.c: Start with a copy of 600mainSpecular.c. Add uspot to the scene uniforms. Change the setting of the scene uniforms, so that the positional light is at (or just above) the hero, and points in the same direction that the hero faces. It therefore behaves like a light attached to the hero as the hero moves around. Test. Here's mine:

620mainSpot screenshot

Fog

630shader.vert: Start with a copy of 600shader.vert. Add a fog bound b to the scene uniforms.

630shader.frag: Start with a copy of 600shader.frag. Add b to the scene uniforms. Hard-code a fog color cfog. Let d be the distance from the camera to the fragment, and let cfrag be the fragment color before fog. If d ≥ b, then the fragment's final color is cfog. If d < b, then the fragment's final color interpolates as cfrag + (d / b) (cfog - cfrag).
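
Here is the fog blend sketched in plain C for reference; the real code belongs in 630shader.frag.

/* Sketch: d is the camera-to-fragment distance, b the fog bound uniform,
   cFrag the pre-fog fragment color, cFog the hard-coded fog color. */
void applyFog(
        float d, float b, const float cFrag[3], const float cFog[3],
        float rgb[3]) {
    float t = (d >= b) ? 1.0f : d / b;    /* fully fogged at and beyond b */
    int i;
    for (i = 0; i < 3; i += 1)
        rgb[i] = cFrag[i] + t * (cFog[i] - cFrag[i]);
}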

630mainFog.c: Start with a copy of 600mainSpecular.c. Add the fog bound to the scene uniforms. Change the background color to the fog color. Add a keyboard controller, so that the user can increase b by pressing V (V with shift) and decrease it by pressing v (V without shift). Test. Here's mine:

630mainFog screenshot

Optional: Study Questions

Consider all of the lighting effects in 600mainSpecular.c. Could we implement them in the software triangle rasterizer from our first project? If not, why not? If so, what would unifDim, attrDim, and varyDim be?

How is the specular reflection calculation different for an orthographic camera? (Hint: The relationship between perspective and orthographic cameras is analogous to the relationship between positional and directional lights.)