**Implementation Details, Advice, and Examples for Assignment 4**

Perspective Projection
===============================================================================

`Renderer.projectVertices = function(verts, viewMat)`

Project the input triangle vertices using viewMat, which is the product of the perspective projection matrix and the inverse of the camera pose. Both matrices have been computed for you and multiplied together beforehand in the render function for efficiency. Thus, viewMat maps world-space coordinates to normalized device coordinates (NDC). Return the screen-space coordinates of each projected vertex, as well as its NDC z value for later use in z-buffering. Note that we need to skip the input triangle (return undefined) if any of its vertices has a camera-space z that is not between the near plane and the far plane of the camera, because such a triangle shouldn't be seen.

We have provided an implementation of orthographic projection in `projectVerticesNaive`, which you may refer to as a starting point. Also see these slides (the OpenGL projection matrix, perspective projection part) for how the perspective projection transformation maps a view volume (a pyramidal frustum) to the canonical view volume (a cube spanning [-1, 1] in each of the x, y, and z dimensions), in which normalized device coordinates are defined.

Hint: in renderer.js, there is a global object called "Renderer" which stores properties of the renderer we are using and has functions that define the entire rendering pipeline. You will need to use its properties in your implementation of perspective projection as well as the other assignment features.

The following images were generated using the attached URLs, which you can click through to examine your results. The back face of the cube looks smaller than the front face due to perspective projection.

![ Generated using this url](./example-images/perspective0.png border="1" width="480") ![ Generated using this url](./example-images/perspective1.png border="1" width="480")

Phong Reflection Model
========================================================================

`Reflection.phongReflectionModel = function(vertex, view, normal, lightPos, phongMaterial)`

You should apply this equation:

$I = I_E + K_A I_{AL} + K_D (N \cdot L) I_L + K_S (V \cdot R)^n I_L$

We ignore the emission term ($I_E = [0, 0, 0]$) and assume the light color is white ($I_{AL} = I_L = [1, 1, 1]$). This is similar to what you implemented in A3. In the code, we have already computed the diffuse color; you need to add the ambient and specular colors.
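As a rough illustration (not the required implementation), here is a minimal sketch of how the three terms could be combined. It assumes THREE.js vector and color classes are available, that `phongMaterial` exposes `ambient`, `diffuse`, `specular` colors and a `shininess` exponent, and that `view` and `lightPos` are world-space positions; these names are assumptions and may differ from the starter code.

```javascript
// Hedged sketch of the Phong reflection model with white light and no emission.
// The property names on phongMaterial (ambient, diffuse, specular, shininess) and
// the meaning of `view` (camera position) are assumptions; check the starter code.
Reflection.phongReflectionModel = function(vertex, view, normal, lightPos, phongMaterial) {
  var color = new THREE.Color(0, 0, 0);

  // Ambient term: K_A * I_AL, with I_AL = (1, 1, 1).
  color.add(phongMaterial.ambient);

  // Diffuse term: K_D * max(N . L, 0) * I_L (already handled in the starter code).
  var lightDir = lightPos.clone().sub(vertex).normalize();
  var ndotl = normal.dot(lightDir);
  color.add(phongMaterial.diffuse.clone().multiplyScalar(Math.max(ndotl, 0)));

  // Specular term: K_S * max(V . R, 0)^n * I_L, with R = 2(N.L)N - L.
  // Only front-facing (lit) points receive a specular highlight.
  if (ndotl > 0) {
    var viewDir = view.clone().sub(vertex).normalize();
    var reflectDir = normal.clone().multiplyScalar(2 * ndotl).sub(lightDir).normalize();
    var vdotr = Math.max(viewDir.dot(reflectDir), 0);
    color.add(phongMaterial.specular.clone().multiplyScalar(Math.pow(vdotr, phongMaterial.shininess)));
  }

  return color;
};
```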
The following images were generated using the Phong shader.

![](./example-images/phong_reflection0.png border="1") ![](./example-images/phong_reflection1.png border="1") ![](./example-images/phong_reflection2.png border="1")

Helpers
========================================================================

You will not be able to directly observe the effects of the helper functions in the GUI, but they will be useful when you implement the shaders.

Bounding Box
------------------------------------------------------------------------

`Renderer.computeBoundingBox = function(projectedVerts)`

This computes the screen-space bounding box of the triangle defined by projectedVerts. The bounding box should always be clamped to the screen coordinates, since any locations outside the screen won't be rendered.

This function is useful when rasterizing each projected triangle, where we need to loop over the pixels that cover it. One simple approach is to compute the bounding box around the projected triangle and loop over the pixels inside it. More efficient triangle rasterization algorithms exist, and you may use one as an optional optimization.

Barycentric coordinates
------------------------------------------------------------------------

`Renderer.computeBarycentric = function(projectedVerts, x, y)`

Compute the barycentric coordinates of a point (x, y) inside the triangle defined by projectedVerts. Return undefined if (x, y) is outside the triangle. See this article and pages 30-33 of these slides for an efficient 2D algorithm (the only differences from A3 are that the points are all 2D and that efficiency matters even more here).

Shaders with Phong Reflection Model
========================================================================

The pipeline for rendering a triangle is similar for all three shaders to be implemented: loop over the pixels covering the projected triangle (using the bounding box and barycentric-coordinate helpers above) and shade each pixel.

Hint: before implementing, you may want to look through the other functions of the Renderer object that we have already implemented for you to get a sense of the entire rendering pipeline.

Flat Shader
------------------------------------------------------------------------

`Renderer.drawTriangleFlat = function(verts, projectedVerts, normals, uvs, material)`

Compute the face normal as the average of the three vertex normals, and the face centroid as the average of the three vertex positions. Pass the face normal and the face centroid to the Phong reflection model.

![ Generated using this url ](./example-images/flat0.png border="1" width="300") ![ Generated using this url ](./example-images/flat1.png border="1" width="300") ![ Generated using this url ](./example-images/flat2.png border="1" width="300")

Gouraud Shader
------------------------------------------------------------------------

`Renderer.drawTriangleGouraud = function(verts, projectedVerts, normals, uvs, material)`

Compute the color of each vertex using the Phong reflection model, then interpolate the colors of the pixels inside the triangle using the barycentric coordinates.

![ Generated using this url ](./example-images/gouraud0.png border="1" width="300") ![ Generated using this url ](./example-images/gouraud1.png border="1" width="300") ![ Generated using this url ](./example-images/gouraud2.png border="1" width="300")

Phong Shader
------------------------------------------------------------------------

`Renderer.drawTrianglePhong = function(verts, projectedVerts, normals, uvs, material)`

Interpolate the normals and vertex positions of the pixels inside the triangle using the barycentric coordinates, then apply the Phong reflection model per pixel.

![ Generated using this url ](./example-images/phong0.png border="1" width="300") ![ Generated using this url ](./example-images/phong1.png border="1" width="300") ![ Generated using this url ](./example-images/phong2.png border="1" width="300")

Texture Mapping
========================================================================

Diffuse and Specular Mapping
------------------------------------------------------------------------

Implement this inside the Phong shader (drawTrianglePhong) on top of your previous features. Interpolate the uv coordinates of the pixels inside the triangle using the barycentric coordinates, and look up the material using the function getPhongMaterial(). Be careful with the cases where meshes don't come with uv coordinates or textures; always check whether they are defined, simply with `uvs === undefined`.
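For reference, the per-pixel uv interpolation might look roughly like the sketch below. It assumes `uvs` is an array of three THREE.Vector2 texture coordinates and that `alpha`, `beta`, `gamma` are the barycentric coordinates of the current pixel; the exact signature of `getPhongMaterial` is not specified in this write-up, so the call shown in the comments is only a guess to be checked against the starter code.

```javascript
// Hedged per-pixel fragment for drawTrianglePhong. (alpha, beta, gamma) are the
// barycentric coordinates of the pixel; uvs holds the three vertex uv coordinates.
// All variable names here are illustrative, not necessarily the starter code's.
function interpolateUv(uvs, alpha, beta, gamma) {
  // Barycentric interpolation: uv = alpha*uv0 + beta*uv1 + gamma*uv2.
  return new THREE.Vector2(
    alpha * uvs[0].x + beta * uvs[1].x + gamma * uvs[2].x,
    alpha * uvs[0].y + beta * uvs[1].y + gamma * uvs[2].y
  );
}

// Inside the pixel loop of drawTrianglePhong (sketch only):
//   if (uvs !== undefined) {
//     var uv = interpolateUv(uvs, alpha, beta, gamma);
//     // Assumed call; verify getPhongMaterial's actual arguments in the code.
//     var pixelMaterial = this.getPhongMaterial(uv, material);
//     // ...use pixelMaterial when evaluating the Phong reflection model...
//   }
```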
![](./example-images/texture_mapping0.png border="1") ![](./example-images/texture_mapping1.png border="1")

XYZ Normal Mapping
------------------------------------------------------------------------

Implement this inside the Phong shader (drawTrianglePhong) on top of your previous features. Some materials come with an xyz normal texture (essentially an image), where each RGB value in the texture maps to actual XYZ coordinates representing a normal vector. This enables a more realistic-looking mesh. We can use the interpolated uv coordinates as before to look up RGB values in the xyz normal texture. Assuming RGB is in [0, 1], the mapping to XYZ is simply XYZ = 2 * RGB - 1. Remember to normalize the derived normal vector.

Be careful with the cases where materials don't come with an xyz normal texture; always check whether it is defined, simply with `material.xyzNormal === undefined`.

![ Generated using this url ](./example-images/normal_mapping1.png border="1" width="480") ![ Generated using this url ](./example-images/normal_mapping0.png border="1" width="480")
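The decode step described above is small; here is a hedged sketch of it as a standalone helper, assuming the sampled texel exposes r, g, b components in [0, 1] (how the texel is actually fetched from material.xyzNormal depends on the starter code and is not shown here).

```javascript
// Hedged helper: decode an xyz normal-map texel (RGB in [0, 1]) into a unit
// normal vector. The texel argument stands for whatever the starter code
// returns when sampling material.xyzNormal at the interpolated uv.
function decodeXyzNormal(texel) {
  return new THREE.Vector3(
    2 * texel.r - 1,  // R -> X in [-1, 1]
    2 * texel.g - 1,  // G -> Y in [-1, 1]
    2 * texel.b - 1   // B -> Z in [-1, 1]
  ).normalize();      // remember to normalize the decoded normal
}
```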