# Shading
Previously we had a brief overview of [[3_opengl_intro#4. Shader|setting up a "nop" shader]] that doesn't do anything interesting. Now we are ready to look at shading in detail, since shading is what enables realistic simulation of light.
#warning: Lots of terms/definitions! Since this is our first encounter with the concept of shading, some terminology definitions are sorely needed before we proceed.
## Terminology and Preliminaries
![[Pasted image 20221220101113.png|500]]
- ***Interpolation***: The calculation of values (such as color or depth) for interior pixels, given the values at the boundaries (such as at the vertices of a polygon or a line).
- ***Fragment***: A generalized concept of "pixel" (an oversimplification). For a more comprehensive definition refer to [OpenGL documentation](https://www.khronos.org/opengl/wiki/Fragment).
- Primitive: An OpenGL geometric [[3_opengl_intro#OpenGL Primitives and Drawing|primitive]].
- ***Flat Shading***: Every pixel corresponding to a face gets an identical (i.e. "flat") color.
- ***Gouraud (smooth) Shading***: (in **vertex shader**) Every **pixel's** color is determined by **interpolation**. Said interpolation is induced by **rasterization**. It magically makes a polygon mesh (of lower fidelity) appear smooth without having to increase fidelity (i.e. sub-dividing faces into smaller ones).
- ***Phong Shading***: (in **fragment shader**) Similar to Gouraud shading, but light is calculated per-**fragment**, as opposed to per-vertex in Gouraud shading ([Stack Overflow discussion on Gouraud vs. Phong shading](https://stackoverflow.com/a/63958763)).
- **Vertex vs. Fragment Shader**: There are in general 2 ways of interpolation for smooth (Gouraud or Phong) shading:
1. Per-vertex computation is performed in **vertex shader**, then **interpolated by rasterizer**. Both **Gouraud** (with the interpolation step) and Flat shading (without interpolation) can be computed this way.
	2. Per-vertex attributes (normals, coordinates, etc.) are interpolated across the primitive by the rasterizer, then each **fragment** is shaded in the **fragment shader**. This provides more accurate shading (e.g. more accurate specular highlights).
- In OpenGL there is also wireframe rendering, which we can activate using the `glPolygonMode(GL_FRONT_AND_BACK, GL_LINE)` command (the core profile only accepts `GL_FRONT_AND_BACK` as the face argument). It will superimpose the wireframe, then remove hidden lines to give a nice retro look to the object.
### Color Theory (briefly):
Every pixel contains 3 color channels - red, green, blue (RGB), which can be thought of as vertices of a color cube.
- Yellow=R+G, Cyan=B+G, Magenta=B+R, White=R+G+B,...
- Often normalized to [0,1] in OpenGL, but represented as [0,255] in terms of pixel intensity
- ***RGBA***: 32-bit, 8 bits per channel (additional A channel is for alpha, i.e. transparency)
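To make the two representations concrete, here is a minimal C++ sketch that clamps normalized [0,1] channels and packs them into a single 32-bit RGBA value (`packRGBA` is an illustrative helper, not an OpenGL function):

```cpp
#include <cstdint>
#include <cmath>

// Pack normalized [0,1] RGBA floats into one 32-bit value (8 bits/channel).
// Channels are clamped, then scaled to [0,255] pixel intensities.
uint32_t packRGBA(float r, float g, float b, float a) {
    auto to8 = [](float c) -> uint32_t {
        if (c < 0.0f) c = 0.0f;
        if (c > 1.0f) c = 1.0f;
        return static_cast<uint32_t>(std::lround(c * 255.0f));
    };
    return (to8(r) << 24) | (to8(g) << 16) | (to8(b) << 8) | to8(a);
}
```

For example, yellow (R+G) at full opacity packs to `0xFFFF00FF`.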
## Lighting And Shading
### Importance of shading/light
![[Pasted image 20221219092836.png|400]]
Light brings out the 3D appearance of an object (as opposed to a simple projection of a 3D object onto a 2D surface). Notice that the teapot lid isn't visible without shading, since it's only visible through diffuse reflection. Also notice that when the teapot's position changes, the position of the ***specular highlights*** also changes.
### Types of light sources
![[4_shading 2023-01-31 09.49.01.excalidraw]]
1. **Point (line, and area) light source**: Defined by position and color.
	- **Attenuation** (quadratic model), with constant, linear, and quadratic coefficients $k_c, k_l, k_q$ and distance $d$:
	  $\text{attn}=\frac{1}{k_c+k_ld+k_qd^2}$
2. Directional Light: $w=0$ (homogeneous coordinate), i.e. infinitely far away; no attenuation
3. Spotlight: Defined by *spot exponent* and *spot cutoff*.
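The quadratic attenuation model is a one-liner; a C++ sketch (the names `kc`, `kl`, `kq` stand for the constant, linear, and quadratic coefficients, and are illustrative, not library code):

```cpp
// Quadratic attenuation model: attn = 1 / (kc + kl*d + kq*d^2),
// where d is the distance from the light source to the surface point.
float attenuation(float kc, float kl, float kq, float d) {
    return 1.0f / (kc + kl * d + kq * d * d);
}
```

With `kc = 1` and `kl = kq = 0` there is no falloff, which is how a directional light (no attenuation) behaves.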
### Material Properties
![[Pasted image 20221219212629.png|300]]
![[Pasted image 20221220131557.png|600]]
Let's define some variables for the discussion that follows:
- $N$: Unit vector of surface normal
- $L$: Unit vector pointing towards the light source (for diffuse shading)
- $V$: Unit vector pointing towards eye
- $H=\frac{V+L}{\Vert V+L \Vert}$: Unit [half-vector](https://en.wikipedia.org/wiki/Blinn–Phong_reflection_model) (for specular shading)
- $R=2(L\cdot N)N-L$: Direction of mirror reflection of $L$ about $N$
- $E$: Vector pointing towards the eye (used interchangeably with $V$)
Also note that rigorous definitions of the following properties are often given w.r.t. light source intensity $I$, irradiance $E=I\frac{\cos\theta}{r^2}$, and reflected light $L=k_{\text{reflection}}E$. But the lecture slides refer to the reflected light using the letter $I$, so we use $I$ here to denote reflected light, to be consistent with the slides.
1. **Ambient**: Not directly associated with any particular light source. Uniformly distributed throughout space, with light shining on the surface from all directions. Independent of surface location/orientation. Typically we assign a very small number like $A=0.1$.
$A = \text{Ambient}$
2. **Emissive**: Only relevant when directly looking at a light source. Geometry needs to be created to actually see the light in OpenGL. It doesn't affect other light calculations.
$E = \text{Emission}_{material}$
3. **Diffuse (Lambertian)**: Scattered light, the intensity of which varies according to object orientation w.r.t. light source. Such light is scattered evenly in all directions. Imagine a rough matte surface.
$\begin{align*}
D &=\sum_i\text{diffuse}_{material}*\text{intensity}_i*\text{attn}_i*\max(L_i\cdot N,0)\\
&=\sum_iD_i\max(L_i\cdot N,0)\\
\end{align*}$
4. **Specular** (**Phong**): Shiny reflection, with greatest intensity in the direction of reflection (i.e. 'mirror' direction). *Phong exponent* $p$ determines the apparent surface shininess. Details are omitted.
$\begin{align*}
S &= \sum_i \text{specular}_{material}*\text{intensity}_i*\text{attn}_i*\max(H_i\cdot N,0)^p \\
&=\sum\limits_i S_i\max(H_i\cdot N,0)^p\\
&\sim \sum\limits_i S_i\max(R_i\cdot E,0)^p, \quad\text{where }R_i=2(L_i\cdot N)N-L_i\\
\end{align*}$
Putting everything together (recall that $D_i$ and $S_i$ already fold in the light intensity and attenuation), we have:
$I=A+E+\sum_i \left[D_i \max (N \cdot L_i, 0)+S_i \max (N \cdot H_i, 0)^p\right]$
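The per-light summand can be sketched in C++ for the grayscale case (a self-contained illustration; `blinnPhong` and the `Vec3` helpers are made-up names, and `D`, `S` are assumed to already fold in light intensity and attenuation):

```cpp
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// One-light Blinn-Phong term (grayscale): D*max(N.L, 0) + S*max(N.H, 0)^p.
// N: surface normal, L: direction to light, V: direction to eye.
float blinnPhong(Vec3 N, Vec3 L, Vec3 V, float D, float S, float p) {
    N = normalize(N); L = normalize(L); V = normalize(V);
    Vec3 H = normalize({L.x + V.x, L.y + V.y, L.z + V.z});  // half-vector
    float diff = D * std::max(dot(N, L), 0.0f);
    float spec = S * std::pow(std::max(dot(N, H), 0.0f), p);
    return diff + spec;
}
```

When the light and eye sit along the normal, both dot products are 1, so the result is simply `D + S`.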
### Vertex normals
We need normals to calculate diffuse and specular reflections.
- We can obtain vertex normals by averaging adjacent face normals
- We can obtain interior (per-fragment) normals by interpolating vertex normals
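A minimal C++ sketch of the first step, computing a vertex normal by averaging the normals of adjacent faces (illustrative names; assumes the face normals are already unit length):

```cpp
#include <cmath>
#include <vector>

struct V3 { float x, y, z; };

V3 normalize(const V3& v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Vertex normal = renormalized sum (i.e. average direction) of the
// unit normals of all faces adjacent to the vertex.
V3 vertexNormal(const std::vector<V3>& faceNormals) {
    V3 sum{0, 0, 0};
    for (const V3& n : faceNormals) {
        sum.x += n.x; sum.y += n.y; sum.z += n.z;
    }
    return normalize(sum);
}
```

For two faces with normals along +x and +y, the vertex normal points along the diagonal between them.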
## Gouraud Shading (Vertex shader)
**The rasterizer** is what actually does the heavy lifting, even though shading happens in the vertex shader. Namely: 1) **enumerate** each pixel (covered by a primitive), 2) **interpolate** values, then 3) output fragments (one for each pixel covered by the primitive).
- To enumerate each pixel, we (informally) define [scan-line](https://en.wikipedia.org/wiki/Scan_line) to be a line parallel to the x-axis, which we move from bottom ($I_3$) to top ($I_1$).
- [Bilinear interpolation](https://en.wikipedia.org/wiki/Bilinear_interpolation) is used to get values (i.e. light intensities) in-between vertices. Note again that the following interpolation problem happens at the rasterization step.
- Don't worry about the degenerate case for now. We'll go through it later
### Problem setup
![[Pasted image 20221220000304.png|600]]
Let's assume we have a triangle with 3 vertices, with light intensities $I_1, I_2, I_3$ at the vertices (grayscale for simplicity, so $I_n\in\mathbb{R}$ as opposed to $\mathbb{R}^3$). The goal is to compute the intensity $I_p$ at a point $p$. Also assume the scan-line going through $p$ intersects two triangle edges at $a, b$, with intensities $I_a, I_b$. We refer to the y-coords of the scan-line, point $p$, and the 3 vertices as $y_s, y_p,\; y_1,y_2,y_3$, respectively.
The big picture is to
1. Get $I_a, I_b$ by interpolating vertically between $I_1, I_2$ and between $I_1,I_3$, respectively
2. Get $I_p$ by interpolating horizontally (along the scan line) btw. $I_a, I_b$.
$\begin{align*}
I_a&=\frac{I_1\left(y_s-y_2\right)+I_2\left(y_1-y_s\right)}{y_1-y_2},\;
I_b=\frac{I_1\left(y_s-y_3\right)+I_3\left(y_1-y_s\right)}{y_1-y_3}\\
I_p&=\boxed{\frac{I_a(x_b-x_p) + I_b(x_p-x_a)}{x_b-x_a}}
\end{align*}$
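The two interpolation steps can be written directly from the formulas; a C++ sketch (hypothetical helper names, grayscale intensities):

```cpp
// Edge step: interpolate intensity I_a along an edge between vertices
// (y1, I1) and (y2, I2), evaluated at scan-line height ys.
float edgeLerp(float I1, float I2, float y1, float y2, float ys) {
    return (I1 * (ys - y2) + I2 * (y1 - ys)) / (y1 - y2);
}

// Span step: interpolate intensity I_p along the scan line between
// endpoints (xa, Ia) and (xb, Ib), evaluated at x-coordinate xp.
float spanLerp(float Ia, float Ib, float xa, float xb, float xp) {
    return (Ia * (xb - xp) + Ib * (xp - xa)) / (xb - xa);
}
```

Halfway along an edge from intensity 1 to intensity 0, `edgeLerp` returns 0.5, as expected.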
Notice that the above calculation is needed for EVERY pixel inside the triangle, which would require a double for-loop construct that's in the following form:
```
for each scanline of the triangle:
    for each pixel in the scanline:
        compute I_a, I_b, I_p
```
$O(n^2)$ alert!! So there's really a strong incentive to make the above calculation efficient, which we attempt to do below.
As the scan-line moves up, we update $y_s$ to $y_s+1$ and re-calculate $I_a,I_b,I_p$. We'll soon discover that $I_a$ is *linear* in $y_s$, so each update costs only one addition:
$\begin{align*}
\text{let } B&\coloneqq I_2y_1-I_1y_2, \; \alpha\coloneqq\frac{1}{y_1-y_2}, \; m\coloneqq\alpha(I_1-I_2), \; A\coloneqq\alpha B \\
\text{observation: } I_a&=\frac{I_1\left(y_s-y_2\right)+I_2\left(y_1-y_s\right)}{y_1-y_2} = \alpha\left(y_s(I_1-I_2)+B\right)\\
&=\boxed{m\,y_s+A}
\end{align*}$
So $I_a(y_s+1)=I_a(y_s)+m$: moving to the next scan-line is a single addition per edge (and the same trick applies to $I_b$, and to $I_p$ along the scan-line).
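Since $I_a$ is affine in $y_s$, the per-scanline update reduces to adding a precomputed slope; a minimal C++ sketch (the `EdgeInterp` type is illustrative):

```cpp
// Incremental scan-line interpolation along one triangle edge.
// I_a is affine in ys, so after initializing once, stepping the
// scan line up by one row is a single addition of the slope m.
struct EdgeInterp {
    float m;   // slope: (I1 - I2) / (y1 - y2)
    float Ia;  // current interpolated intensity at height ys
    EdgeInterp(float I1, float I2, float y1, float y2, float ys)
        : m((I1 - I2) / (y1 - y2)),
          Ia((I1 * (ys - y2) + I2 * (y1 - ys)) / (y1 - y2)) {}
    void step() { Ia += m; }  // advance scan line from ys to ys + 1
};
```

The incremental values match re-evaluating the closed-form formula at each `ys`, but avoid the per-row multiply and divide.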
#question Why doesn't interpolation take into account x-coord? Is it because we are using a [barycentric coordinate system](https://en.wikipedia.org/wiki/Barycentric_coordinate_system)?
### Degenerate case and errors
1. If $I_1=I_2=0$, the interpolated $I_p$ will be 0 along the whole span, even if the true lighting (e.g. a specular highlight) peaks between the vertices.
2. Gouraud shading lacks rotational invariance, which leads us to a better shading model: Phong Shading.
### Simple vertex shader (`mytest3`)
```cpp
#version 330 core // Do not use any version older than 330!
// Inputs
layout (location = 0) in vec3 position;
layout (location = 1) in vec3 normal;
layout (location = 2) in vec2 texCoords;
// Extra outputs, if any
out vec4 myvertex;
out vec3 mynormal;
out vec2 texcoord;
// Uniform variables
uniform mat4 projection;
uniform mat4 modelview;
uniform int istex; // Are we doing texturing or not?
void main() { // Execute vertex shader
gl_Position = projection * modelview * vec4(position, 1.0f);
// Normal transformation is the inverse transpose of modelview matrix
mynormal = mat3(transpose(inverse(modelview))) * normal ;
myvertex = modelview * vec4(position, 1.0f) ; // Eye coord.
texcoord = vec2 (0.0, 0.0); // Default value just to prevent errors
if (istex != 0){texcoord = texCoords;}
}
```
---
## Phong Shading (Fragment shader)
**Vertex normals** (as opposed to light intensities in Gouraud shading) are interpolated in the rasterizer. The benefit is that lighting, which is a nonlinear function of the normal, is now evaluated accurately per fragment instead of being linearly interpolated.
![[Pasted image 20221220133328.png|300]]
### Fragment shader example
```cpp
#version 330 core // Do not use any version older than 330!
/* ---- Fragment Shader setup ----*/
// Inputs to the fragment shader are outputs of the same name from the vertex shader
in vec4 myvertex;
in vec3 mynormal;
in vec2 texcoord;
// Output the frag color
out vec4 fragColor;
uniform sampler2D tex ;
uniform int istex ;
uniform int islight ; // are we lighting.
uniform vec3 color;
/*---- Fragment Shader variables ----*/
//Assume light 0 is directional, light 1 is a point light.
//Actual light values are passed from the main OpenGL program.*/
uniform vec3 light0dirn;
uniform vec4 light0color;
uniform vec4 light1posn;
uniform vec4 light1color;
// Set material parameters. Could be bound to buffer, but for now they are uniform
// Ambient is just additive and doesn't multiply the lights.
uniform vec4 ambient;
uniform vec4 diffuse;
uniform vec4 specular;
uniform float shininess;
/*---- Compute Lighting ----*/
vec4 ComputeLight (
const in vec3 direction,
const in vec4 lightcolor,
const in vec3 normal,
const in vec3 halfvec,
const in vec4 mydiffuse,
const in vec4 myspecular,
const in float myshininess
){
float nDotL = dot(normal, direction);
vec4 lambert = mydiffuse * lightcolor * max(nDotL, 0.0);
float nDotH = dot(normal, halfvec);
vec4 phong = myspecular * lightcolor * pow(max(nDotH, 0.0), myshininess);
vec4 retval = lambert + phong;
return retval ;
}
/*---- Main Transforms ----*/
void main (void) {
if (istex > 0) fragColor = texture(tex, texcoord);
else if (islight == 0) fragColor = vec4(color, 1.0f) ;
else {
        // The eye is always at (0,0,0) looking down the -z axis
// Also compute current fragment position, direction to eye
const vec3 eyepos = vec3(0,0,0) ;
vec3 mypos = myvertex.xyz / myvertex.w; // Dehomogenize
vec3 eyedirn = normalize(eyepos - mypos);
vec3 normal = normalize(mynormal); // Compute normal for shading.
vec3 direction0 = normalize (light0dirn); // Light 0, directional
vec3 half0 = normalize (direction0 + eyedirn) ;
vec4 col0 = ComputeLight(direction0, light0color, normal, half0, diffuse,
specular, shininess);
vec3 position = light1posn.xyz / light1posn.w ; // Light 1, point
vec3 direction1 = normalize (position - mypos) ;
vec3 half1 = normalize (direction1 + eyedirn); // no attenuation
vec4 col1 = ComputeLight(direction1, light1color, normal, half1, diffuse,
specular, shininess);
fragColor = ambient + col0 + col1 ;
}
}
```
## Transforming a Light Source
Lights can be transformed just like other geometric primitives. However, only the model-view matrix needs to be applied, not the ~~projection~~ matrix. There are 3 types of light motion:
1. Stationary: Set the transforms to identity
2. Moving light: Push matrix onto stack, move light, pop matrix
3. Moving light, with viewpoint: Make model-view matrix identity, then set light to [0,0,0] (the origin with respect to eye coordinates).
### Model light transform example
```cpp
/* Transform vectors using modelview transformation matrix*/
void transformvec (const GLfloat input[4], GLfloat output[4]){
glm::vec4 inputvec(input[0], input[1], input[2], input[3]);
glm::vec4 outputvec = modelview * inputvec;
output[0] = outputvec[0];
output[1] = outputvec[1];
output[2] = outputvec[2];
output[3] = outputvec[3];
}
```
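To see why a directional light ($w=0$) ignores translation while a point light ($w=1$) is translated, here is a self-contained C++ sketch of a column-major matrix-vector multiply in the spirit of `transformvec`, with no GLM dependency (`mat4MulVec4` is an illustrative helper):

```cpp
// Multiply a 4x4 matrix (column-major, as GLM/OpenGL store it) by a
// homogeneous 4-vector. A translation lives in the 4th column, so it
// only affects vectors with w != 0: point lights move, directions don't.
void mat4MulVec4(const float m[16], const float in[4], float out[4]) {
    for (int row = 0; row < 4; ++row) {
        out[row] = 0.0f;
        for (int col = 0; col < 4; ++col)
            out[row] += m[col * 4 + row] * in[col];  // column-major indexing
    }
}
```

For a translation by (2, 0, 0), the point light position `(0, 0, 0, 1)` maps to `(2, 0, 0, 1)`, while the light direction `(1, 0, 0, 0)` is unchanged.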
### Lighting setup example
```cpp
const GLfloat one[] = {1,1,1,1} ;
const GLfloat medium[] = {0.5f, 0.5f, 0.5f, 1};
const GLfloat small[] = {0.2f, 0.2f, 0.2f, 1};
const GLfloat high[] = {100} ;
const GLfloat zero[] = {0.0, 0.0, 0.0, 1.0} ;
const GLfloat light_specular[] = {1, 0.5, 0, 1};
const GLfloat light_specular1[] = {0, 0.5, 1, 1};
const GLfloat light_direction[] = {0.5, 0, 0, 0}; // Dir lt
const GLfloat light_position1[] = {0, -0.5, 0, 1};
GLfloat light0[4], light1[4] ;
// Set Light and Material properties for the teapot
// Lights are transformed by current modelview matrix.
// The shader can't do this globally. So we do so manually.
transformvec(light_direction, light0) ;
transformvec(light_position1, light1) ;
glUniform3fv(light0dirn, 1, light0) ;
glUniform4fv(light0color, 1, light_specular) ;
glUniform4fv(light1posn, 1, light1) ;
glUniform4fv(light1color, 1, light_specular1) ;
// glUniform4fv(light1color, 1, zero) ;
glUniform4fv(ambient,1,small) ;
glUniform4fv(diffuse,1,medium) ;
glUniform4fv(specular,1,one) ;
glUniform1fv(shininess,1,high) ;
// Enable and Disable everything around the teapot
// Generally, we would also need to define normals etc.
// But the teapot object file already defines these for us.
if (DEMO > 4) glUniform1i(islight,lighting); // lighting only teapot.
/* Shader mapping */
vertexshader = initshaders(GL_VERTEX_SHADER, "shaders/light.vert") ;
fragmentshader = initshaders(GL_FRAGMENT_SHADER, "shaders/light.frag") ;
shaderprogram = initprogram(vertexshader, fragmentshader) ;
// * NEW * Set up the shader parameter mappings properly for lighting.
islight = glGetUniformLocation(shaderprogram,"islight") ;
light0dirn = glGetUniformLocation(shaderprogram,"light0dirn") ;
light0color = glGetUniformLocation(shaderprogram,"light0color") ;
light1posn = glGetUniformLocation(shaderprogram,"light1posn") ;
light1color = glGetUniformLocation(shaderprogram,"light1color") ;
ambient = glGetUniformLocation(shaderprogram,"ambient") ;
diffuse = glGetUniformLocation(shaderprogram,"diffuse") ;
specular = glGetUniformLocation(shaderprogram,"specular") ;
shininess = glGetUniformLocation(shaderprogram,"shininess") ;
```