## Directional & Ambient Lighting

Lighting is perhaps the part of a graphics system that does the most to breathe life into the objects it renders. While the current hot topic in this area is real-time ray tracing, most games released before it became viable rarely had the computational budget to simulate lighting accurately. We'll see this reflected in two common approximation techniques: ambient and directional lighting.

### Diffuse Lighting

Before we talk about how to simulate lighting, it would be best to get a general understanding of how lighting works in the first place!

One way to conceptualize light is to imagine it as a ray. When this ray collides with an object, it may be reflected, absorbed, or scattered, and which behavior the light exhibits depends on inherent properties of the material it collides with. For matte surfaces, incoming light is scattered roughly equally in all directions; this is called a diffuse reflection. Its brightness follows Lambert's cosine law: it is proportional to the dot product of the direction toward the light and the surface's normal. For real-time simulations, we want to make sure to clamp the result of the dot product between 0 and 1, since it scales the resulting light color of the diffuse reflection and surfaces facing away from the light shouldn't receive a negative contribution.

```
// g_lightDirection is assumed to point from the surface toward the light
const float dotProduct = dot( g_lightDirection, i_normal_world );
// saturate() clamps to [0, 1], so back-facing surfaces contribute nothing
const float clampedValue = saturate( dotProduct );
```

The reflected light color can then be calculated by multiplying that clamped value with the light's color.

`float4 directionalColor = g_directionalLight_color * clampedValue;`

### Vertex Normals

Because diffuse lighting is calculated based on the normal of the surface the light hits, we need to provide our shaders with this information. Since vertex shaders operate on a mesh's vertices, we can supply each vertex's normal as an input as well! These normals are usually computed by a 3D modeling program like Maya, where each vertex normal is derived from the normals of the triangles that share that vertex.
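Conceptually, that averaging step can be sketched as follows. This work happens offline in the modeling tool rather than in a shader, and the function and parameter names here are hypothetical; it's written in HLSL-style syntax just to match the rest of this post:

```
// Hypothetical sketch: average the face normals of the triangles
// that share a vertex to produce a smooth vertex normal
float3 computeVertexNormal( float3 faceNormals[8], uint faceCount )
{
    float3 sum = float3( 0.0, 0.0, 0.0 );
    for ( uint i = 0; i < faceCount; ++i )
    {
        sum += faceNormals[i];
    }
    // Normalizing the sum yields the average direction
    return normalize( sum );
}
```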

#### Vector Space

The normals output by the 3D modeling software are in local space, which affects which space we need our light direction to be in. Thus, we have 3 different options:

- Light direction can be transformed into the local space of each mesh's normals, but this must be recomputed for every draw call
- Light direction can exist in world space, which means our normals would have to be transformed to world space
- Light direction can exist in view space, which means we would need to transform both into view space

Since the 3rd option requires multiple transforms, one of the first two is preferable. I like option 2 because it's easier to reason about a directional light that exists in world space. This means that in our shaders we have to transform our normals to world space:

```
// w = 0 so the matrix's translation has no effect on a direction
float4 normal_local = float4( i_normal, 0.0 );
// Note: under non-uniform scale, normals should instead be transformed
// by the inverse transpose of the local-to-world matrix
o_normal_world = mul( g_transform_localToWorld, normal_local ).xyz;
```

Notice how converting the normal into a 4-dimensional vector sets the 4th component to 0. This makes the matrix multiply ignore the translation of the transform, which should not apply to a direction. One thing to note is that this calculation is better performed in the vertex shader, since there are usually fewer vertices than fragments. However, the fragment shader receives interpolated versions of the vertex shader's outputs, and interpolation does not preserve a vector's unit length, which means we need to re-normalize the normal before using it in lighting calculations.
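In the fragment shader, that re-normalization might look like the following sketch, assuming the input names match the vertex shader output above:

```
// Interpolation doesn't preserve unit length, so re-normalize first
const float3 normal_world = normalize( i_normal_world );
const float clampedValue = saturate( dot( g_lightDirection, normal_world ) );
```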

### Ambient Lighting

In the real world, you are rarely in pitch darkness: some light always reaches a surface indirectly by bouncing off the environment. To quickly simulate this in games, we simply define a very low color value and add it to our directional light's contribution.

`const float3 lightColor = directionalColor.rgb + g_ambientColor;`

### Result

Finally, with all of the above we can compute the final color a fragment shader should output.

`o_color = ( g_color * textureColor ) * float4( lightColor, 1.0 );`
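Putting it all together, the fragment shader's lighting path might look like this sketch; the constant and input names are assumed from the snippets above:

```
// Re-normalize the interpolated normal
const float3 normal_world = normalize( i_normal_world );
// Lambertian diffuse term, clamped so back-facing surfaces get no light
const float diffuseAmount = saturate( dot( g_lightDirection, normal_world ) );
const float3 directionalColor = g_directionalLight_color.rgb * diffuseAmount;
// Add a small ambient term so nothing is ever pitch black
const float3 lightColor = directionalColor + g_ambientColor;
// Modulate the surface color by the total incoming light
o_color = ( g_color * textureColor ) * float4( lightColor, 1.0 );
```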

By moving the directional light around the scene, we can achieve lighting as displayed below: