Rendering Textures

One of the most common and useful assets graphics systems support is textures. A texture is simply an image file that shaders can use to give meshes a variety of detail. The way this works is that the vertex shader passes texture coordinates to the fragment shader, which uses them to sample what color a fragment should be from a texture. This method takes advantage of the fact that values passed from the vertex shader to the fragment shader are interpolated across the triangle: a fragment halfway between a vertex with coordinates (0, 0) and one with (1, 0), for example, samples the texture at (0.5, 0). The original texture coordinates can be edited in 3D modeling software like Maya.

If we weren't using textures, we could also use vertex colors to "paint" our meshes, but it would require a very vertex-dense mesh to achieve high detail.

Textures as Data

To add textures to a graphics pipeline, we first have to define where they live. Materials are a good place to put them, because a texture acts as a parameter that changes the final output of our shaders. In a simple material file, this might look like this:

return
{
	path_effect = "Effects/standard.effect",
	color = { 1.0, 1.0, 1.0, 1.0 },
	path_texture = "Textures/Stone_02_COLOR.tga",
}

This means textures also need to be integrated into the asset build pipeline so they can be loaded at runtime.
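Since a material file is just a Lua script that returns a table, the asset build step can read the texture path with the Lua C API. Here's a minimal sketch of what that might look like (the function name and error handling are my own, and it assumes Lua 5.2 or later):

#include <lua.hpp>
#include <string>

bool LoadTexturePathFromMaterial( const char* const i_materialPath, std::string& o_texturePath )
{
	lua_State* const luaState = luaL_newstate();
	if ( !luaState )
	{
		return false;
	}
	bool wasPathFound = false;
	// Running the material file leaves the returned table on top of the stack
	if ( ( luaL_dofile( luaState, i_materialPath ) == LUA_OK ) && lua_istable( luaState, -1 ) )
	{
		lua_getfield( luaState, -1, "path_texture" );
		if ( lua_isstring( luaState, -1 ) )
		{
			o_texturePath = lua_tostring( luaState, -1 );
			wasPathFound = true;
		}
		lua_pop( luaState, 1 );
	}
	lua_close( luaState );
	return wasPathFound;
}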

Texture Coordinates

For our fragment shader to know which texels of the image to sample, we need to pass it texture coordinates! Every vertex of a mesh stores a pair of texture coordinates (u, v) that specify where in the texture that vertex maps to. To make sure our vertex and fragment shaders receive them, we need to make sure our vertex input layout accounts for them:

// The second element of the input layout describes the texture coordinates:
// two 32-bit floats (u, v) read per-vertex from the same vertex buffer slot as the position
auto& textureCoordinateElement = layoutDescription[1];
textureCoordinateElement.SemanticName = "TEXCOORD";
textureCoordinateElement.SemanticIndex = 0;
textureCoordinateElement.Format = DXGI_FORMAT_R32G32_FLOAT;
textureCoordinateElement.InputSlot = 0;
textureCoordinateElement.AlignedByteOffset = offsetof( eae6320::Graphics::VertexFormats::sMesh, u );
textureCoordinateElement.InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA;
textureCoordinateElement.InstanceDataStepRate = 0;
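The offsetof( sMesh, u ) above relies on the vertex format struct storing the texture coordinates alongside the position data. As a sketch of what that struct might look like (the real sMesh in the engine may have additional members; this exact layout is an assumption):

namespace eae6320 { namespace Graphics { namespace VertexFormats {
	struct sMesh
	{
		// Position
		float x, y, z;
		// Texture coordinates (two 32-bit floats, matching DXGI_FORMAT_R32G32_FLOAT)
		float u, v;
	};
} } }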

Next, we need to make sure our vertex shader takes texture coordinates as an input from the vertex buffer and passes them through as an output, so that the interpolated values arrive at the fragment shader as an input:

in const float2 i_textureCoordinates : TEXCOORD

Sampler States

Just as there is more than one way to skin a cat, there is more than one way to sample a texture! A sampler state controls things like filtering and what happens when texture coordinates fall outside the [0, 1] range (the addressing mode). The most common addressing mode used in games is tiling (also called wrapping), where the texture repeats, as this makes reusing our textures much simpler. Each texture could use a different sampler state, but I've decided to just stick with tiling.
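In Direct3D 11, tiling is just an addressing mode on the sampler state. A minimal sketch of creating one (the helper function is my own; it assumes an already-initialized ID3D11Device):

#include <d3d11.h>

ID3D11SamplerState* CreateTilingSamplerState( ID3D11Device* const i_direct3dDevice )
{
	// Tile ("wrap") in every direction so coordinates outside [0, 1] repeat the texture
	D3D11_SAMPLER_DESC samplerDescription{};
	samplerDescription.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;	// trilinear filtering
	samplerDescription.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
	samplerDescription.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
	samplerDescription.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
	samplerDescription.ComparisonFunc = D3D11_COMPARISON_NEVER;
	samplerDescription.MinLOD = 0.0f;
	samplerDescription.MaxLOD = D3D11_FLOAT32_MAX;

	ID3D11SamplerState* samplerState = nullptr;
	if ( FAILED( i_direct3dDevice->CreateSamplerState( &samplerDescription, &samplerState ) ) )
	{
		return nullptr;
	}
	return samplerState;
}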

Result

After adding texture coordinates and sampler states to our shaders, we can finally sample our texture in our fragment shader!

float4 textureColor = SampleTexture2d(g_diffuseTexture, g_diffuse_samplerState, i_textureCoordinates);
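For this call to work at runtime, the texture's shader resource view and the sampler state have to be bound to the fragment shader stage before drawing. A minimal D3D11 sketch (the context and variable names here are assumptions, and slot 0 must match whatever registers g_diffuseTexture and g_diffuse_samplerState are assigned to):

// Bind the texture and sampler to the fragment ("pixel") shader stage at slot 0
ID3D11ShaderResourceView* const shaderResourceViews[] = { textureView };
s_direct3dImmediateContext->PSSetShaderResources( 0, 1, shaderResourceViews );
ID3D11SamplerState* const samplerStates[] = { samplerState };
s_direct3dImmediateContext->PSSetSamplers( 0, 1, samplerStates );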

The result can be seen in the image below on the right. That's a lot of detail on a mesh with only four vertices! Something to note on the left is a quad with a default texture. This is because we don't want to make users of our materials specify a texture if they aren't using one.
