A BRIEF INTRODUCTION TO TEXTURE MAPPING

So far we have just used a simple colored material for our mesh. If we want to create something more realistic we’ll have to start using a technique called texture mapping.

Put very simply, this means taking a flat, 2D image and stretching it over the surface of a 3D object.

Of course, this is easy to do if the surface of the 3D object is flat, and less easy if the surface is curved. Fortunately, the cube that we’ve been using so far will be very easy to apply textures to since every face is flat.

Start by loading up the code from the previous chapter, and we’ll continue from there.

Mapping a 2D Texture onto a 3D Object Using a UV Map

How do we go about stretching a 2D texture over the surface of a 3D shape?

The answer to that is a technique called UV Mapping.

Let’s take a quick look at how that works now, and then try it out on our spinning cube.

UV Mapping Explained

A box geometry's vertices

Our cube geometry looks something like this, where each of the red dots is a vertex and has a position in 3D space defined by $(x, y, z)$ coordinates. Take a look back at Ch 1.1: Hello Cube! if you need a reminder of what the three.js coordinate system looks like.

We want to take a 2D square texture and map it onto our 3D geometry.

To do so we’ll imagine a 2D coordinate system on top of the texture, with $(0,0)$ in the bottom left and $(1,1)$ in the top right. Since we’ve already used the letters $x$, $y$ and $z$ for our 3D $(x, y, z)$ coordinates, we’ll label these 2D texture coordinates with the letters $(u, v)$, which is where the name UV mapping comes from.

Let’s create a texture now to help us visualize this. We’ll use a simple black and white checker pattern and label a few of the UV coordinates. Once we are done, we’ll have something that looks like this:

UV test grid texture

UV mapping is the process by which we map 2D $(u, v)$ coordinates onto 3D $(x, y, z)$ coordinates:

$$ ( u, v ) \longrightarrow ( x, y, z ) $$

In the figure below, we’re showing how we want the texture to map onto the front face of the cube.

3D coordinates mapped to UV coordinates

We’ve also drawn in the $(x, y, z)$ coordinates of the vertices of this face, so we can see that we want this mapping from UV coordinates to 3D coordinates:

$$ \begin{aligned} ( 0, 0 ) &\longrightarrow ( -1, -1, 1 ) \cr ( 0, 1 ) &\longrightarrow ( -1, 1, 1 ) \cr ( 1, 1 ) &\longrightarrow ( 1, 1, 1 ) \cr ( 1, 0 ) &\longrightarrow ( 1, -1, 1 ) \end{aligned} $$

It’s a very simple mapping for now since we are just mapping a square texture onto the square face of our cube, and we can use similar mappings for the other five faces.
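To make the mapping concrete, here’s a small sketch in plain JavaScript (an illustration, not part of our scene code) that stores the four UV/position pairs for the front face in flat arrays - the same layout that three.js buffer attributes use - and looks up the 3D position that a given UV coordinate maps onto:

```javascript
// UV coordinates and vertex positions for the front face of the cube,
// stored as flat arrays: two numbers per UV pair, three per position
const uvs       = [ 0, 0,      0, 1,      1, 1,     1, 0 ];
const positions = [ -1, -1, 1, -1, 1, 1,  1, 1, 1,  1, -1, 1 ];

// find the ( x, y, z ) position that a given ( u, v ) maps onto
function positionForUV( u, v ) {
  for ( let i = 0; i < uvs.length / 2; i ++ ) {
    if ( uvs[ i * 2 ] === u && uvs[ i * 2 + 1 ] === v ) {
      return positions.slice( i * 3, i * 3 + 3 );
    }
  }
  return null;
}

console.log( positionForUV( 0, 0 ) ); // [ -1, -1, 1 ]
console.log( positionForUV( 1, 1 ) ); // [ 1, 1, 1 ]
```

Real geometries store exactly this kind of data: one array of positions and a matching array of UVs, paired up index by index.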

Once we’ve set them all up, our cube mesh will look like this:

Actually, the BoxBufferGeometry that we are using has set up the mappings automatically for us, so we just need to load the texture and apply it to our material.

Take a few moments to examine the cube and the way that the texture has been mapped onto it now. You can use your mouse or touch screen to move the camera around and zoom in since we’ve added camera controls to the scene - we’ll see how to do this for ourselves in the next chapter.

We’ll come back to UV mapping in much more detail in Section 6: Understanding Geometry, but for now, let’s move on and take a look at how to load a texture.

Add a Texture to Our Material

  1. Load a texture with the TextureLoader
  2. Set the texture’s parameters
  3. Add the texture to the material’s color map slot

Load a texture with the TextureLoader and apply it to our material.map slot

  const geometry = new THREE.BoxBufferGeometry( 2, 2, 2 );

  // create a texture loader.
  const textureLoader = new THREE.TextureLoader();

  // Load a texture. See the note in chapter 4 on working locally, or the page
  // https://threejs.org/docs/#manual/introduction/How-to-run-things-locally
  // if you run into problems here
  const texture = textureLoader.load( 'textures/uv_test_bw.png' );

  // set the "color space" of the texture
  texture.encoding = THREE.sRGBEncoding;

  // reduce blurring at glancing angles
  texture.anisotropy = 16;

  // create a Standard material using the texture we just loaded as a color map
  const material = new THREE.MeshStandardMaterial( {
    map: texture,
  } );

  // create a Mesh containing the geometry and material
  mesh = new THREE.Mesh( geometry, material );

Loading a texture and applying it to a map slot in a material is very easy in three.js, as long as you are serving your page from a web server. If you’re using CodeSandbox or another online editor to follow along, everything is taken care of for you. However, if you are working locally, that is, loading the files directly from your hard disk, you will run into problems due to security restrictions on how JavaScript reads local files.

We’ll load a texture using the TextureLoader. There are a number of alternatives to this, which we’ll look at in Section 4: Materials and Textures, but using the TextureLoader is by far the most common and easiest method.

Once we’ve loaded and set up the texture, we’ll assign it to the .map slot in our material, and the TextureLoader will take care of all the technicalities involved in loading the texture for us.

Before we proceed, let’s make sure that we’re clear on all the technical terms that we’re using here.

1. Load a Texture With the TextureLoader

Create a TextureLoader and then use it to load the texture:

  const geometry = new THREE.BoxBufferGeometry( 2, 2, 2 );

  // create a texture loader.
  const textureLoader = new THREE.TextureLoader();

  // Load a texture. See the note in chapter 4 on working locally, or the page
  // https://threejs.org/docs/#manual/introduction/How-to-run-things-locally
  // if you run into problems here
  const texture = textureLoader.load( 'textures/uv_test_bw.png' );

  // set the "color space" of the texture
  texture.encoding = THREE.sRGBEncoding;

We’ll use the TextureLoader to load the texture. textureLoader.load returns an instance of Texture that we can immediately use in our material, even though the texture itself may take some time to load.

2. Set the Texture’s Parameters

Set the texture’s encoding and anisotropic filtering level

  const texture = textureLoader.load( 'textures/uv_test_bw.png' );

  // set the "color space" of the texture
  texture.encoding = THREE.sRGBEncoding;

  // reduce blurring at glancing angles
  texture.anisotropy = 16;

  // create a Standard material using the texture we just loaded as a color map
  const material = new THREE.MeshStandardMaterial( {

We need to tune a couple of settings on the texture. First up is the texture encoding.

Setting the Texture.encoding

Since textures can represent many things, three.js needs to interpret the data in different ways depending on the intended use.

In general, there are just two ways that we’ll need to interpret textures:

  1. The texture represents colors designed to be seen by human eyes
  2. The texture represents something else, such as bumps on a surface

A texture is made up of lots of individual pixels (when we’re dealing with a texture we call these texels) and each of these pixels represents a single color.
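As a sketch of how UV coordinates relate to texels, a $(u, v)$ pair in the range $[0, 1]$ selects a column and row in the texture’s pixel grid. The helper below is purely illustrative (it’s not a three.js API):

```javascript
// Map a ( u, v ) coordinate in the range [ 0, 1 ] to the column and row
// of a texel in a width x height texture. ( 0, 0 ) is the bottom-left
// texel and ( 1, 1 ) the top-right. Illustrative helper, not three.js API.
function uvToTexel( u, v, width, height ) {
  const col = Math.min( Math.floor( u * width ), width - 1 );
  const row = Math.min( Math.floor( v * height ), height - 1 );
  return { col, row };
}

console.log( uvToTexel( 0, 0, 256, 256 ) ); // { col: 0, row: 0 }
console.log( uvToTexel( 1, 1, 256, 256 ) ); // { col: 255, row: 255 }
```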

To understand why we need to interpret these colors differently in different situations, we’ll need to digress for a moment and introduce the concept of a color space: a set of rules describing how the numbers stored in each texel map to actual colors. Colors intended for human eyes are usually stored in the non-linear sRGB color space, which devotes more of its precision to the dark shades our eyes distinguish best, while data textures (bump maps, for example) store their values directly in linear space.

Going back to our two possibilities above, we can now see why we need to make note of the difference:

  1. The texture represents colors designed to be seen by human eyes: this means that texture is in sRGB color space and needs to be converted to linear space before being used by the renderer
  2. The texture represents something else, such as bumps on a surface: this means that texture is already in linear space and can be used directly

This texture is going to be placed in the color map slot in our materials, so it seems a fair bet that we’re dealing with the first case above.

However, by default textures are assumed to have colors encoded in linear space - so we’ll need to tell the renderer that this texture has colors encoded in sRGB space instead:


  texture.encoding = THREE.sRGBEncoding;
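To see what “converted to linear space” means in practice, here’s a sketch of the standard sRGB-to-linear formula, applied to a single color channel in the range $[0, 1]$. This is for illustration only - the renderer does this for us:

```javascript
// convert a single sRGB-encoded channel value ( 0.0 to 1.0 )
// to linear space, using the standard sRGB transfer function
function sRGBToLinear( c ) {
  return ( c <= 0.04045 ) ? c / 12.92 : Math.pow( ( c + 0.055 ) / 1.055, 2.4 );
}

console.log( sRGBToLinear( 0.0 ) ); // 0 - pure black is unchanged
console.log( sRGBToLinear( 1.0 ) ); // 1 - pure white is unchanged
console.log( sRGBToLinear( 0.5 ) ); // ~0.214 - mid grey becomes darker
```

Note that pure black and pure white map to themselves - only the shades in between are affected, which is why our black and white test texture is fairly forgiving of incorrect settings.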

Reduce Texture Blurring by Setting the Anisotropic Filtering Level

Next up is a parameter which will improve the appearance of nearly every scene that uses textures, although again, this only needs to be applied to textures representing colors designed to be seen by your eyes. This parameter is the anisotropic filtering level, which is stored in texture.anisotropy.

By default, this is set to $1$, which means that no anisotropic filtering is applied. We will increase it to $16$, which is the maximum level supported by most graphics cards:


  texture.anisotropy = 16;
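Not every graphics card supports a level of $16$, however. If we want to play it safe, we can ask the renderer for the highest level the hardware supports and clamp our request to that, using the renderer’s capabilities.getMaxAnisotropy method (this sketch assumes the renderer and texture created elsewhere in the chapter):

```javascript
// ask the graphics card for the highest supported anisotropy level,
// then use it for our texture (but never more than our preferred 16)
const maxAnisotropy = renderer.capabilities.getMaxAnisotropy();
texture.anisotropy = Math.min( 16, maxAnisotropy );
```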

3. Add the Texture to the Material’s Color Map Slot

Assign the loaded texture to the material’s diffuse color map slot

  // reduce blurring at glancing angles
  texture.anisotropy = 16;

  // create a Standard material using the texture we just loaded as a color map
  const material = new THREE.MeshStandardMaterial( {
    map: texture,
  } );

  // create a Mesh containing the geometry and material

Now that we’ve successfully loaded our texture, we can assign it to the material.map slot. Once we’ve done so, we should see the texture show up on our spinning cube.

material.map uses a texture to describe how the color of the material changes over the surface of the object.

Our material has quite a few map slots, but .map is the most commonly used and important one, so even though it might more accurately be called “colorMap” or something similar, the name gets shortened to just .map.

Some of the other important map slots on our MeshStandardMaterial are .normalMap, .bumpMap, .roughnessMap, .metalnessMap, .alphaMap and .envMap.

Different material types may have different map slots. Make sure to check the docs page for the material you are using to see all the available slots.

Reduce the Brightness of the Light

Reduce the light’s intensity from 5.0 to 3.0

  // add the mesh to the scene object
  scene.add( mesh );

  // Create a directional light
  const light = new THREE.DirectionalLight( 0xffffff, 3.0 );

  // move the light back and up a bit

Now that we’ve put a texture on our cube, the light seems very bright, so we’ll reduce its .intensity from 5.0 to 3.0.

We’re back to using an unnamed “bare” parameter here, so to make this change we need to remember (or, more likely, check the docs to remind ourselves) that the second number passed to the DirectionalLight constructor is the intensity.

Set the Renderer’s GammaFactor and GammaOutput

Set the correct gamma correction factor and color space on the renderer

  // create a WebGLRenderer and set its width and height
  renderer = new THREE.WebGLRenderer( { antialias: true } );
  renderer.setSize( container.clientWidth, container.clientHeight );

  renderer.setPixelRatio( window.devicePixelRatio );

  // set the gamma correction so that output colors look
  // correct on our screens
  renderer.gammaFactor = 2.2;
  renderer.gammaOutput = true;

  // add the automatically created <canvas> element to the page
  container.appendChild( renderer.domElement );

We’re nearly done. However, since we’ve set up the texture to use the correct color space, we should also do the same for our WebGLRenderer.

Note that for our current scene, we won’t see any difference since we’re using this simple black and white UV test texture. Pure black and pure white are unchanged by gamma correction, so changing these settings has no visible effect here.

However, once we come to more advanced scenes this will make a difference and we will already be set up and following best practices.
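Gamma correction on output is essentially the reverse of the sRGB decoding we applied to the texture: the renderer’s linear color values get re-encoded so that they look correct on screen. With a gammaFactor of $2.2$, the operation is approximately the power law sketched below (for illustration only - the renderer handles this for us):

```javascript
// re-encode a linear channel value ( 0.0 to 1.0 ) for display,
// using the simple power-law approximation with gamma = 2.2
function linearToGamma( c, gammaFactor = 2.2 ) {
  return Math.pow( c, 1 / gammaFactor );
}

console.log( linearToGamma( 0.0 ) ); // 0 - black is unchanged
console.log( linearToGamma( 1.0 ) ); // 1 - white is unchanged
console.log( linearToGamma( 0.5 ) ); // ~0.73 - mid grey becomes brighter
```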

Final Result

Here’s our textured cube, happily spinning away. It may look black for a few seconds while you wait for the texture to download, which is normal.

It’s hard to examine closely while it’s constantly tumbling like that, so in the next chapter, we’ll add some interactivity with camera controls. These will allow us to pan, rotate and zoom/dolly the camera to get a view of our scene from any angle.