
The three.js Animation System

In the previous chapter, we introduced the glTF model format and showed you how to load three simple yet beautiful models of a parrot, a flamingo, and a stork.

These models were loaded from the binary glTF files Parrot.glb, Flamingo.glb, and Stork.glb. Alongside the bird models, each of these files also contains an animation clip of the bird flying.

In this final chapter of the introductory section, we will introduce the three.js animation system and show you how to attach these animation clips to the bird models so that they can take flight.

The three.js animation system is a complete animation mixing desk. Using this system you can animate virtually any aspect of an object, such as position, scale, rotation, a material’s color or opacity, the bones of a skinned mesh, morph targets, and many other things besides. You can also blend and mix animations, so, for example, if you have a “walk” animation and a “run” animation attached to a human character you can make the character speed up from a walk to a run by blending these animations.

The animation system uses keyframes to define animations. To create an animation, we set keyframes at particular points in time, and then the animation system fills in the gaps for us using a process known as tweening. To animate a bouncing ball, for example, you can specify the points at the top and bottom of the bounce, and the ball will smoothly animate across all the points in between. The number of keyframes you need depends on the complexity of the animation. A very simple animation may only need one keyframe per second, or fewer, while a complex animation will need more, up to a maximum of sixty keyframes per second (any more than this will be ignored on a standard 60Hz display).
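
Under the hood, three.js interpolates linearly between keyframes by default (smooth and discrete interpolation can also be selected per track). Here is the idea in miniature, as a purely illustrative sketch that is not part of the three.js API:

// linear interpolation ("lerp") between two keyframe values
// t runs from 0 (at the earlier keyframe) to 1 (at the later one)
function lerp(start, end, t) {
  return start + (end - start) * t;
}

// halfway in time between a keyframe at height 0 and one at height 2
console.log(lerp(0, 2, 0.5)); // 1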

The animation system is built from a number of components that work together to create animations, attach them to objects in the scene, and control them. We’ll split these into two categories: animation creation, and animation playback and control. We’ll briefly introduce both categories here, and then we’ll use our new knowledge to set up the flying animations that we have loaded from the three glTF files.

The Animation System: Creating Animations

We’ll start by examining how to create some simple animations that change the visibility, scale, or position of an object. However, it should be noted that most people don’t use the three.js animation system to create animations by hand. It’s best suited for use with animations that were created in external software like Blender. Instead, to create animations in code, most people prefer to use Tween.js for simple animations and GSAP for more complex animations (although any JavaScript animation library will work with three.js). Even official examples on the three.js website use Tween.js! Nonetheless, it’s important for us to understand how animation clips are created and structured, so let’s get started, and soon we’ll have those lazy birds up in the sky!
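
For comparison, here’s roughly what animating a mesh’s position looks like with GSAP. This is a sketch, assuming you have installed the gsap package from npm and that mesh is some object already in your scene:

import gsap from "gsap";

// move the mesh to (2, 2, 0) over three seconds,
// then reverse and repeat forever
gsap.to(mesh.position, {
  x: 2,
  y: 2,
  duration: 3,
  repeat: -1,
  yoyo: true,
});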

There are three elements involved in creating animations: keyframes, KeyframeTrack, and AnimationClip.

1. Keyframes

The lowest conceptual level in the animation system is a keyframe. Each keyframe consists of three pieces of information: a time, a property, and a value, for example:

  • At 0 seconds .position is $(0,0,0)$.
  • At 3 seconds .scale is $(1,1,1)$.
  • At 12 seconds .material.color is red.

These three keyframes each describe the value of some property at a specific time. Keyframes don’t specify any particular object, though. A position keyframe can be used to animate any object with a .position property, a scale keyframe can animate any object with a .scale property, and so on. However, keyframes do specify a data type. The .position and .scale keyframes above specify vector data, while the .material.color keyframe specifies color data. Currently, the animation system supports five data types.

| Data type  | Description | Examples |
| ---------- | ----------- | -------- |
| Number     | Animate any property that is a single number | MeshStandardMaterial.opacity, PerspectiveCamera.zoom |
| Vector     | Animate any property that is a vector | Object3D.position, Object3D.scale, OrbitControls.target |
| Quaternion | Animate rotations stored as quaternions | Object3D.quaternion |
| Boolean    | Animate any Boolean property. Less commonly used, since there are no values between true and false, so the animation will jump | MeshStandardMaterial.wireframe, DirectionalLight.castShadow |
| String     | Animate any property that is a string | Not commonly used |

Notably missing from this list are Euler angles, which, if you recall from our chapter on transformations, are similar to vectors and are used to store rotations in Object3D.rotation. To animate rotations, you must use Object3D.quaternion. As we mentioned back in the chapter on transformations, quaternions are a bit harder to work with than Euler angles, so, to avoid becoming bamboozled, we’ll ignore rotations and focus on position and scale for now.
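
For reference, if you do need an animated rotation later, a common approach is to convert Euler angles into quaternions and feed the resulting components to a QuaternionKeyframeTrack. A minimal sketch (the half-turn around the y-axis is an arbitrary example):

import { Euler, Quaternion, QuaternionKeyframeTrack } from "three";

// convert two Euler rotations into quaternions
const start = new Quaternion().setFromEuler(new Euler(0, 0, 0));
const end = new Quaternion().setFromEuler(new Euler(0, Math.PI, 0));

// each quaternion keyframe needs four values: (x, y, z, w)
const rotationKF = new QuaternionKeyframeTrack(
  ".quaternion",
  [0, 3],
  [...start.toArray(), ...end.toArray()]
);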

To create an animation, we need at least two keyframes. The simplest possible example is two number keyframes, say, animating a material’s opacity (how transparent/see-through it is):

  1. At 0 seconds .material.opacity is 0.
  2. At 3 seconds .material.opacity is 1.

An opacity of zero means fully invisible, and an opacity of one means fully visible. When we animate an object using these two keyframes, it will fade into view over three seconds. It doesn’t matter what the actual opacity of the object is; the keyframes will override it. In other words, if we manually set:

Values set on an object are overridden by the animation system

mesh.material.opacity = 0.5;

… and then animate the object’s opacity, this value of 0.5 will be ignored, and the value in the keyframes will be used. Let’s take another example. Here are three vector keyframes representing positions:

  1. At 0 seconds .position is $(0,0,0)$.
  2. At 3 seconds .position is $(5,5,0)$.
  3. At 6 seconds .position is $(0,0,0)$.

When we animate a mesh with these keyframes, it will start at the center of the scene, then it will move to the top right over three seconds before reversing direction and moving back to the center, again taking three seconds to do so. The total animation will take six seconds (and you can choose whether to loop it or end there).

2. KeyframeTrack

There’s no class representing a single keyframe. Rather, keyframes are raw data stored in two arrays, times and values, within a KeyframeTrack. From here on, we’ll refer to a KeyframeTrack as simply a track. A track also stores the property being animated, such as .position, or .scale.

As with keyframes, keyframe tracks do not specify any particular object. A .material.opacity track can animate any object with a material that supports opacity, a .quaternion track can animate any object with a quaternion property, and so on.

KeyframeTrack is the base class, and there’s one subclass for each data type: NumberKeyframeTrack, VectorKeyframeTrack, QuaternionKeyframeTrack, BooleanKeyframeTrack, and StringKeyframeTrack.

We never use KeyframeTrack directly; instead, we choose whichever subclass matches the data type being animated. Let’s look at a couple of examples. First, we’ll use a NumberKeyframeTrack to store these five .opacity keyframes:

  1. At 0 seconds .material.opacity is 0.
  2. At 1 second .material.opacity is 1.
  3. At 2 seconds .material.opacity is 0.
  4. At 3 seconds .material.opacity is 1.
  5. At 4 seconds .material.opacity is 0.

These keyframes will make an object blink in and out for four seconds. To create a keyframe track, we will create one array containing the times, and one array containing the values, and then pass those into the NumberKeyframeTrack constructor along with the property we want to animate.

Creating a number keyframe track representing opacity, with five keyframes

import { NumberKeyframeTrack } from "three";

const times = [0, 1, 2, 3, 4];
const values = [0, 1, 0, 1, 0];

const opacityKF = new NumberKeyframeTrack(".material.opacity", times, values);

Note how each entry in the times array maps to one entry in the values array. Next, let’s try some position keyframes and a VectorKeyframeTrack:

  1. At 0 seconds .position is $(0,0,0)$.
  2. At 3 seconds .position is $(2,2,2)$.
  3. At 6 seconds .position is $(0,0,0)$.

These three keyframes will make an object start at the center of the scene, move right, up, and forwards over three seconds, then reverse direction and move back to the center. Next, we’ll create a vector track with these keyframes.

Creating a vector keyframe track representing positions, with three keyframes

import { VectorKeyframeTrack } from "three";

const times = [0, 3, 6];
const values = [0, 0, 0, 2, 2, 2, 0, 0, 0];

const positionKF = new VectorKeyframeTrack(".position", times, values);

This time, note how each entry in the times array matches with three entries from the values array, representing a position in 3D space. This means the values array is three times larger than the times array.

Each time maps to an $(x, y, z)$ position

const times = [0, 3, 6];
const values = [
  0, 0, 0, // (x, y, z) at t = 0
  2, 2, 2, // (x, y, z) at t = 3
  0, 0, 0, // (x, y, z) at t = 6
];

3. AnimationClip

A dancing character from Mixamo.com

An animation of a character dancing like the one in this scene consists of many separate movements: feet pivot, knees bend, arms swing wildly, the head nods to the beat (soundtrack not provided). Each individual movement is stored in a separate keyframe track, so, for example, there is one track controlling the rotation of the dancer’s left foot, another controlling the rotation of his right foot, a third controlling the rotation of his neck, and so on. In fact, this dancing animation is made from fifty-three keyframe tracks, of which fifty-two are .quaternion tracks controlling individual joints like the dancer’s knees, elbows, and ankles. Then there is a single .position track that moves the figure back and forth across the floor.

These fifty-three tracks come together to create the animation, which we call an animation clip. An animation clip, then, is a collection of any number of keyframe tracks that together animate a single object, and the class representing clips is AnimationClip. From here on, we’ll refer to an animation clip as simply a clip. Animation clips can be looped, so, while this dancer’s animation is eighteen seconds long, when it reaches the end it loops and the dancer appears to dance forever.

Animation clips store three pieces of information: the name of the clip, the length of the clip, and finally, an array of tracks that make up the clip. If we set the length to -1, the array of tracks will be used to calculate the length (which is what you want in most cases). Let’s create a clip containing the single position track from earlier:

Create an AnimationClip using a single track of position keyframes

import { AnimationClip, VectorKeyframeTrack } from "three";

const times = [0, 3, 6];
const values = [0, 0, 0, 2, 2, 2, 0, 0, 0];

const positionKF = new VectorKeyframeTrack(".position", times, values);

// just one track for now
const tracks = [positionKF];

// use -1 to automatically calculate
// the length from the array of tracks
const length = -1;

const clip = new AnimationClip("slowmove", length, tracks);

Since we’ve set the length to -1, the tracks will be used to calculate the length, in this case, six seconds. We’ve given the clip a descriptive name, slowmove, to make using it later easier.
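
If you want to confirm the calculated length, you can log the clip’s .duration property, which for the track above should be the time of the final keyframe:

console.log(clip.duration); // 6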

The AnimationClip is still not attached to any particular object. We’ll have to wait for the AnimationAction below for that. We can use this simple clip we have created with any object that has a .position property. However, as clips become more complex and contain more tracks, they start to become more deeply tied to a particular object. For example, you can’t use the dancing clip with one of the birds we loaded, since those don’t have the same internal structure as the human figure. However, you can use the clip with any other humanoid figure that has the same internal structure. Since this model was downloaded from mixamo.com, the dancing clip should work with other characters from mixamo.com, but it’s unlikely to work with just any humanoid model you download.

Now, let’s try making a clip that contains the opacity keyframes from earlier, as well as the position keyframes. This time, to save some space, we’ll write the times and values arrays inline rather than saving them to variables first, and we have also added a couple of extra opacity keyframes to make both tracks six seconds long.

A clip that animates both position and opacity

import { AnimationClip, NumberKeyframeTrack, VectorKeyframeTrack } from "three";

const positionKF = new VectorKeyframeTrack(
  ".position",
  [0, 3, 6],
  [0, 0, 0, 2, 2, 2, 0, 0, 0]
);

const opacityKF = new NumberKeyframeTrack(
  ".material.opacity",
  [0, 1, 2, 3, 4, 5, 6],
  [0, 1, 0, 1, 0, 1, 0]
);

const moveBlinkClip = new AnimationClip("move-n-blink", -1, [
  positionKF,
  opacityKF,
]);

This animation clip will work with any object that has a .position property and also a material with an .opacity property. In other words, it should work with a mesh. It will make a mesh move while blinking in and out. Once again, we have given the clip a memorable name, move-n-blink. Later, we might have lots of separate clips, and we can blend and mix them together. Giving each a unique name will make this easier for us. This time, note that the position track has three keyframes, while the opacity track has seven keyframes. Also, the length of each track is the same. This is not required, but the animation will look better if the lengths of the tracks match.
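
One caveat worth noting: changes to .material.opacity only show up when the material has transparency enabled, so remember to switch that on before playing the clip:

// without this, animating .opacity will have no visible effect
mesh.material.transparent = true;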

The Animation System: Playback and Control

Now, we have a simple animation clip that makes an object move while fading in and out. The next step is to attach this clip to an object and then play it. This brings us to the final two components of the animation system. First, the AnimationMixer allows us to turn a static object into an animated object, and finally, the AnimationAction connects a clip to the object and allows us to control it using actions such as play, pause, loop, reset, and so on.

4. AnimationMixer

To animate an object such as a mesh using the animation system, we must connect it to an AnimationMixer. From here on, we’ll refer to an AnimationMixer as simply a mixer. We need one mixer for each animated object in the scene. The mixer does the technical work of making the model move in time to the animation clip, whether that means moving the feet, arms, and hips of a dancer, or the wings of a flying bird.

Each AnimationMixer controls the animation of one object

import { Mesh, AnimationMixer } from 'three';

// create a normal, static mesh
const mesh = new Mesh();

// turn it into an animated mesh by connecting it to a mixer
const mixer = new AnimationMixer(mesh);

We also need to update the mixer each frame, but we’ll come back to that in a moment.

5. AnimationAction

The final piece of the puzzle, the AnimationAction connects an animated object to an animation clip. The AnimationAction class is also where the controls such as pause, play, loop, and reset are located. We’ll shorten AnimationAction to action from here on (it helps if you shout out “action” like a director whenever you create one). Unlike the other animation system classes, we never create an action directly. Instead, we’ll use AnimationMixer.clipAction, which ensures the action is cached by the mixer.
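
A useful consequence of this caching is that asking a mixer for an action from the same clip twice returns the identical action object rather than creating a new one. A quick sketch (mixer and clip stand for any mixer and clip):

const actionA = mixer.clipAction(clip);
const actionB = mixer.clipAction(clip);

// both variables reference the same cached AnimationAction
console.log(actionA === actionB); // true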

Let’s see this in action. Here, we take the moveBlinkClip we created a few moments ago, connect a mesh to a mixer, and finally use .clipAction along with the clip to create an action.

Create an AnimationAction using .clipAction

import { AnimationClip, AnimationMixer } from "three";

const moveBlinkClip = new AnimationClip("move-n-blink", -1, [
  positionKF,
  opacityKF,
]);

const mixer = new AnimationMixer(mesh);
const action = mixer.clipAction(moveBlinkClip);

Let’s look at another example. Suppose we have a model of a human and a clip of the character walking. Once again, we connect the model to a mixer and then create an action using .clipAction. We then immediately set the action’s state to playing:

Create an action and then set its state to playing

const mixer = new AnimationMixer(humanModel);

const action = mixer.clipAction(walkClip);

// immediately set the animation to play
action.play();

// later, you can stop the action
action.stop();

Note that, although we called .play, the animation will not start yet. We still need to update the mixer in the animation loop, which we’ll do in a moment.

Suppose this character can run and jump as well. Each animation will come in a separate clip, and each clip must be connected to one action. So, just as there is a one-to-one relationship between a mixer and a model, there is a one-to-one relationship between an action and an animation clip.

Each animation clip needs a separate animation action

const mixer = new AnimationMixer(humanModel);

const walkAction = mixer.clipAction(walkClip);
const runAction = mixer.clipAction(runClip);
const jumpAction = mixer.clipAction(jumpClip);

The next step is to choose which one of these actions to play. How you go about this will depend on what kind of scene you’re building. For example, if it’s a game, you’ll connect these actions up to the user controls, so the character will walk, run, or jump when the appropriate button is pressed. On the other hand, if it’s a non-playable character, you might connect these up to an AI system and let that control the character’s movements.

Another thing you need to consider is what happens when the character stops walking and starts running. If you move instantly from one animation to another, it won’t look very good. Fortunately, the AnimationAction contains controls that allow you to blend two clips, gradually slow a clip to a stop, loop a clip, play in reverse, or at a different speed, and lots more. At the start of the chapter, we claimed that the three.js animation system is a complete animation mixing desk. More accurately, we should have said that AnimationAction is a complete animation mixing desk since this is where most of the controls are.
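
To give you a flavor of these controls, here’s a brief sketch using the walk, run, and jump actions from above (the durations and speeds are arbitrary, and each snippet illustrates a single control rather than a complete character setup):

import { LoopOnce, LoopPingPong } from "three";

// blend smoothly from walking to running over half a second
runAction.play();
walkAction.crossFadeTo(runAction, 0.5, false);

// play the jump clip once, then freeze on its final frame
jumpAction.setLoop(LoopOnce, 1);
jumpAction.clampWhenFinished = true;
jumpAction.play();

// loop the walk clip back and forth at half speed
walkAction.setLoop(LoopPingPong, Infinity);
walkAction.timeScale = 0.5;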

Update the Animation in the Loop

There is just one thing left to do before any animations can play. We need to update the animated object in the animation loop. The mixer has an update method, which takes a time delta parameter. Whatever amount of time we pass in to mixer.update, all actions connected to the mixer will move forward by that amount.

Move all animations connected to the mesh forward by one second

const mixer = new AnimationMixer(mesh);

const updateAmount = 1; // in seconds

mixer.update(updateAmount);

However, normally we don’t want to jump forward an entire second. Each frame, we want to move the animation forward by a tiny amount, so that when we render sixty frames a second, we see a smooth animation. We’ll use the technique that we derived a few chapters ago, when we first created the animation loop and used it to drive a simple rotating cube, so refer back to the chapter on setting up an Animation Loop for a refresher. In short, we measure how long each frame takes to render, store that in a variable called delta, and then pass that into the mixer’s update method.

We need to update the mixer by delta every frame

import { AnimationMixer, Clock } from "three";

const mixer = new AnimationMixer(mesh);
const clock = new Clock();

// you must do this every frame
const delta = clock.getDelta();
mixer.update(delta);

As usual, we’ll do this by giving each animated object a .tick method. Here, .tick will call the mixer’s update method.

Use the animated object’s .tick method to update the mixer

const mixer = new AnimationMixer(mesh);

mesh.tick = (delta) => mixer.update(delta);

updatables.push(mesh);

This is similar to the orbit control’s .tick method from a few chapters ago.

Play the Animation Clips from Parrot.glb, Flamingo.glb, and Stork.glb

Now that we have seen how to create a very simple, if somewhat boring, animation clip that moves an object across the scene while fading it in and out, let’s turn our attention to the more interesting clips that we have loaded alongside our three bird models. Each of the three files, Parrot.glb, Flamingo.glb, and Stork.glb, contains both a model and an animation clip of that model flying. These models are not that different from the simple cube mesh we’ve used in several previous chapters. Each bird is a single Mesh, with a geometry and a material, although the geometry has a feature called morph targets (AKA blend shapes). Morph targets allow us to define two (or more) different shapes for a single geometry. Here, there is one shape with the wings up and one with the wings down. The flying clip animates between these two shapes to make it look like the bird’s wings are flapping.
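
Morph targets are driven by a numeric influence between zero and one. If you’d like to see their effect without the animation system, you can set a mesh’s influence values by hand. A sketch, where parrot stands for the loaded bird mesh and index 0 selects its first morph target:

// 0 = the base shape, 1 = the morph target fully applied
parrot.morphTargetInfluences[0] = 0.5;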

Let’s put everything we have learned so far into action. Here’s what we need to do to play the animation clips that come with each bird:

  1. Locate the flying clip from the data loaded from each glTF file.
  2. Create an AnimationMixer to control each bird model.
  3. Create an AnimationAction to connect the clip to the mixer.
  4. Add a .tick method to each bird and update the bird’s mixer every frame.

Nearly everything can be done in a couple of lines within birds/setupModel.js. Over in World, we need to add the birds to the updatables array so that the animations will be updated in the loop.

Where to Find the Loaded Animation Clips

Inside the components/birds/birds.js module, we currently log the raw data loaded from Parrot.glb to the console:

birds.js: log loaded data

console.log('Squaaawk!', parrotData);

Open the browser console and take a look now. We described this data in detail in the previous chapter, so check back there if you need a refresher. The data contains two elements of interest: a bird-shaped mesh that we extracted in the last chapter, and an animation clip of the bird flying. In the last chapter, we located the mesh in gltf.scene. Here, we’ll extract the animation clip and attach it to the mesh to make the bird take flight. You’ll find the animation clip in the gltfData.animations array:

Locate the animation clip in the loaded data

{
  animations: [AnimationClip]
  asset: {…}
  cameras: []
  parser: GLTFParser {…}
  scene: Scene {…}
  scenes: […]
  userData: {}
  __proto__: Object
}

Here, each file contains just a single clip, but a glTF file can contain any number of animation clips. For example, a file containing a model of a human might also have clips of the character walking, running, jumping, sitting down, and so on.
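
When a file does contain several clips, you can pick one out by name using the static AnimationClip.findByName helper rather than relying on array indices. A sketch, where the clip name "Run" is hypothetical:

import { AnimationClip } from "three";

// select a clip by name rather than by its position in the array
const runClip = AnimationClip.findByName(data.animations, "Run");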

Next, update setupModel to extract the clip:

setupModel.js: extract the clip from the loaded data

function setupModel(data) {
  const model = data.scene.children[0];
  const clip = data.animations[0];

  return model;
}

Create the Mixer and Action

Now, we’ll create the mixer and the action. First, import the AnimationMixer. We’ll use AnimationMixer.clipAction to create the action, so there’s no need to import AnimationAction. Then, create the mixer, passing the bird model into the constructor.

setupModel.js: import and create the mixer

import { AnimationMixer } from "three";

function setupModel(data) {
  const model = data.scene.children[0];
  const clip = data.animations[0];

  const mixer = new AnimationMixer(model);

  return model;
}

Next, use .clipAction to create the action, passing in the clip, then immediately set the action to playing:

setupModel.js: create the AnimationAction using .clipAction

function setupModel(data) {
  const model = data.scene.children[0];
  const clip = data.animations[0];

  const mixer = new AnimationMixer(model);
  const action = mixer.clipAction(clip);
  action.play();

  return model;
}

That’s all there is to it. All that remains is to update the now animated bird in the loop.

Create the .tick Method

Still in setupModel, add a .tick method to the model:

setupModel.js: create the .tick method

function setupModel(data) {
  const model = data.scene.children[0];
  const clip = data.animations[0];

  const mixer = new AnimationMixer(model);
  const action = mixer.clipAction(clip);
  action.play();

  model.tick = (delta) => mixer.update(delta);

  return model;
}

Inside this method, we’re calling mixer.update each frame, passing in delta, which is the amount of time the previous frame took to render. The mixer uses delta to keep the animation in sync even when the frame rate fluctuates. Again, refer back to Ch 1.7 for a more detailed discussion.

Add the Birds to updatables

Finally, over in World, add all three birds to the updatables array:

World.js: add the birds to the updatables array

async init() {
  const { parrot, flamingo, stork } = await loadBirds();

  // move the target to the center of the front bird
  controls.target.copy(parrot.position);

  loop.updatables.push(parrot, flamingo, stork);
  scene.add(parrot, flamingo, stork);
}

At this point, if everything has been set up correctly, your birds will take flight!

You’ve Reached the End of the Book - for now :)

With our birds on the wing, you have reached the end of the book. Congratulations!

We’ve covered a lot here in a short time, including cameras, geometry, meshes, textures, physically based materials, direct and ambient lighting, rendering our scenes with WebGL, transformations, coordinate systems, and the scene graph, vectors, loading external models, the glTF asset format, and even the three.js animation system, which is a complex beast. While learning about all that, we also found the time to create a simple but well-structured application that you can build on for three.js applications of any size.

But, don’t stop now! We’ve laid the groundwork, but we still have a long way to go on our journey to becoming three.js experts. It’s time for you to take things to the next level on your own. Good luck!

P.S. We’re not quite done yet: you still have to complete all the challenges!

Challenges
