Animation in Unity
Introduction
Enabling animators
Motion libraries
    Mesh
    Geometry
    Materials
    Skeleton
    Avatar
    Animations
Animator Controller
Parameters
Layers
Blend Trees
Character Controller
Animating in Unity
    Filter by selection
Animation events
Root motion
Blend shapes
Animation Rigging
Set up an IK Rig
Multiple Rigs
Foot IK
Animated cutscenes
Keyframing
Track types
    Activation track
    Audio track
    Control track
    Playables track
    Signal track
    Cinemachine tracks
    Track group
Sequences
Advanced simulation
Rigidbody Physics
Cloth Physics
Ragdoll physics
Alembic
Particles
AI characters
2D Animation
Weights
Conclusion
Introduction
Animation is a crucial part of game design, with many games now featuring animated content
that’s close in visual quality to that of cinematic films.
Animators carry the story of a game forward through characterization and visual storytelling.
They are responsible for character rigging, posing, and keyframing animation sequences. As
the role of the animator continues to evolve in game development, so do the animation tools
and workflows in Unity.
This guide is for both experienced and new animators who are interested in understanding
how to use the dedicated animation tools, methods, and techniques available in Unity’s
animation system.
You’ll learn how to import animations into Unity, work with the generic and humanoid animation
types, animate and rig in the Editor, create animated cutscenes with Timeline, explore advanced
animation options, and more.
Contributors
Pete Jepson is a professional tutor who has taught creative media subjects like film making,
animation, photography, graphic design and game design for the past decade. With ten years
of experience in Unity, he has multiple design courses on Udemy.com.
Unity Contributors
Dave Hunt is a technical artist for animation and rigging at Unity in Copenhagen. He is
passionate about empowering artists and animators with reproducible pipelines and seamless
interoperability. He worked in game development for 18 years in the Seattle area before joining
Unity in 2017. He contributed to games from the Halo and Destiny series at Bungie as leader of
the Rigging Tech Art team, while also teaching part-time in the animation capstone program at
the University of Washington.
I discovered the joy of animation as a kid, with my own stop motion projects of toy robots and
figures made of clay and wood. The exhilaration of seeing my ideas come to life drove me to
take on more ambitious projects; eventually it led me into the computer animation capstone
program at the University of Washington. Little did I know that my choice of study would
lead to a meeting with Frank Thomas at his home in Burbank, California, where he personally
reviewed my student animation project. I will never forget the youthful glow in his eyes at age
91, while he investigated and articulated just what makes an animated dog seem so much like
a real dog. He was constantly observing life and taking note of the beautiful subtleties of how
creatures move so that he could integrate them into his animated works.
Since that encounter with a true master, I’ve felt compelled to share the joy of animation with
everyone. I’ve been lucky to have the opportunity to teach animation in the capstone program
at University of Washington for over 15 years, while also working day jobs as an animator and
technical artist in game development since 2000. And I think the creative joy that comes from
animation is the same, whether you’re working at a AAA game studio or on an indie game jam
with a small team of friends.
I’m so glad to collaborate with the authors of this e-book and help both newcomers and
professionals unlock the doors to animation in Unity, where interactive animation presents
great opportunities for creative expression.
I remember well when my best friend Jay and I first downloaded the third-person platformer
tutorial with Lerpz the frog character, for Unity 3. We had a book with a chapter that included
the few lines of code we needed to take player input and move the character forward while
playing the walk animation clip. And with that we were off to create our own stories and
worlds and interactive experiences, with only our imagination as limits. We were both veteran
animators and game developers at the time yet we still benefited from that brief introduction
that helped us to get started in this new context. I hope that you too will find what you need
within these pages to bring the tools of Unity animation within your command, and I would be
so excited to see what kinds of creative works you can bring to life.
The process of moving from character concept to modeling, rigging, and animating is a well-
trod sequence for character creation in game development. But it’s not perfect and in general,
artistic productions are rarely a straightforward process!
For example, a character concept could include legs that are too long to create a natural-
looking locomotion cycle, or a composition of the arms and torso that doesn’t allow the
character to hold their weapon in a realistic way. Ideally, the character creation process should
allow for non-destructive iteration on character designs. Dave Hunt and Forrest Söderlind
give us a glimpse of how they structured their work in their GDC 2015 talk. They developed
a system that worked well for their large-scale production. This system included using a
modular, procedurally-controlled rig building framework in Maya that enabled gameplay,
storytelling and worldbuilding to come together efficiently. You can see some of the key steps
in their process in the image below. This workflow could be done in Unity as well.
Main stages of character development, as shown in the GDC 2015 talk; character images are from the Unity demo Adam
Enabling animators
In game development it’s often the technical artist that enables efficient collaboration between
artists, developers, and designers.
Scripting, rigging and technical art are disciplines that can allow a larger production to
maintain thousands of files with automated processes.
Here are a few general tips for organizing your animation files, but ultimately every project is
unique and the goal should always be to make a pipeline that works for your production.
— Naming conventions: Characters are made up of many objects, geometry, bones, and
accessories. A good practice is to standardize names so everyone in the team knows
how to navigate the hierarchies. Find a good balance between simplicity and readability.
Standardized, easy-to-understand naming can also be useful if custom tooling has been
created for the animators on your team.
— Scene organization: If you are an animator planning to work directly in Unity, consider
making a sandbox scene and/or working in prefab mode for common gameplay
animations.
— Asset version tracking and automation: The AssetPostprocessor class in Unity enables
you to run code upon asset import or apply automated import settings with Presets. This
can be useful to efficiently verify that assets comply with the team’s standards, enabling
you to focus more on the actual content creation (see the sketch after this list).
— Mock up in Unity and use the FBX Exporter: Designers can mock up animations and
cinematic sequences in Unity with systems like Timeline that make it efficient to express
general intent and timing. Prototyped animations can then be exported via the FBX
Exporter to the animator’s preferred DCC software for further polishing.
— Visualizers and custom Editor tools: Unity offers great flexibility through Editor tooling,
allowing you to create custom interfaces for your needs, like visual control rigs, or other
tools that make it easier for artists to use many of the animation tools’ APIs in Unity.
— IK in Unity with the Animation Rigging package: Enabling runtime rigging enables your
characters to be in contact with the game world. Sweeping a hand across a surface or
gripping and turning a door handle are just a couple of examples of subtle movements
that require modifications in the bone chain to make them look realistic. Unity’s
Animation Rigging package will allow you to create these detailed movements, making it
a great addition to your character creation process.
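As a minimal sketch of the AssetPostprocessor approach mentioned above, the following Editor script forces any model imported from a hypothetical Characters folder to use the Humanoid rig. The folder name and chosen settings are illustrative, not a Unity standard:

    using UnityEditor;

    // Editor-only script: place it in an Editor folder.
    public class CharacterImportPostprocessor : AssetPostprocessor
    {
        void OnPreprocessModel()
        {
            // Only touch models under the (hypothetical) Characters folder.
            if (!assetPath.Contains("/Characters/"))
                return;

            var importer = (ModelImporter)assetImporter;
            importer.animationType = ModelImporterAnimationType.Human;
            importer.importBlendShapes = true;
        }
    }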
The Unity Manual is also an important resource and includes a comprehensive section on
Animation. The following short intro is from the manual.
— Easy workflow and setup of animations for all elements of Unity including objects,
characters, and properties
— Support for imported animation clips and animation created within Unity
— Humanoid animation retargeting – the ability to apply animations from one character
model onto another
— Convenient preview of animation clips, transitions and the interactions between them
The screenshots in this guide are made using the Universal Render Pipeline (URP), Unity’s
multiplatform rendering solution built on top of the Scriptable Render Pipeline (SRP)
framework. URP is one of three render pipelines available in Unity, the other two being the
High Definition Render Pipeline (HDRP) and the Built-In Render Pipeline.
Learn more about URP, HDRP, and Unity toolsets for artists, technical artists, and designers in
the following e-books:
Many studios use software like Autodesk Maya or Blender to create animations that can then
be imported into Unity.
Large professional projects often use motion capture, employing actors to perform the
motions that will result in realistic movement in the content. Actors wear motion capture suits
that have markers spaced across them at key points aligned to a 3D skeleton. High-speed
infrared cameras capture the actor’s motion, with the data then passed into software like
Autodesk MotionBuilder. Face motion is captured in the same way, with markers spread across
the actor’s face to record realistic facial movement.
Unity’s demo team in Stockholm used motion capture to create the animated shorts Adam,
The Heretic, and Enemies.
Realistic human face and body movement in games and other real-time 3D content often relies on motion capture.
In this blog post, animator Krasimir Nechevsky discusses the process of previs, motion
capture, and animation for Unity’s Adam.
You can also use Unity’s ARKit, together with an Apple iPhone, to capture facial movements
and expressions to control a character’s facial rig and mouth movement. See the Unity project
The Realtime Rascals for a demonstration of this method.
Motion libraries
If you don’t have access to a motion capture suit and motion capture software, you can
use premade animated clips from motion capture libraries. These can save you time and be
effective for animations on a smaller budget, but you probably won’t achieve the same level of
fidelity possible with the performance of a real actor.
Adobe Mixamo and Reallusion ActorCore are motion library websites that feature thousands of
small motion clips that can be downloaded into your 3D software of choice, including Unity.
In addition to having a range of free characters to choose from, you can upload your own
character. If each character has a different skeletal structure the animations that play on one
of them might not play on another, unless they are using a humanoid animation type. If you
plan to use a humanoid animation type ensure that you download the T pose to begin with, as
this is necessary for the humanoid setup (this is covered in a later section of the guide).
When you download the models from Mixamo you need to select the FBX for Unity format and T-pose settings.
The Unity Asset Store is a great place to find premade animations. Search the animation
category and download packages to Unity through the Package Manager. The animations
should be set up automatically and ready to use.
In addition to premade animations, you can also download ready-made tools to enhance the quality of
your project. Sakura Rabbit, whose character and animation work is seen in the image above, uses
Unity Asset Store packages like Final IK, MagicaCloth, and Face Capture.
Video tutorial: Setting up AI characters
However, the automatic export option will make assumptions that might not apply to every
situation. Often at large studios, a technical artist will use scripting to create a custom tool for
exporting that uses the method best suited to their team.
For more detailed control, follow these steps to manually export your character:
1. Switch off any objects you don’t want to export by clicking on the visibility (eye) icon that
appears alongside the object. The character you want to export should be the only object
still visible.
FBX is the preferred format for 3D models that have embedded animations. You can import
FBX files into Unity by dragging them directly from your computer’s file browser into the
Editor’s Project window.
Structure of an FBX file (from Unity’s Starter Assets - ThirdPerson CharacterController package)
The FBX file will have a number of embedded components and properties, including mesh,
geometry, materials, skeleton, avatar, and animations.
Mesh
A mesh is 3D model data that defines the shape of an object. In Unity, a Mesh asset holds all the
vertices, normals, blend shapes, and UV data of a model.
Geometry
Geometry is the physical model powered by the mesh. It contains a Skinned Mesh Renderer component
that defines how the model looks in the scene and is skinned to the skeleton, so it moves as the
bones of the skeleton move. Materials can be assigned to the geometry in the Skinned Mesh Renderer
component.
Materials
Materials are assets that define how the surface of an object appears when rendered. They use a
Unity shader to combine textures, which are applied to the geometry to define the look of the model
in the scene.
Skeleton
The skeleton is a series of bones (at least 15 for a biped character) that are skinned to the mesh
to animate the movement of the character in the scene. The skeleton is hidden by default in the
Scene and Game views. You can access it in the Hierarchy. If you want to visualize the skeleton,
use the bone renderer explained in the Animation Rigging section later in this e-book.
A skeleton that is used to animate the FBX file
Avatar
The avatar defines the bone layout for a character, which determines how a character or object can
be animated using the Unity Animator system. Generic avatars can be non-biped structures suitable
for animals and creatures.
Humanoid avatar layout defining the bone names for an FBX file
Animations
FBX files can store multiple embedded animation clips. Each animation clip stores keyframed motion
data that animates the skeleton of the character.
The import settings for an FBX file appear in the Inspector and are organized into four tabs:
— Model tab
— Rig tab
— Animation tab
— Materials tab
Unity uses two main animation types to control character animations: Generic and Humanoid.
Humanoid provides a universal structure that can be applied to all biped characters, while Generic
is specific to each character’s skeletal structure. Generic animation curves store position,
rotation, and scale, whereas Humanoid animations use normalized muscle space.
The generic animation type can be used for animators who want to keyframe animations
directly inside Unity using forward kinematics (FK), and in cases where it’s unnecessary to
share animations between characters with different skeletal structures.
If you import your own FBX file, it will be set to Generic by default. This animation type is
unique to that character and its skeletal layout. Animations made for characters with a different
skeletal setup might not work with this character when using the generic animation type, but
generic animations will still work with characters that share the same skeletal layout.
You can change its animation type at any point from the drop-down menu in the Rig tab of the
FBX file. You can also create an avatar from this model if it does not already have one. It’s best
if you can create avatars from models in the T or A pose.
Animation-only FBX files will also use a generic animation type by default. They will need
to reference the avatar from the FBX character file to ensure the animations are matched
correctly to that character’s skeleton.
You can define animation clips in the Animation tab of the FBX import settings by clicking the
+ icon beneath the clips section. Sometimes there are multiple motions within the same clip,
in which case it’s necessary to separate each one into their own clip. If nothing is defined, the
clip is named Take01.
You can change the timing of the clip by adjusting the in/out markers, or typing a frame number
into the Start or End slot. Click Apply to import the animation clips to the character prefab.
You also have the option of setting the clip to loop as well as setting a cycle offset if you find
the loop is not seamless. Adjusting that by a few frames can ensure the first and last keyframe
of the clip match up.
Animator Controller
An Animator Controller is responsible for storing and driving all animations for a character.
Right-click in the Project tab and create an Animator Controller via Create > Animator
Controller. The character will need an Animator component to hold the Animator Controller.
Animator component
A quicker way to achieve this is to drag a clip from the Project view onto the character prefab
in the Hierarchy. This adds an Animator component to the character and the animation clip to
an Animator Controller.
To access the controller, open the Animator window via Window > Animation > Animator or
double-click the Animator Controller in the Project view. You can now drag in the animation
clips to assign them to the character. Ensure that the animations match the same animation
type set for the character: Generic animations for a generic character, and humanoid
animations for a humanoid character.
The orange block seen in the screenshot of the Animator window above is an animation state
node. This is the default animation that will play first. Add transitions from one node to the
next by right-clicking on a node, choosing Make Transition, and moving to the animation node
of choice.
In the example above the transitions go from Idle to Walk_N and back again, creating a
constant loop. The two other nodes for the jump have no direct transition to Idle or Walk_N,
so they will not play. For an animation to play, its node needs to be reachable through transitions
starting from the default state (the orange node), or through a transition from the Any State node.
Click on a transition line to see the settings for that particular transition including the
conditions for when and how the transition happens.
The small blue section highlighted by the red box in the image above is the blend time
between the two animations. The longer the blend, the smoother the transition. This prevents
noticeable pops when changing from one animation to the next.
Drag the two blue arrows to extend or shorten the blend time between the two animations.
If you select the option Has Exit Time, the first animation will have to play entirely before
making the transition to the next. Select this if there are no conditions set.
Parameters
The Animator Controller needs to be interactive so the player can drive animations, for
example, by pressing certain buttons. To be interactive, the controller needs parameters.
These are variable inputs that can be set as conditions in the animation transitions.
Click the + icon to add a Bool parameter called Walking. A Trigger parameter called Jump can also
be added.
Now in the transition from Idle to Walk_N, the Walking condition can be set true. Deselect
the Has Exit Time option and the transition happens immediately without having to play the
entirety of the Idle animation.
A bool condition is set from the walking parameter to control when the transition happens.
For the transition from Walk_N back to Idle, the Walking condition can be set to false.
How a script controls the transitions from two nodes in the animator
The script gets the Animator component on the character with animator = GetComponent<Animator>();
in Start() so that it can control it. In the Update method, it waits for the player to press the
W or Up arrow key (the vertical input). When this happens, the Walking bool in the Animator is set
to true and the transition from Idle to Walk_N takes place.
It continues to play the Walk_N animation as long as the player is pressing up. When the player
stops pressing up, the walking bool in the animator is set to false. The transition from Walk_N
to Idle takes place and the Idle animation continues to play until the player pushes the Up
arrow key again.
The Update method is also waiting for the player to press the jump key (by default this is set
to the spacebar). When this happens the animator runs the Jump trigger that plays the jump
animation once for every press on the spacebar.
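As a rough sketch, such a script might look like the following, assuming the parameters are named Walking and Jump as described above (your own parameter names and input setup may differ):

    using UnityEngine;

    public class PlayerAnimationInput : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            // Cache the Animator so we can drive its parameters.
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // W/Up arrow gives a positive vertical axis value; set the Walking bool accordingly.
            animator.SetBool("Walking", Input.GetAxis("Vertical") > 0f);

            // Fire the Jump trigger once per spacebar press (the default "Jump" button).
            if (Input.GetButtonDown("Jump"))
            {
                animator.SetTrigger("Jump");
            }
        }
    }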
It has two transitions for the jumping animation, from both idle and walking, because you
might decide to jump from either state, with the condition of both transitions set to Jump.
A trigger parameter condition is set to control when the jump happens from both Walk_N and Idle.
Another way to achieve this is to have a transition from the Any State node. Be aware that this
can then be called from any clip in the entire animator. To keep the animation loop active it has
a transition back to Idle once completed.
Using the Any State node for transitions from any animation node
You can click on an animation node to adjust its play speed, either slowing it down or speeding
it up. You can also assign tags to certain animations in the Tag field, which are useful when
coding. In the example below, the Walk_N animation has a motion tag, but the Idle animation
has none. This allows the code to identify when the character is in motion.
Use an AnimatorStateInfo struct to monitor animations from script. This passes information
to the script every frame on the current activity of the animator. In the example below, in the
Update method, it gets information from the animator on the base layer, which is layer 0.
If an animation clip with the motion tag plays, it will move the character forward using the
transform.Translate command.
The AnimatorStateInfo struct can be used to check when animations are playing or transitioning,
providing you with a powerful way to time your code with the Animator component.
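A small sketch of that pattern, assuming the tag entered in the Tag field is "Motion" and a moveSpeed value has been chosen (both names are illustrative):

    using UnityEngine;

    public class TaggedMotionMover : MonoBehaviour
    {
        public float moveSpeed = 2f;
        private Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // Read the current state of the base layer (layer 0).
            AnimatorStateInfo stateInfo = animator.GetCurrentAnimatorStateInfo(0);

            // Move the character forward only while a state tagged "Motion" is playing.
            if (stateInfo.IsTag("Motion"))
            {
                transform.Translate(Vector3.forward * moveSpeed * Time.deltaTime);
            }
        }
    }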
Layers
You can also use layers to divide up different sets of animations for a character. The different
body parts of the character can use different animations, allowing you to combine multiple
animations into a new unique animation.
The basic walk animation is on layer 0, called the Base Layer, in the Animator:
A new layer in the Animator is added and uses a jump animation, which has the arms in a lifted
position. The transitions are the same as the Base Layer. The Avatar Mask is applied to the
Mask slot which is accessed by clicking on the cogwheel Settings icon.
You can control this layer by its weight. At weight 0 it has no effect over the base layer. At
weight 1 it has full strength and overrides the bones of the base layer. You can vary the weight
to get interesting results and animate to achieve a smooth blend between animations.
To control layer weight from a C# script, call the Animator’s SetLayerWeight method, which takes
two parameters: the layer index and the layer weight. The code example below shows the Arms layer
at index 1 getting a weight of 1 when you press Shift, and a weight of 0 when you release Shift.
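A minimal sketch of that behavior (the layer index must match the order in your own Animator Controller):

    using UnityEngine;

    public class ArmsLayerWeight : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // Full influence for the Arms layer (index 1) while Shift is held, none otherwise.
            float weight = Input.GetKey(KeyCode.LeftShift) ? 1f : 0f;
            animator.SetLayerWeight(1, weight);
        }
    }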
During Play mode, when holding Shift and walking, the script combines the two animations.
Blend Trees
Use a blend tree when more than two animations need to be blended together dynamically
based on input parameters. The Unity-made Starter Assets – ThirdPerson package uses blend
trees to control a third-person character motion.
Right-click in the Animator Controller window and choose Create state > From New Blend
Tree in the context menu to add a blend tree.
Double-click on this to enter the blend tree setup. If you want to exit the blend tree use the
buttons at the top left.
Click the blend tree node to add the animations that you want to blend together.
The example below blends three animations – idle, walking, and running. This is driven by a
float input based on pressing the W or up arrow, which will go from 0.0 to 1.0. The animations
will blend together based on that number.
Blend trees can be based on a single input (1D blend), or two inputs (2D blend), for example,
pressing the WASD keys to control character movement both horizontally and vertically.
The 2D blend tree will need two parameters to drive the different directions. You can set the
max values for the pos X (horizontal) and pos Y (vertical).
The image above shows that the backwards walking animation is created by reusing the
Walk_N animation set at a playback speed of -1 so it plays in reverse.
In code you can now pass the input axis of the W and S keys, or the up and down arrow keys, into
the Animator float parameter using Input.GetAxis("Vertical"). This controls the blend for a
1D blend tree.
2D blend code
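Here’s a minimal sketch covering both the 1D and 2D cases, assuming float parameters named Vertical and Horizontal (use whatever names your blend tree parameters have):

    using UnityEngine;

    public class BlendTreeInput : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // 1D blend: W/S or up/down arrows drive the Vertical parameter (-1 to 1).
            animator.SetFloat("Vertical", Input.GetAxis("Vertical"));

            // 2D blend: A/D or left/right arrows additionally drive the Horizontal parameter.
            animator.SetFloat("Horizontal", Input.GetAxis("Horizontal"));
        }
    }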
Character Controller
A Character Controller component is used to control a playable character and features a
number of properties like a climb slope limit and a step height limit. Typically, a Character
Controller component is used for third-person or first-person player control that does not
make use of Rigidbody physics.
In addition to the Starter Assets – ThirdPerson package, you can download another Unity-
made package, Starter Assets – FirstPerson, from the Unity Asset Store. Both packages
provide configured character controllers for quick prototyping of your scenes.
Let’s go through the main steps for setting up your own character controller for a playable
character.
1. Create a new blend tree with three animation clips: a walk or run forward, an idle, and a
walk or run backward. Ensure to uncheck the Automate Thresholds checkbox. Set the
thresholds to -1 for walk backward, 0 for idle, and 1 for walk forward.
a. Press the W key to set the vertical axis to 1 and make the character walk forward.
b. Press the S key to set the vertical axis to -1 and make the character walk backward.
c. Press the A key to set the horizontal axis to -1 and make the character walk to the left.
d. Press the D key to set the horizontal axis to 1 and make the character walk to the right.
The blend tree will now match the correct motion to the vertical and horizontal axes.
3. Create a float parameter called Vertical to control the motion and set it as the
parameter for the 1D blend, as seen in the image below:
This will require a script to control character movement, using a simple character controller
move, as sketched below.
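The following is a simple sketch of such a movement script, assuming a Character Controller component on the same GameObject and a blend tree parameter named Vertical (speeds and parameter names are illustrative):

    using UnityEngine;

    [RequireComponent(typeof(CharacterController))]
    public class SimpleCharacterMove : MonoBehaviour
    {
        public float moveSpeed = 3f;
        public float gravity = -9.81f;

        private CharacterController controller;
        private Animator animator;
        private float verticalVelocity;

        void Start()
        {
            controller = GetComponent<CharacterController>();
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            float input = Input.GetAxis("Vertical");

            // Drive the blend tree so the right clip (walk forward, idle, walk backward) plays.
            animator.SetFloat("Vertical", input);

            // Move along the character's facing direction and apply simple gravity.
            Vector3 move = transform.forward * input * moveSpeed;
            verticalVelocity = controller.isGrounded ? -1f : verticalVelocity + gravity * Time.deltaTime;
            move.y = verticalVelocity;

            controller.Move(move * Time.deltaTime);
        }
    }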
5. Adjust the Step Offset to allow the character to walk up steps. If the character needs to
climb very steep steps make this number higher (however, if it is too high the character
may be able to step over small walls).
6. Adjust the Slope Limit properties to allow the character to walk up sloping surfaces. 45
degrees is the default value, however, you can set this higher for the character to climb
steeper surfaces. 90 degrees or higher will mean the character can walk up the sides of
buildings.
7. To raise the character controller to match the center of the character set the Center Y
value to 1.
8. The height can then be set to 2 to match the character’s height. If your character is taller,
you can make the height value greater.
9. Change the radius value to adjust the width. For wider characters, set this to 1 or above;
for thin characters, set it to 0.3 or below.
Cinemachine is a Unity asset that simplifies the process of creating dynamic camera systems
for games and simulations by providing advanced camera controls, procedural animation,
and smart camera behaviors. It’s covered in more detail later in this e-book. For now, install
Cinemachine via Window > Package Manager, from the Unity Registry.
Add a player follow camera via GameObject > Cinemachine > Virtual Camera. Add the
character to the Follow and Look At fields, and adjust the values to set up the camera
view in the Scene view.
Now when you play the game, you can control the character with WASD or arrow keys.
Animating in Unity
Any object can be animated directly in Unity. Open the Animation view by going to Window >
Animation > Animation.
Animations embedded within FBX files are read-only. You need to duplicate the clips in order to
edit them, but be aware that once you duplicate a clip, you can no longer change its animation
type (you’ll need to access the original clip in the FBX file to change its animation type).
Duplicating animations
Add the duplicated animation to the character’s Animator Controller to edit it. Click the Idle
node and drag the new animation file into the Motion field.
Replace animation clips by dragging the new clip into the Motion slot.
With the character selected you’ll have access to the animations inside the Animator Controller
directly from the Animation view. These can be selected using the drop-down menu from the
left panel. Notice the duplicated clip is editable whereas the embedded ones are read-only.
You can also create new animation clips from the same drop-down menu by selecting Create
New Clip.
Viewing keyframes (top left), and changing animation options (top right)
Select keyframes by clicking on them. You can delete, move, duplicate, or edit them; to modify
them, switch on the Record button.
Click the icon of three dots at the top right of the Animation window to change the options.
You can view the timeline in seconds or frames or change the sample rate from 30fps to
60fps, or vice versa.
Create your own forward kinematic (FK) animations in a new clip. A new clip will be blank with
no keyframes set. Press the Record button to begin setting keyframes.
To set a keyframe, select a bone in the Hierarchy, right-click at the rotation in the Inspector
and choose Add Key. This sets an initial keyframe.
Right-click next to the Rotation field to add a key for the bone.
Bones are animated between keyframes, so this is useful when you want a bone to stay at the
same position over time.
You can also use the Add keyframe button to add keyframes at the selected frame for all
bones in the animation.
The Add keyframe option adds new keyframes for all bones at the set frame.
If you want to set keys for certain bones only, select the bones in the list and click the Filter
by selection button. Now when you press the Add keyframe button it only adds keys for those
bones. This can be quicker than manually right-clicking on the rotation of each bone and
adding a key.
You can manually adjust the values for Rotation, Transform, and Scale on the X, Y, and Z axes
separately, as well as typing in specific values.
At the bottom of the Animation window you can change from a Dopesheet view (that shows
the keyframes), to the Curves view to click individual keyframe curves to edit or move them.
Change between Dopesheet (keyframes) and Curves using the buttons at the bottom.
Right-clicking allows you to change the shape of the curve, altering its easing between start
and end positions, as well as using the bezier handle to set a curve.
Curves editing
By default curves are set to Clamped Auto which adds an ease in and ease out to the motion
between keyframes. It will start slow, speed up in the middle, and slow down again toward the
end, providing a fluid smooth motion.
Animated settings appear in red in the Inspector and in blue when coming out of Record mode.
Items that have been animated turn red or blue, like the Position and Rotation values in this image.
Preview lets you set candidate keys on properties you need to manually keyframe. Recording
will automatically add all modifications to the clip.
In the Curves view press Enter with one or more keyframes selected. A pop-up dialog will appear
where you can enter exact value and time values (note that this does not modify tangents).
Filter by selection
This shows the properties of only selected objects in the Scene view. When this is
deactivated, all properties in the Animation Clip will be shown.
Animation events
Events can be called from an animation that trigger a function to run in a C# script. These
functions can be in any script attached to the GameObject.
In the example below there are two custom functions that will spawn in a water splash and
play a footstep sound when the left foot or right foot makes impact with the ground.
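Those functions might look something like the following sketch; the method names, prefab, and audio fields are illustrative, and any public method on a script attached to the animated GameObject can be called from an event:

    using UnityEngine;

    public class FootstepEvents : MonoBehaviour
    {
        public GameObject splashPrefab;    // water splash particle effect
        public AudioSource footstepAudio;  // footstep sound source
        public Transform leftFoot;
        public Transform rightFoot;

        // Called by an animation event when the left foot hits the ground.
        public void LeftFootImpact()
        {
            Instantiate(splashPrefab, leftFoot.position, Quaternion.identity);
            footstepAudio.Play();
        }

        // Called by an animation event when the right foot hits the ground.
        public void RightFootImpact()
        {
            Instantiate(splashPrefab, rightFoot.position, Quaternion.identity);
            footstepAudio.Play();
        }
    }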
In the Animation window you can time the events to the correct place in the animation. Add an
event by clicking on the event icon.
In the Inspector it will prompt you to select a function from a script that you want to trigger.
When you play the animation the result should be splash effects spawning in as the feet hit
the puddle, with footstep sounds timed to the walk cycle.
Open the Events section in the animation section of the FBX file. You
can now move to the timing of the clip, watching the preview window
at the bottom of the screen. Click Add Event.
Event icon to add a new event
This requires you to manually type in the name of the function in the script. There are also
options to add required parameters such as integers, floats or strings.
Root motion
Many clips are in-place animations, where the character stays in one place while the walking
animation plays. You can control the character movement using a script.
Some clips contain movement keyframes that move the character in 3D space. For example, in
a walk animation the character will walk forward for the duration of the clip. However, once the
clip loops back to the beginning, the character will also jump back to the origin point, creating
an unusual look.
In the Animator component there is an option to apply root motion. Checking this box ensures
that, as the clip loops, it will continue from its current position, instead of jumping back to its
original position. This results in the character walking forward realistically.
Root motion can be effective as the movement is timed with the keyframes of the character.
As the feet make contact with the ground the character moves forward in a realistic way and
avoids the foot sliding issue that can happen when moving a character with code.
Root motion is suited to complex movement, like a boxer character with tricky, multi-punch
combos, and feet quickly moving back and forth, and left and right.
Clips containing jumping animations can be useful for cutscene animations, where you
can control a character’s position using Unity’s Timeline. For jumping animations in actual
gameplay, bake the Y position of the character into the clip to prevent the character from
floating into the air.
You can control the character’s jump using a script. Use Rigidbody.AddForce for the best
jumping results.
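As a small sketch, assuming the character has a Rigidbody component and a jumpForce value tuned for your game (both are assumptions, not part of the original setup):

    using UnityEngine;

    [RequireComponent(typeof(Rigidbody))]
    public class SimpleJump : MonoBehaviour
    {
        public float jumpForce = 5f;
        private Rigidbody rb;

        void Start()
        {
            rb = GetComponent<Rigidbody>();
        }

        void Update()
        {
            if (Input.GetButtonDown("Jump"))
            {
                // An impulse gives an immediate velocity change for a responsive jump.
                rb.AddForce(Vector3.up * jumpForce, ForceMode.Impulse);
            }
        }
    }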
Blend shapes
Blend shapes, sometimes referred to as morph targets in other 3D software tools, are vertex
deformations to the character mesh shape that can be blended with the original mesh shape.
Blend shapes are baked into the FBX file and can be tweaked in the Inspector. The sliders seen
in the below image can be adjusted to set the weight of the blend, where 0 has no weight and
100 is full weight. Multiple sliders can be mixed together to achieve new blend morphs.
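Blend shape weights can also be driven from a script via the Skinned Mesh Renderer. Here’s a small sketch; the blend shape name "Smile" is purely illustrative:

    using UnityEngine;

    public class BlendShapeDriver : MonoBehaviour
    {
        public SkinnedMeshRenderer faceRenderer;
        [Range(0f, 100f)] public float smileWeight;

        void Update()
        {
            // Look up the blend shape index by name, then apply the weight (0-100).
            int index = faceRenderer.sharedMesh.GetBlendShapeIndex("Smile");
            if (index >= 0)
            {
                faceRenderer.SetBlendShapeWeight(index, smileWeight);
            }
        }
    }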
Facial rig blend shapes are often used for more advanced characters, such as Gawain from
The Heretic. The spots on the face can be dragged around in real time to create a variety of
expressions. This is combined with a digital human shader which uses wrinkle maps for adding
realistic wrinkles to the face as it deforms.
Blend shapes on the digital human character’s face from the Unity demo The Heretic
You can also add blend shapes to items of clothing to create secondary motion in Unity. These
types of blend shapes can be created in 3D software such as Blender or Maya.
Butterfly wings are another example of using blend shapes to create motion for an animated
butterfly.
The four parts of the wings can be animated through blend shapes
The humanoid animation type is used for animations that need to work on most human characters.
Keyframing directly in Unity using forward kinematics (FK) is not possible with a humanoid
animation; instead, animations can be created in other software and imported into Unity. Humanoid
animations can also be controlled with the Animation Rigging package, which is covered in a later
section.
The humanoid animation type is more complex than the generic one due to features like
support for twist bones and leaf bones. It doesn’t rely on named bones like the generic
animation type, but instead uses reference points for bone positions, which is why humanoid
animations work on most human characters.
To set up a humanoid animation type ensure that the character model is in a T or A pose for
best target matching. In the Rig section of the FBX model, choose humanoid from the drop-
down list and create an avatar from this model. Click Apply to change its animation type.
When you choose the option Create from this Model in the Avatar Definition field (see
the image above and below) then the humanoid conversion process analyzes the skeletal
structure of the chosen character. Bones are identified at key positions and placed into slots
at key reference points, for example, an upper arm reference point now uses the upper arm
bone of this character. It no longer matters what the bone is called, the animation is able to
use the reference points to animate the skeleton.
If the conversion is successful you will receive a message in the rig section.
If there are issues with the conversion Unity will highlight which bones could not be located,
meaning you might have to try and match the bones manually. If the character’s skeleton is too
complex it may not be possible to convert to humanoid.
If you use a generic animation on a character with a humanoid animation type it will result in
the character going into the setup pose as shown below.
When a generic animation plays on a character with the humanoid animation type, it assumes the setup pose.
The conversion process for the humanoid animation type works best with a character in the
T or A pose because the bones are then at a location Unity expects. If a character is in an
animated pose, the bones may be at unusual positions, making the conversion more difficult.
Create a humanoid avatar in the T pose, and then use this avatar to convert any animation clip
files you import.
For example, when importing animation clips from Mixamo, choose the option Copy From
Other Avatar in the Avatar Definition field during the conversion process, and choose the
avatar from the T pose character.
Converting animations to a humanoid rig and using the avatar from the T pose character
It’s important that the avatar for the T pose character has the same skeletal structure as the
animation clip files, so be sure to import a T pose model alongside animation clips from the
software you’re using to create or download animations.
You wouldn’t use an avatar from a different character in T pose to convert these clips to
humanoid as that would cause issues.
Once the clip has been converted to humanoid animation type it will now work on the original
character as well as most other biped characters that use a humanoid animation type.
Imported walk animation working on three characters with different skeletal structures
Humanoid clips have more settings than generic clips, such as adjusting offsets for transform
and rotation to match other clips.
As you can see in the image below the loop match option is green for all settings. This means
that the first and last keyframes are matching, creating a seamless loop. For Transform this
means the character may be animating on the spot, so it doesn’t have root motion keyframed
into the animation. This type of clip is useful when moving the character using code.
If the spots are red this means the keyframes don’t match and you will see a pop as the
animation begins to loop. Although not all clips are designed to be looped, a red spot means
that motion has been keyframed into the animation. This is root motion and when using a
humanoid animation with root motion, ensure the Apply Root Motion option is checked in the
Animator component:
A character with rigidbody physics can cause conflicts with root motion files. To fix these
conflicts you can bake the motion into the clip using the Bake Into Pose option.
Loop using root motion and the option of baking into the pose
Check this option for the settings Root Transform Rotation and Root Transform Position (Y),
which is the jump up and down motion. However, to maintain walking forward and backward
motions don’t check the Bake Into Pose option for the setting Root Transform Position (XZ).
In video games, the environment or location of a character will change during gameplay. Often
you can’t anticipate all the variations you’ll need for an animation to make the character move
in the world in a believable way. That’s why being able to adjust existing animations based on
the game variables or specific needs at runtime is key and a time saver.
The Animation Rigging package allows you to create Inverse Kinematic (IK) rigs for any type of
character or object inside of Unity. It will work on both generic and humanoid animation types
as well as inside the Timeline editor.
Inverse kinematics (IK) is a technique used in animation and robotics to determine the
positions and rotations of the joints in a skeletal structure based on the desired position
of the end effector, such as a character’s hand or foot, allowing for more natural and
realistic movement. Imagine you’re controlling a character’s hand in a game. With inverse
kinematics (IK), instead of manually moving each joint of the arm to place the hand where
you want it, you simply specify the position you want the hand to reach. The IK system
then calculates how to adjust the shoulder, elbow, and wrist joints automatically to achieve
that position, resulting in more natural and fluid movement.
Install the Animation Rigging package by going to Window > Package Manager and
selecting Unity Registry from the top left drop-down. Once you install the package, it’s also
recommended to download the samples package, which contains example scenes of each
rigging constraint so you can see how to implement them in your projects.
The samples are in the Samples folder; preview how they work by entering Play mode and
examining their structure in the Hierarchy.
The constraints are based on predefined scripts that make setting up an IK solution
in Unity easier. The constraints are also modular; instead of using a single IK rig that
controls every bone in a character’s skeleton, you can mix the individual constraints to suit any
character type, including non biped characters such as animals or creatures. You can also turn
off parts of the IK rig independently of the rest.
Rigging constraints
Set up an IK Rig
Select your character and go to Animation Rigging > Rig Setup. This will add an Animator
component and a Rig Builder component. Adding the latter component to this root character
GameObject automatically creates a child Rig GameObject of that root, with the default
(changeable) name Rig 1 (see the image below).
You can also display the bone layout by going to Animation Rigging > Bone Renderer Setup.
This provides a visual representation of the skeletal layout but it’s not mandatory for the
Rigging package to work. The Bone Renderer component is removed from the rest of the
examples in this guide to make the character easier to see.
To add the modular Rig constraints, add them as components to the empty GameObject named Rig 1.
You can then choose which constraint to add from the component menu.
A Two Bone IK Constraint is used to add an inverse kinematic constraint to arms, legs, or
fingers. Drag the hand from the Hierarchy into the Tip slot.
Click the three-dot options icon and choose Auto Setup from Tip Transform. This will
automatically find the connected bones and add target and hint objects in the correct
positions.
Auto Setup from Tip Transform sets up the Two Bone IK Constraint automatically
It will also add effectors to visualize the target and hint objects. The target object will control
the hand position and rotation, and the hint object will control the elbow rotation.
There are a variety of effectors to choose from and these can be modified in size, position,
rotation, and color.
All constraints have a weight slider that affects the overall influence of the targets. At a
strength of one the target will completely control the hand position. At a strength of zero it
will have no effect over the hand. You can also control the position weight and rotation weight
independently.
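Constraint and rig weights can also be set from a script at runtime. A minimal sketch using the Animation Rigging components (the field names and blend speed are illustrative):

    using UnityEngine;
    using UnityEngine.Animations.Rigging;

    public class ArmIKWeight : MonoBehaviour
    {
        public Rig armRig;                  // the Rig component on the rig GameObject
        public TwoBoneIKConstraint handIK;  // the hand's Two Bone IK Constraint
        [Range(0f, 1f)] public float targetWeight = 1f;
        public float blendSpeed = 2f;

        void Update()
        {
            // Ease the rig toward the target weight and keep the constraint in sync.
            armRig.weight = Mathf.MoveTowards(armRig.weight, targetWeight, blendSpeed * Time.deltaTime);
            handIK.weight = armRig.weight;
        }
    }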
During Play or Record mode you can move or rotate the target around to affect the position of
the character’s arms.
You can use a Multi-Aim Constraint to get the character’s head to look at a target, like a
moving object. The targets are parented to those objects and the character will look at them
as they move.
Using the Multi-Aim Constraint to have the character looking at a moving target
The Multi-Aim Constraint component using the Scene Up option in the World Up Type field to ensure correct rotation
If the head looks like it’s rotating incorrectly, check the Aim Axis and Up Axis direction of the
head. Ensure you are in Local space. The head is looking along the Y axis and the Up Axis field
is set to - X.
The head is using the Y positive for the Aim Axis and the negative X for the Up Axis.
A minimum and maximum rotation amount can also be set, e.g., you might want to restrict
head rotation to -90 and 90.
Multiple Rigs
You can create as many rigs as you like for a character by going to Animation Rigging > Rig
Setup. This creates a second rig that gets added to the Rig Builder component. Be aware that
any constraints for Rig 1 are calculated independently of Rig 2. Multiple rigs are useful when
you want two or more rigs to be independent of each other.
The benefit of this type of setup is the ability to control the character in any given situation.
Some of the constraints are IK and will take full control over the character, while others are
overrides that adjust the position and rotation of bones without affecting the underlying
animation.
Precise foot or hand positioning is possible with a Multi-Referential Constraint that allows the
bone positions to be controlled from a specific pivot. In the case of foot positioning, the toes
and ball of the foot have been used to set up precise positioning of the foot.
Target points must be set up for each point, in this case the heel, ball, and toe of the foot.
These are then added to the constraint along with the right leg target (this is the target that is
going to be controlled by the pivot of each point).
During Play or Record mode the driving pivot can be set from the drop-down. Select the target
point and rotate to see the foot rotate from that pivot point. This is covered in more detail in
this Unite session.
Foot IK
Most animators are familiar with foot IK because it’s a key concept in crafting authentic-
looking animations. Foot IK reduces foot sliding and feet dipping below the ground. It ensures
that feet stay in the correct position, for example, when a character is sitting in a chair.
The example below is a new rig with a hip control, to begin with. Placing this constraint above
the two feet constraints is important as IK is calculated in the order it appears in the Hierarchy.
So it calculates the hip movement first and then the feet position. If it did it the other way
around the feet would be calculated before the hips have moved.
The two feet use a Two Bone IK Constraint to ensure the correct bending of the knees as the
character sits. The feet will remain on the ground the entire time.
During Play mode the hip target can now be positioned to get the character to sit on the chair.
The left and right leg hints can be used to adjust the knee positions if needed.
Easy sitting animation using IK with the feet remaining planted on the ground
The Override Transform component, however, can merge animations with IK control and
is useful for making small adjustments to existing animations without stopping them from
working on different body parts.
To set this up you will need targets set up at each of the bone positions. The best way to do
this is to right-click a bone in the character’s skeleton, for example the spine, and create an
empty GameObject.
You can assign the targets an effector for visualization and drag them down to the rig section.
The Override Transform component is added to each of these targets, the bone to modify is
added in the Constrained Object field, the target itself is added to the Override Source field.
Overrides will adjust the bones while the animation is still controlling them.
You can now animate these targets to make adjustments such as spine rotation, shoulder
adjustments, neck position, or leg rotations while the animation continues playing. See this
video for a good example of this process.
The Animation Rigging package offers several benefits:
— Teams can work inside Unity to directly iterate on the rig to achieve the desired animations.
— Real-time IK updates for preview and animation purposes help to make iteration and
keyframing efficient.
— The modular rig system allows you to create custom IK rig setups for biped and non-biped
characters.
— The modular system makes creating rigs for different types of characters efficient.
— Bidirectional motion transfer allows you to bake motion back to the character skeleton, or to
bake motion from a skeleton to an IK control rig.
Timeline is an animation editor that looks and works like a video editor, enabling you to create
linear animated sequences directly in Unity. You can control animations on all the objects in
your scene from a single location, as well as control when the animations will occur relative to
all the other objects and characters.
It works with both generic and humanoid animation types and supports the Animation Rigging
package.
In the Package Manager, there are a number of packages available in the Cinematic Studio
collection that are useful for creating animated sequences. To install this go to Window >
Package Manager.
— FBX Exporter: Send geometry, lights, cameras, and animation made in Unity to 3D
modeling software
— Alembic: Bakes animation data into a file so that you can play the animation in Unity
— Cinemachine: Advanced camera system that solves the complex mathematics and logic
of tracking targets, composing, blending, and cutting between shots
Animated cutscenes
Traditionally, cutscenes in games were created in software like Autodesk 3ds Max or Maya,
or Blender, and then rendered out as video files that were played back during the cutscenes.
The video files were huge and needed to be compressed using video encoding, which could
diminish their overall quality.
But since the addition of the Cinemachine and Timeline systems to Unity, you can now play
back cinematic-quality cutscenes directly inside the game engine with no need for video
compression.
Select the TimelineHolder object in the Hierarchy and create a Timeline asset. This will store
all the data on the timeline including any animated clips.
It will also create a Playable Director component that binds the timeline to the TimelineHolder
GameObject. Now you’ll have full access to the track editor when you have TimelineHolder
selected in the Hierarchy.
Clicking on the Timeline asset in the Project window allows you to define the properties for the
Timeline sequence, including the frame rate, which is 60fps by default, but you can change it
to a range of other frame rates.
You can also set a specific duration for the Timeline sequence or extend it based on how many
clips you add.
Timeline properties
Drag animation clips directly onto the track and move them into place via click and drag. You
can also right-click the track and choose Add from Animation Clip, then select the chosen
animation.
Right-click and choose the option Add from Animation Clip to choose an animation clip.
Be aware that your character might disappear when you do this if it has a root origin of
world 0. This means the character will jump to a world position of 0 on X, Y, and Z and will begin
playing the animation there. This might be far away from where you want your character to be.
To fix this, create an empty GameObject at the character’s current position and place the
character inside it to parent it.
Child objects are relative to the parent object’s position. So, although the child object in this
case will still have a Transform position of 0 on X, Y and Z axis, these are no longer the root
origin of 0 in world space, but are now the current position of the empty GameObject.
Now the Adam character plays the animation at the correct position.
Animation clips appear as solid blocks on the track in the Timeline window that you can drag
forward or backward in time across the length of the track.
Animation clip for the Adam character on the timeline and the clip options in the Inspector
Each clip has a pre- and post-extrapolation method if there are spaces to the left or right of
the clip. This determines what happens to the animation before and after it plays. By default it
will be set to Hold, which creates a still pose from the first or last frame of the animation.
Much like a video editor you can cut or resize the animation clips by right-clicking on the clip
and choosing Editing. The shortcut keys are also displayed to make editing more efficient.
You can adjust a clip to fade in and out if there are no other clips to the left or right of it. This
prevents popping and leads to a gradual change of pose by adjusting the blend in and out
values in the Inspector.
Adding a second animation allows you to then blend them together; drag the second
animation into the first to create a blend overlap.
Click the track to add an avatar mask. The animations you place on this track will override the
bones on the base track, allowing you to create interesting, unique animations.
If you are using an in-place animation that doesn’t contain root motion you can use an override
track to control the position, rotation, or scale of the character.
You can add as many override tracks as you like to an animation track, but if two override
tracks control the same bone or bones, then the override track that is furthest down the list
will override any that are above it.
Click Record on a new override track to add keyframes to an infinite clip as you adjust the
position of the character. During recording the track will turn red.
Right-click and choose Convert to Clip Track to turn keyframes into an animation clip. That
gives you access to all the options available for an animation clip including speed, looping, and
blend style.
Double-click the animation clip to open the keyframes in the Animation window. From here you
can adjust the keyframes and add events in the same way as covered earlier in this e-book.
You can duplicate animation clips by pressing Ctrl + D or going to Edit > Duplicate. You’ll
notice that the character position will jump back to its origin when transitioning from the
first to the second clip. To fix this, right-click the second clip and choose Match Offsets to
Previous Clip. Now it will act like an animation that uses root motion, and the character will not
jump back to its origin point but will continue from the last position of the previous clip. You
can also match offsets to the next clip if there is one to the right of your current clip.
Keyframing
Timeline also allows you to create animations directly from this window. For generic animation
types you can use forward kinematics (FK) to adjust the rotation of each bone. Overrides can
also be used to adjust and refine the position of bones.
Base animations can also be adjusted with keyframing by using overrides which allow you
to customize the animation. You can also access curves here, to refine the easing between
keyframes.
Generic rig keyframe overrides for an existing animation and curves adjustment
You can’t animate humanoid animation types using forward kinematic keyframing. Instead, use
the inverse kinematics (IK) rig in the Animation Rigging package to animate the position of the
targets. See the Animation Rigging section in this e-book.
In order to see the rig update in the scene you need to first set an initial transform keyframe
for the parent GameObject that contains the Rig Builder component. As you add override
tracks, adjust the position and rotation of the targets and see them update in real-time in the
Scene view.
Track types
Activation track
Use the activation track to switch objects on for the duration of the block during the animation.
Active block that switches objects on for the duration of the block
Audio track
Use the audio track to place sound files and music at set points in the animation. You can use
multiple audio tracks to mix together a complex audio score.
Use multiple audio tracks to make a rich sound track for a sequence.
If the audio source slot is left empty the sounds and music will play in 2D space. If you place
an object with an Audio Source component into this slot you can set 3D sounds to play from
specific parts of the 3D world, creating surround sound audio scores.
Control track
Use the control track to activate particle effects in your scene, for example, for timing an
explosion to happen at a set point in the animation. You can choose what will happen after the
particle track ends – it becomes inactive, continues to loop with the active setting, or returns to
a dormant state using the Revert setting, and waits to be activated again at a later point.
Use the control track to orchestrate other timelines from your current timeline. If you
have several timeline sequences which you want to begin at set times, you can drag the
TimelineHolder GameObjects onto the control track to activate them at the appropriate time.
Playables track
The playables track allows you to create your own custom scripts and add them to this track
to drive objects and characters during the sequence. A good explanation of this is in this Unity
blog post.
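To give a rough idea of the structure, here is a minimal, hedged sketch of a custom playable: a PlayableBehaviour that reacts when its clip starts, paired with a PlayableAsset that Timeline can place on the playables track. The class names and the logging behavior are assumptions for illustration, not the example from the blog post.

using UnityEngine;
using UnityEngine.Playables;

// Runtime logic for the clip; Timeline calls into this while the clip plays.
public class LogBehaviour : PlayableBehaviour
{
    public string message = "Clip started";

    public override void OnBehaviourPlay(Playable playable, FrameData info)
    {
        // Runs when the clip begins playing on the playables track.
        Debug.Log(message);
    }
}

// Asset that appears in the Timeline window and creates the behaviour above.
public class LogPlayableAsset : PlayableAsset
{
    public string message = "Clip started";

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<LogBehaviour>.Create(graph);
        playable.GetBehaviour().message = message;
        return playable;
    }
}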
Signal track
The signal track works very much like events in the Animation window. You can set a signal
event that accesses components on GameObjects and also custom public functions in scripts
attached to an object.
To create footstep sounds that match the character's walk, you can write a script with two public
functions, one for the left foot and one for the right.
Attach the script to the character along with an Audio Source component to play the sounds.
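A minimal sketch of such a script, assuming two audio clips assigned in the Inspector; the class, method, and field names here are illustrative:

using UnityEngine;

// Plays footstep sounds; the public methods are meant to be called by
// a Signal Receiver reacting to Signal Emitters on the signal track.
[RequireComponent(typeof(AudioSource))]
public class FootstepSounds : MonoBehaviour
{
    [SerializeField] private AudioClip leftFootClip;
    [SerializeField] private AudioClip rightFootClip;

    private AudioSource audioSource;

    void Awake()
    {
        audioSource = GetComponent<AudioSource>();
    }

    // Hook these up as reactions in the Signal Receiver component.
    public void PlayLeftFootstep()  { audioSource.PlayOneShot(leftFootClip); }
    public void PlayRightFootstep() { audioSource.PlayOneShot(rightFootClip); }
}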
Now add a signal track in the timeline. Right-click at the point of the first footstep and add
a Signal Emitter. It will have a yellow warning, as a Signal Asset does not yet exist. You can
create this in the Inspector.
Cinemachine tracks
Cinemachine creates complex cinematic camera setups in an efficient way. The Cinemachine
track allows you to control the virtual cameras used in Cinemachine to time and adjust your
shots.
Drag your main camera into the empty slot and it will create a Cinemachine Brain component
on that main camera. This is used so the main camera will conform to the virtual camera
settings.
After creating and framing a few virtual cameras, add them to this track. If one camera clip
directly follows the next, you get a straight cut between the two.
If one camera block is dragged into another you get a crossover blend so the camera will
move between the first shot and the next.
Note that Cinemachine version 2 is used in the screenshots above. For more information on
the new features in Cinemachine version 3 see this blog article.
Track group
As you add more tracks, you may find it difficult to navigate your timeline.
Track groups let you gather related tracks together and keep them well organized.
You can then minimize the track groups to create more space on the timeline, making it easier
to navigate.
Sequences
You can use the Sequences package to make large projects easier to handle when working
in Timeline. It allows you to dynamically load scenes at set points during playback, making
scenes easier to work with.
See this Unity Learn tutorial for a step by step introduction to working with the Sequences
package.
Advanced simulation
Unity's built-in 3D physics engine simulates physics events in real-time. Use it to simulate
objects reacting to gravity, as well as realistic movement for objects like hair, fur, cloth,
capes, curtains, and flags.
Rigidbody Physics
To add solid physics, add a Rigidbody component to the object, as well as a Mesh Collider
component, so it doesn’t fall through the ground.
Solid physics refers to the realistic simulation of physical interactions between objects in a virtual environment.
Properties like mass, gravity, friction, and collisions are simulated to show objects behaving in a way that mimics
the real world.
You can configure the Rigidbody component with a set amount of mass and drag to simulate
how real-world objects interact with the air as they fall. You can include or exclude certain
layers, ensuring the object only collides with other objects on set layers. You can also use the
Freeze Position and Freeze Rotation settings to get more control over the simulation.
Leave the IsKinematic property unchecked to ensure the object uses gravity. If IsKinematic is
checked then the object will not use gravity but can still process physics-based interactions
like collisions.
Colliders are physical colliders by default. This will prevent objects from passing through other
objects. However, you can check the Is Trigger option to set the collider as a trigger collider
which disables the physical aspect, allowing the object to pass through other objects, but still
return a trigger when it intersects another object. This can be useful for coding.
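For example, here is a minimal sketch of how a trigger collider is typically used from a script, assuming a collider with Is Trigger enabled and a player object tagged "Player" (both assumptions for illustration):

using UnityEngine;

// Attach to an object whose collider has Is Trigger enabled.
public class TriggerZone : MonoBehaviour
{
    // Called when another collider enters this trigger volume.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            Debug.Log("Player entered the zone");
    }
}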
You can add a Physic Material asset to a Mesh Collider to ensure the object acts in a way that
simulates a real-world object. For a bouncy rubber ball, for example, you need to add a bouncy
Physic Material into the Material field of the Collider component.
To create a new Physic Material right-click in the Project window, choose Create > Physic
Material, and adjust the values to get the desired result.
Defining a Physic Material for a wooden staff and assigning it to the Mesh Collider
Physics settings can be adjusted by going to Edit > Project Settings. From here you can
change the gravity amount. By default real-world gravity is set at -9.81, but you can alter this,
for example, to simulate a character floating in space.
Use the layer collision matrix at the bottom to define which layers will be able to interact with
each other.
Various physics colliders, joints, and forces can be added from Component > Physics.
Physics components
The constant force is useful when you want to apply force in a specific direction to a physical
object, such as to simulate wind or objects being thrown.
You can also add force to a Rigidbody component using code, for example, to add force on a
mouse click to drive a knife forward at a set speed.
Code for adding force to an object when a player left-clicks the mouse
The Rigidbody component is frozen on the Y movement axis ensuring the knife doesn’t drop to
the ground, but travels forward at the force speed set in the code.
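The original script isn't reproduced here, but a comparable sketch could look like this. The force value, input check, and class name are assumptions, and Freeze Position Y is assumed to be set on the Rigidbody in the Inspector as described above:

using UnityEngine;

// Pushes the knife forward when the player left-clicks.
// Freeze Position Y on the Rigidbody keeps it from dropping to the ground.
[RequireComponent(typeof(Rigidbody))]
public class ThrowKnife : MonoBehaviour
{
    [SerializeField] private float force = 20f;

    private Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
    }

    void Update()
    {
        // 0 is the left mouse button in the legacy Input Manager.
        if (Input.GetMouseButtonDown(0))
            rb.AddForce(transform.forward * force, ForceMode.Impulse);
    }
}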
You can use the Rigidbody component to control the movement of a character. This is useful
for creating realistic motion that reacts to physical objects in the game world.
To do this, access the Rigidbody component in code and use the MovePosition or
MoveRotation commands for it.
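As a rough sketch, assuming the legacy Input Manager axes and a simple walk-and-turn setup; the speed values and class name are illustrative:

using UnityEngine;

// Moves a character with physics so it reacts correctly to the world.
[RequireComponent(typeof(Rigidbody))]
public class RigidbodyMover : MonoBehaviour
{
    [SerializeField] private float speed = 3f;
    [SerializeField] private float turnSpeed = 120f;

    private Rigidbody rb;

    void Awake()
    {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        float forward = Input.GetAxis("Vertical");
        float turn = Input.GetAxis("Horizontal");

        // MovePosition and MoveRotation keep the movement physics-friendly.
        rb.MovePosition(rb.position + transform.forward * (forward * speed * Time.fixedDeltaTime));
        rb.MoveRotation(rb.rotation * Quaternion.Euler(0f, turn * turnSpeed * Time.fixedDeltaTime, 0f));
    }
}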
Cloth Physics
Cloth physics simulate the movement of cloth and can be applied to any mesh that has a good
level of subdivision. A mesh needs enough vertices to make the simulated material move in a
believable way, but not so many that they drive up the performance cost.
Add a Cloth component to an object and then use the settings to define the material type.
Adjust properties like Stretching Stiffness and Bending Stiffness to simulate leather, silk,
cotton, etc.
The Cloth object will intersect with the rest of the character as it simulates, so you’ll need
to place colliders to prevent the cloth from moving through the character. Cloth supports
Capsule and Sphere colliders only.
You can build up complex shapes by using multiple capsules and spheres to define an exact
shape and then add them to the array of capsule and sphere colliders in the Cloth component.
Add either a capsule or a sphere to the correct bone in the Hierarchy and position it. The
colliders stay on the bones, helping to provide realistic, correct movement for the character.
Once the collider’s in place, remove the Renderer component on it by right-clicking and
choosing Remove Component.
Once both legs have colliders, add them to the Cloth component for each object.
Ragdoll physics
Ragdoll physics simulates the flailing of a character’s legs and arms when it falls or collapses
to the ground. Unity has a quick setup system to add the physics components to a character’s
skeleton.
Select the character and go to GameObject > 3D Object > Ragdoll. This opens the setup
window. The character should be in either a T or A pose. Drag the bones into the correct slots
and click Create.
This now adds a Rigidbody component to each bone in the list and a Character Joint
component that simulates the ragdoll-style physics.
Any animations that are playing will override the ragdoll simulation, so make sure to deactivate
the Animator component to see the ragdoll physics in Play mode.
You can choose to deactivate the Animator component with a script, for example, to play
the ragdoll effect when a character gets shot. However, you will notice weird results when
going directly from an animation to ragdoll – the character physics can act in an unpredictable
manner.
To fix this issue, set the Rigidbody components on each bone to isKinematic to prevent them from
being affected by gravity until the Animator is switched off. Then set isKinematic back to false
to see the ragdoll effect play without any issues.
Switching Rigidbodies to isKinematic when switching off the Animator component; this is placed on the parent GameObject
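The original script isn't shown here, but a comparable sketch placed on the character's parent GameObject could look like this; the class name and the method used to trigger the ragdoll are assumptions:

using UnityEngine;

// Keeps the ragdoll Rigidbodies kinematic while the Animator plays,
// then releases them when the ragdoll should take over.
public class RagdollSwitcher : MonoBehaviour
{
    private Animator animator;
    private Rigidbody[] ragdollBodies;

    void Awake()
    {
        animator = GetComponentInChildren<Animator>();
        ragdollBodies = GetComponentsInChildren<Rigidbody>();

        // Start with the ragdoll disabled so the animation plays cleanly.
        foreach (var body in ragdollBodies)
            body.isKinematic = true;
    }

    // Call this, for example when the character is hit, to enable the ragdoll.
    public void ActivateRagdoll()
    {
        animator.enabled = false;
        foreach (var body in ragdollBodies)
            body.isKinematic = false;
    }
}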
This Lion demo shows real-time content created with Wētā Digital, SpeedTree, Ziva, SyncSketch, and artist tools in the Unity Editor
You can take a look at the sample scenes for the hair shader by installing the Materials
Samples folder from the Package Manager under the High Definition RP section.
You’ll find the hair scene in the Scenes folder. It features strand- and card-based hair
rendering, with the material options in the Inspector.
You can also create your own dynamic hair systems that work with gravity and forces by
following this tutorial on Unity Learn.
Learn how to add your own hair simulations to characters and objects with the Unity Learn tutorial “Get started with hair simulation.”
Alembic
The Alembic package makes it possible to use Alembic files in Unity. Use this package to
create complex animations for water and other fluids, as well as hair and cloth movement,
without the runtime cost of calculating such advanced, realistic simulations. Alembic
bakes the vertex movement into a file that's played back during Play mode.
Install the Alembic package from the Package Manager to import Alembic (.abc) files into Unity.
The Alembic importer and playback package in the Unity Package Manager
The simulations are usually made in other software, like Maya, Blender, or Marvelous Designer,
and then exported with the .abc Alembic file extension. Alembic files can be very large because
they store vertex positions for every frame – expect file sizes of a gigabyte or more, depending
on the complexity of the simulation. They're therefore best suited to animated sequences rather
than real-time gameplay.
Import the alembic file by dragging it into the Project window, and then add it to your scene.
To play back the alembic file, add the object to a Timeline sequence using an Alembic track.
Particles
Particles are used to create effects like fire, smoke, water, magic attacks, and more. You can
render thousands of particles in a scene with little impact on the frame rate, making them very
useful. Find a wide range of particle effects on the Unity Asset Store, and discover how to
craft particle effects in Unity with the e-book The definitive guide to creating advanced visual
effects in Unity.
AI characters
When you have to animate a non-playable character (NPC) or have a scene with a large crowd,
you can use Unity’s AI Navigation system, which will help you avoid spending hours animating
each character. This system allows you to control hundreds of characters in the scene and
ensure they avoid walking through buildings or objects, as well as bumping into each other.
To start with the navigation system, create a blend tree for a character controlled by a variable
that goes from -1 to 1 to transition between idle and walking.
Now you can add a NavMesh Agent component to your NPC. You can change the settings to control how
fast the character walks and rotates, as well as the radius of the agent, which determines how
close other characters can get to it.
The NavMesh Agent needs a NavMesh Surface to direct its movement. Go to GameObject > AI >
NavMesh Surface. Switch off any dynamic characters before baking the NavMesh; it will show up as
a blue surface when you have the Show NavMesh option checked. Add a target object for the NPC to
walk toward.
Baking and showing a NavMesh in the scene; the NPC can only walk in the blue areas
You will need a script to control the NavMesh Agent. This uses the velocity of the agent to tell
the animator whether it should play the walk when in motion or idle when stopped.
Drag the target into the slot of this script. You can have as many targets and characters as you
like. If possible, keep things simple by having one target per character.
The NPC will always walk toward the target and stop once it reaches it.
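A simplified sketch of such a controller script follows. The field and parameter names are assumptions, and the blend tree is assumed here to use a 0-to-1 Speed float driven by the agent's velocity; adapt the mapping if your blend tree uses a different range, such as the -1 to 1 variable described earlier:

using UnityEngine;
using UnityEngine.AI;

// Sends the NavMesh Agent toward a target and drives the blend tree
// from the agent's current velocity.
[RequireComponent(typeof(NavMeshAgent))]
[RequireComponent(typeof(Animator))]
public class NpcController : MonoBehaviour
{
    [SerializeField] private Transform target;   // drag the target object here

    private NavMeshAgent agent;
    private Animator animator;

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Keep walking toward the target; the agent stops on arrival.
        agent.SetDestination(target.position);

        // 0 when stopped, approaching 1 at full speed, to blend idle and walk.
        float speed = agent.velocity.magnitude / agent.speed;
        animator.SetFloat("Speed", speed);
    }
}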
Now the character will walk toward the target during Play mode. You can animate the moving
target to get the NPC to follow it. Notice how easy it can be to create entire crowd simulations
using AI navigation.
If you work in a team or want to refine animations in a DCC tool, there will be times when you
need to export animated characters from Unity to other software to make adjustments.
The character should have the animations set up in the Animator window. Right-click the
character and choose Export to FBX.
Once exported you can now open the file in other 3D software to make adjustments to either
the 3D mesh or to the animation files.
The embedded animations are available in the Nonlinear Animation (NLA) editor in Blender. Once
you've adjusted the file, you can export it as an FBX file and open it in Unity again.
Unity Recorder
If you are creating an animated feature in Unity, or need high-quality footage of your game,
you can export the final animation as a video file.
Capturing raw footage by running video capture software alongside the game in the Editor or a
build can tax your computer's resources, so the game may not run as it should for marketing or
video content purposes. To ensure a stable frame rate without stuttering from skipped frames,
and to achieve the best possible recording quality, Unity Recorder lets you create video or
image sequences in real-time from inside the Editor.
With your scene set up and ready to record, go to Window > General > Recorder > Recorder
Window. You can set a frame rate from a preset list or define your own. Keep the frame rate
set to Constant for a more stable video file.
To create a new recording click the Add Recorder tab and choose from the available options.
You can create a movie, an image sequence, or export audio.
Select the Movie option and customize the settings to fit your export requirements. The video
is recorded from the Game view by default, so anything displayed there will be recorded. You can
also choose to record only from a specific camera, or a 360° view for VR.
You can also set an output resolution from a preset list or define your own, and the Game view
will then conform to that resolution when recording. Be aware that if your scene has been set up
in a different aspect ratio, for example 4:3, and you select 16:9 as an output, the Game view
will change to 16:9 as it records, resulting in your video looking different from what you see
in the Editor.
In the output format you can select which encoder you prefer to use. The Unity Media Encoder
supports the H.264 MP4 and WebM codecs. The Apple ProRes encoder supports Apple
ProRes 4444 and 422 codecs. The GIF encoder is used for exporting animated GIFs.
You can export video with or without audio, and the files are saved in your project in a
folder called Recordings.
You can also export your project as an image sequence in JPG, PNG, or EXR format. This may
be useful if you want to do the final encoding in video editing software, perhaps after
color grading and adding video effects.
Press Start Recording to go into record mode, and the sequence will begin to play as the
video is recorded. You may notice that playback is capped at the target frame rate, or even runs
slower than it. This is a result of the encoding process; the recorded video, however, will
play back smoothly at the target frame rate.
If the recording mode is set to Manual, you will have to press Stop yourself. Alternatively, you
can type in the start and end frames for the recording under the Recording Mode setting.
You can also use Timeline to add a Recorder Track. Right-click the track to add a Recorder Clip,
which acts like an activation block: it records for the duration of the clip.
2D Animation
2D projects in Unity share similar tools and workflows. The equivalent in a 2D project to
an FBX 3D model is a 2D sprite. There are also 2D physics and lighting toolsets in Unity, in
addition to a 2D Sprite Editor, the 2D Tilemap system and 2D Sprite Shape. The Animator
Controller and Timeline tools also work for 2D projects.
Traditional 2D animation can be the most time-consuming and challenging part of developing
art for a 2D game. 2D skeletal animation can help you save time, create smooth and adaptable
animation, and enable you to create more animations with fewer assets.
When you select an imported PSD or PSB file, it shows options similar to the sprite import
settings, but with additional ones for 2D animation and rigging. The key options you’ll need to
select are:
— Import Hidden Layers: This option imports all layers from a PSD or PSB file, including
hidden layers.
— Individual sprites (Mosaic): This setting is only available if the Texture Type is set to
Multiple. It makes sprites from imported layers and puts them onto a texture atlas. Leave
this option enabled if you want to rig a character.
— Use as Rig: This option generates a character prefab with the same layer hierarchy and
position as the PSD or PSB file.
— Use Layer Group: This adds layer grouping from the PSD or PSB file. Turn it on to group
parts of a character, like sprite-swapped parts.
In the Layer Management section you can see the hierarchy of layers, and manually add or
remove layers to be imported.
Use the Edit Bone button to refine bones. The Split Bone button enables you to split bones in
two. This is a good option for making limbs. If you make one leg bone and click where the knee
should be, the bone will split into the thigh and calf sections.
To verify that the hierarchy is correct, select the Preview Pose button and test some poses. To
reset the position of bones, press the Reset Pose button on the toolbar.
When generating geometry automatically with the Auto Geometry tool, it's a good idea to set all sliders to zero to keep the geometry as simple as possible. Enable the
Weights option to bind bones to the sprites automatically. Clicking the Generate For All Visible
button creates and sets bone weights for all sprites. To do this for an individual sprite, double-
click the sprite. This is useful for tweaking the geometry of certain sprites.
To fully control a number of vertices and how the geometry bends, you’ll need to edit the
mesh manually with the tools in the Geometry section.
Use as few vertices as possible. Fewer vertices help you save on performance, since each
vertex position needs to be calculated based on the bones’ rotation. Fewer vertices also make
for better-looking mesh bending since it’s easier to set weights for fewer points. Additionally,
each target platform has an ideal game vertex count, so it’s always a good idea to optimize
geometry.
Clean geometry is important for optimized animations, especially in areas that experience high levels of deformation, like the articulations.
Weights
Once the geometry of your sprites is clean, it’s time to set up weights. Weights define bone
influence on each vertex from zero to one. Zero means that a bone has no influence over the
vertex, and one means that the vertex will move like it’s glued to the bone. Good weights on
the mesh can result in realistic-looking bending. Setting weights incorrectly can break the
game’s illusion and distort your sprites.
To start setting up weights, you need to define which bones will affect a certain sprite. Click
the Bone Influence button, and select a sprite. In the small pop-up window, you can set which
bones influence the selected sprite. Set only the relevant bones that need to affect the sprite,
and remove the rest.
Animating the main character in Happy Harvest with 2D IK solvers. The left image shows modifying the position of the bones and rotation. The
right image shows changing the sprite for the hand with sprite resolver
Conclusion
For artists looking to create animations in Unity, be sure to check out the full Unity
Documentation, as well as Unity Learn, the Unity Blog, and the Animation Forum.
You can find all of Unity’s in-depth developer and artist e-books in the Unity best practices
hub or the best practices section in the Unity manual.
The Unity Product Board provides an overview of current animation features being developed,
in addition to what’s coming up next. You can even add your own feature requests.