Adding Facial Expressions to Characters

FINDING AND ANIMATING BLENDSHAPES ON A SKINNED MESH

Using Unity Version 2021.3 || Tutorials may contain affiliate links if the asset is relevant and useful



We've all played games with deadpan characters. Even ones that are able to lip-sync fluidly when they speak feel robotic if there's no emotion behind what they're saying. Thus, I would argue that facial expressions are one of the most critical details you can add to a character -- and that holds true whether the character is super-realistic or wildly stylized. Why? Because, whether we know it or not, we're built to read and expect facial expressions when looking at a person. We rely on this information to determine personality, emotions, motivations, or intent, and because of that, character expressions are a powerful tool, capable of enhancing body animations to help us subtly communicate more about our character than we would be able to otherwise.


Today, it's quite common for characters to contain a variety of facial expression data. For instance, my character assets (along with many others) contain a selection of Blendshapes for facial expressions and lip syncing. But, that's only the first step. Blendshapes must be accessed and manipulated, which isn't entirely intuitive. In fact, one of the most common questions I'm asked is how to add facial expressions to characters.


What Are Blendshapes?

You probably know that the mesh for your model is made up of vertices, which have been placed at particular 3D coordinates and connected by triangles. An easy way to think about a Blendshape is that it's essentially a copy of your mesh, but the 3D coordinates for the vertices are different. Adjusting the Blendshape value, therefore, will interpolate your mesh from its current state to the Blendshape state. Fundamentally, a Blendshape allows you to morph between geometries that share the same topology.


This is most often used for facial expressions and positioning the mouth for lip-syncing, and that’s what we’ll focus on in this article. However, keep in mind they can also be used for morph customization, or correctively adjusting animations (like fine-tuning muscle shapes). In fact, I’m sure there are hundreds of creative ways to use them.


Creating Blendshapes

To utilize a Blendshape, your mesh must contain Blendshape data. This happens in your modelling software before you even import the model into Unity, which is beyond the scope of this tutorial. If you are looking for information about creating the Blendshapes themselves, I recommend the following overviews:


Basics of shape keys in Blender:


An overview of facial shape keys (it’s specifically for motion capture, but the processes are the same):



A full how-to for shape keys in Maya:


Where Are the Blendshapes?

Let’s assume we have a model containing Blendshapes. How do you find them?


Blendshapes are found on the Skinned Mesh Renderer component of the mesh that contains them. For facial expressions, then, this will be the mesh that includes the face.


Provided your mesh contains Blendshapes, the Skinned Mesh Renderer component will include a dropdown which holds all of them as individual floats. The float value is essentially the amount of that shape that is applied to the current mesh state.

Blendshapes can be accessed and adjusted in two primary ways: code or animation clips.


Using Blendshapes with Code

To access a Blendshape via coding, you will want to make use of the following:


Find the Blendshape index by name

Blendshapes are contained in an array and are therefore accessed via their index. Here is a simple script which will return the index value of the Blendshape name you specify.
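A minimal sketch of that lookup (the class and field names here are my own; assign your face mesh's Skinned Mesh Renderer to the field in the Inspector):

```csharp
using UnityEngine;

public class BlendshapeLookup : MonoBehaviour
{
    // Assign the Skinned Mesh Renderer of the mesh containing the Blendshapes.
    public SkinnedMeshRenderer faceRenderer;

    // Returns the index of the named Blendshape, or -1 if the mesh doesn't have it.
    public int GetBlendshapeIndex(string blendshapeName)
    {
        return faceRenderer.sharedMesh.GetBlendShapeIndex(blendshapeName);
    }
}
```

Checking for a -1 return is worthwhile, since a typo in the shape name will otherwise cause errors further down the line.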


Find the Blendshape name by index

Conversely, you can get the name of a Blendshape at a specified index with the following, which will return a string:
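For example (again, the class and field names are my own):

```csharp
using UnityEngine;

public class BlendshapeName : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;

    // Returns the name of the Blendshape stored at the given index.
    public string GetBlendshapeName(int index)
    {
        return faceRenderer.sharedMesh.GetBlendShapeName(index);
    }
}
```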

Get the current Blendshape value

Once you know the index for the Blendshape, you can get its current value with the following:
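Something like this (the index 2 is just an example; use the index you found above):

```csharp
using UnityEngine;

public class GetBlendshapeValue : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;

    void Start()
    {
        // Weights run from 0 (no influence) to 100 (fully applied) by default.
        float weight = faceRenderer.GetBlendShapeWeight(2); // 2 is an example index
        Debug.Log(weight);
    }
}
```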


Set the current Blendshape value

Additionally, once you know the index for the Blendshape, you can set its current value with the following:
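For instance (index 2 is again just an example):

```csharp
using UnityEngine;

public class SetBlendshapeValue : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;

    void Start()
    {
        // Fully apply the Blendshape at index 2 (weight 100 = fully applied).
        faceRenderer.SetBlendShapeWeight(2, 100f);
    }
}
```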

It's important to note that these snippets do not account for lerping/tweening, so they could be considered incomplete. As written, the code will set the new value of the Blendshape immediately, so if you prefer manipulating the values through code, you may want to take advantage of a tweening tool, such as DOTween (which is free, although the paid version is quite reasonable). Still, the ability to move shapes fluidly is why I prefer to use animation clips (see below).
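If you'd rather not pull in a tweening library, here is one way to ease a weight toward a target in plain C# (the shape name "Smile" and all class/field names are my own examples):

```csharp
using UnityEngine;

public class BlendshapeFade : MonoBehaviour
{
    public SkinnedMeshRenderer faceRenderer;
    public string blendshapeName = "Smile"; // example shape name
    public float targetWeight = 100f;       // weight to ease toward (0-100)
    public float speed = 200f;              // weight units per second

    private int index;

    void Start()
    {
        index = faceRenderer.sharedMesh.GetBlendShapeIndex(blendshapeName);
    }

    void Update()
    {
        if (index < 0) return; // shape not found on this mesh

        // Step the current weight toward the target a little each frame.
        float current = faceRenderer.GetBlendShapeWeight(index);
        float next = Mathf.MoveTowards(current, targetWeight, speed * Time.deltaTime);
        faceRenderer.SetBlendShapeWeight(index, next);
    }
}
```

Changing `targetWeight` at runtime (say, from 0 to 100) will then fade the expression in smoothly rather than snapping.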


Because Blendshapes can be accessed so readily through code, there are several third-party assets which can help manipulate them, particularly for speech.


I’ve tried both SpeechBlend and the Salsa LipSync Suite. Both work well for managing visemes associated with speaking, along with the option to add facial expression data. There is a slight learning curve in using them, but Salsa in particular provides a robust selection of tutorial videos for setup and keeps a website of resources for their users.





Using Blendshapes with Animation Clips

To adjust a Blendshape with an animation clip, you will want to do the following:


Create a new Animation Clip

You could add Blendshape animation data to an existing clip, but for the sake of efficiency and reusability, I strongly recommend keeping expression files separate. This is because, while humanoid animations can be shared, Blendshape data normally can't -- unless the index values and names for the shapes are exactly the same across all of your characters (and even then things can go awry).


Select the mesh containing the Blendshape data, open the Animation tab, and select "Create New Clip..." from the clip dropdown in the Animation window (your character must have an Animator Controller component). Save it in an appropriate place within your project (possibly in a folder called "Face Anims") and give it a name. This clip will automatically be added to the active layer on the Animator Controller, but it can be removed later if necessary.


Record the Blendshape Adjustment

While you still have the mesh containing the Blendshapes selected, click on the red record button.



When you do so, you will notice your character immediately drop into the weird "no animation data" pose. We will address this in a minute. For now, set up your view so you can see the character's face and create your animation by adjusting the Blendshape values as needed. Because recording is active, every change you make will be captured as a keyframe.


Setting up an Avatar Mask

As mentioned above, the new clip now contains the Blendshape data, but it is missing any other animation data for the character's rig. Instead of playing it on the body layer, we are going to trigger our Blendshape clips on their own masked layer, on top of the body animation.


Create a new Avatar Mask

First, let's set up an Avatar Mask that masks everything aside from the face mesh. Right-click in your project view, preferably where you have saved your face animation clips, and Create > Avatar Mask. I've called mine "Head Mask". Now, select that new Avatar Mask and in the inspector, set the entire Humanoid dropdown to red by clicking on the various body parts. Next, select the Transforms dropdown and deselect everything except the mesh containing the facial expressions.


With that ready, let's go to our Animator Controller tab. Make sure you have the character selected so the correct Animator Controller window is active.


Create a new layer in the Animator Controller with the "+" button and name it Face Layer or something similar. Now, click on the little gear for that new layer and adjust these values to work properly with the Body Layer. Here are the settings I've used for my new layer:



This is set up so the full weight of the animation clip will be shown. I've populated the Mask field with the newly created Avatar Mask. Blending is set to "Override" so my expression clip will play over any existing facial animation. Sync and IK pass are toggled off as they are not necessary for this layer.


NOTE: How you set up your transitions is up to you. I usually create triggers with the expression name so I can use them across any animation, but you could sync them with the animations if you intend to only use them with a specific clip.


Now you can delete the new animation clips from the body layer (if they were automatically added when you created them) and add them to the face layer by dragging them into the Animator Controller window. Set up your transitions however works best for you. This is how mine are set up, with a trigger for each facial expression:


Hopefully this helps get you started with facial expressions! Feel free to leave a comment if you have any other questions.


Looking for stylized character resources, which contain a robust series of Blendshapes? My character assets are available on the Unity Asset Store! Don't forget to check out the newest addition to the family, the Modular Character Series, which features a full mix-and-match system!

