PART ONE: TRACKING A TARGET
Using Unity Version 2019.4
There are a few creative hills I’m willing to die on, no matter how basic the game is. One of those hills is eye movement.
At the very least, eye movement directs player attention, but there’s much more to it than that. Our eyes also convey focus, personality and emotion; we abandon a crucial human element if we neglect that in our character design. In fact, a character staring lifelessly in one direction can hurt immersion significantly.
So for me, eye movement is a must. And fortunately, it’s quite simple to implement.
Your character has dull, lifeless eyes. You want them to look at points of interest or look around their surroundings, but they only stare blankly ahead.
This is the first in a two-part tutorial covering two different ways to tackle eye movement: 1) using eyes that are rigged to bone transforms, and 2) using an offset on the eyes' texture UVs. In part one, we'll set up our movement system and implement a function to track targets. In part two, we'll get a bit fancier.
Obviously, there are exceptions to the below criteria, but basically, you will want to use:
Eye bone transforms
If your character has separate eye meshes rigged and weighted to bones, and
The eye meshes are spherical
Texture UV Offset
If your character has eye meshes that are not rigged to bones, or
The eyes are part of the character mesh, but have their own material, or
The eye meshes are flat or non-spherical
Before we get into specifics, let’s break down how these approaches work differently.
Eye Bone Transforms
Unity’s API includes functionality to point a transform at another transform. Specifically, the Animator's SetLookAtPosition method (used together with SetLookAtWeight during the IK pass) rotates a transform's forward direction (which should be the front of the eyeball) to “look at” a target position.
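As a minimal sketch of that idea, the script below calls SetLookAtWeight and SetLookAtPosition from OnAnimatorIK, which Unity invokes on layers with IK Pass enabled. The field names (lookTarget, eyesWeight) are my own placeholders, and the weights here drive only the eyes, not the head or body:

```csharp
using UnityEngine;

// Sketch only: requires a humanoid Animator whose eye-movement layer
// has IK Pass enabled. "lookTarget" and "eyesWeight" are assumed names.
[RequireComponent(typeof(Animator))]
public class EyeLookAt : MonoBehaviour
{
    public Transform lookTarget;              // the point of interest
    [Range(0f, 1f)] public float eyesWeight = 1f;

    Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    // Unity calls this during the IK pass for each layer with IK Pass ticked.
    void OnAnimatorIK(int layerIndex)
    {
        if (lookTarget == null) return;

        // Overall weight 1; body and head weights 0 so only the eyes rotate.
        animator.SetLookAtWeight(1f, 0f, 0f, eyesWeight, 0.5f);
        animator.SetLookAtPosition(lookTarget.position);
    }
}
```

The last argument to SetLookAtWeight is the clamp weight, which limits how far the look-at can rotate away from the rest pose; 0.5 is a reasonable starting point for eyes.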
Texture UV Offset
This approach offsets the texture rather than rotating a mesh. Based on the target's position, it slides the eye texture along its X and Y axes so the eye appears to be looking at the target.
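Here is a rough sketch of that approach, assuming each eye has its own material. How target position maps to UV offset (the offsetScale and maxOffset values below) is an assumption you would tune per character, and "_MainTex" is the main texture property in Unity's built-in render pipeline; other pipelines use different property names:

```csharp
using UnityEngine;

// Sketch only: slides the eye texture so the iris appears to track a target.
// offsetScale, maxOffset and the "_MainTex" property are assumptions to tune.
public class EyeUVOffset : MonoBehaviour
{
    public Renderer eyeRenderer;   // renderer holding the eye's own material
    public Transform target;
    public float offsetScale = 0.1f;
    public float maxOffset = 0.2f;

    void LateUpdate()
    {
        if (target == null || eyeRenderer == null) return;

        // Direction to the target in the eye's local space.
        Vector3 local = transform.InverseTransformPoint(target.position).normalized;

        // Offset the texture on X and Y, clamped so the iris stays on the eye.
        Vector2 offset = new Vector2(
            Mathf.Clamp(local.x * offsetScale, -maxOffset, maxOffset),
            Mathf.Clamp(local.y * offsetScale, -maxOffset, maxOffset));

        eyeRenderer.material.SetTextureOffset("_MainTex", offset);
    }
}
```

Note that accessing eyeRenderer.material instantiates a per-object copy of the material at runtime, which is what lets each eye move independently.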
If you have the transforms, or plan to use blendshapes instead of UV offsets, there are several high-quality assets on the store that take care of eye movement for you. My favourite is Realistic Eye Movement, but there are many others. Lipsync Pro, for example, includes an eye controller in addition to its lipsync functionality. So if you want to abandon this tutorial and use one of these assets, you can pick them up from the Asset Store.
However, if you want to do it yourself, keep reading.
Animator Controller Setup
I’m going to make use of the animator for both methods in this tutorial, mainly because I like to keep everything that moves the character in one place. Also, by dedicating a layer to eye movement, we can sync it with our other character animations.
Why would we do that? Well, if your character is running forward, they’re likely just looking ahead, so you don’t need to bog down your update function with eye movement calls. But when they are near a point of interest, or in a fighting stance, you want them looking at the target. If they’re standing idle, they might just be looking at random points. And finally, if they’re in a dialogue state, their eye movements would ideally correspond to their state of mind. Thinking, shyness, anger, and so on would all be expressed through different eye movements. In part two of this tutorial, I’ll show you how to expand the scripts in this part with coroutines for some of those uses.
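To make the layer idea concrete, a small driver script can flip an animator parameter based on gameplay, and the transitions in the eye-movement layer can then switch between an idle-eyes state and a tracking state. The parameter name "HasTarget" and the noticeDistance threshold below are assumptions for illustration:

```csharp
using UnityEngine;

// Sketch only: toggles a hypothetical "HasTarget" bool that transitions
// in the eye-movement layer can read to switch between eye states.
public class EyeStateDriver : MonoBehaviour
{
    public Animator animator;
    public Transform pointOfInterest;
    public float noticeDistance = 5f;   // how close counts as "near"

    void Update()
    {
        bool hasTarget = pointOfInterest != null &&
            Vector3.Distance(transform.position, pointOfInterest.position)
                <= noticeDistance;

        // Transitions in the EyeMovement layer test this parameter.
        animator.SetBool("HasTarget", hasTarget);
    }
}
```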
Begin by creating a new layer in your animator called “EyeMovement” or whatever you like. This layer will use an avatar mask covering just the eye mesh, the head bone and the eye bones, so cancel out the entire humanoid form and assign your character’s avatar to the Transform field.
The layer should override the other layers and have the IK Pass enabled.
We'll set up states for eye movement, even though they co