EMBODYING EMOTION

(GDC 2002 course notes)
Ken Perlin

Goals

The work I presented at GDC 2002 is a framework for procedural emotion shaders. The idea is to allow an artist/programmer to add subtleties to the movements of 3D virtual characters so that they convey mood and personality, without destroying the base moves or paths of the characters as they move through a scene.

Empathy versus Agency

Traditionally, there is a dialectic between games and linear forms such as movies. Games maximize a player's agency while conveying little empathy for the main character (since the player makes all the choices). Movies maximize empathy for the main character while giving the viewer no agency. At a high level, I'm trying to help games avoid losing empathy for a character while still retaining what games are good at: giving the end user agency.

Conveying character

If you look at movies for a moment as a rough model, it's clear that a number of distinct talents are needed to make a good film: writing, directing, acting, and the technical skills of camera, lighting, set construction, costumes, etc. Without good acting, the other skills cannot result in a good movie.

Yet for games, linear animation content (the equivalent of pre-canned acting) does not suffice - characters need to change the way they act as gameplay changes the situation. For example, a character should use very different body language in the presence of a love interest than in the presence of a 30 foot fire-breathing dragon.

Two types of emotive affect need to be conveyed by a game character, if we expect the player to feel empathy toward that character: personality and mood. Personality constitutes the semi-permanent, character-defining traits of that character, such as extroversion, cheerfulness, assertiveness, and how controlling the character is. Mood constitutes moment-to-moment, situationally driven affective responses, such as boredom, nervousness, exhaustion, impatience, or fear.
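One way to picture this split is as two separate bundles of parameters, one long-lived and one transient, that together feed the movement system. The sketch below is illustrative only - the trait names come from the text, but the `posture_bias` mapping and its weights are invented for the example, not taken from the actual demo.

```python
from dataclasses import dataclass

@dataclass
class Personality:
    # Semi-permanent, character-defining traits (all in [0, 1]).
    extroversion: float = 0.5
    cheerfulness: float = 0.5
    assertiveness: float = 0.5
    controllingness: float = 0.5

@dataclass
class Mood:
    # Moment-to-moment, situationally driven responses (all in [0, 1]).
    boredom: float = 0.0
    nervousness: float = 0.0
    exhaustion: float = 0.0
    impatience: float = 0.0
    fear: float = 0.0

def posture_bias(p: Personality, m: Mood) -> float:
    """Toy mapping from affect to a single posture parameter:
    how upright the character stands, in [0, 1]."""
    up = 0.4 + 0.4 * p.assertiveness + 0.2 * p.cheerfulness
    up -= 0.3 * m.exhaustion + 0.2 * m.fear
    return max(0.0, min(1.0, up))
```

The point of the split is that personality parameters are set once per character, while mood parameters are driven continuously by the game situation.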

What is called for is a procedural approach to modifying body movements so as to convey various moods and personalities. If you'd like to read a more comprehensive discussion of the reasoning behind the use of procedural methods to create better acting in computer games, look here.
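The key property of such a procedural modification is that it layers on top of the base movement rather than replacing it. As a minimal sketch of the idea: add small, coherent, time-varying offsets to joint angles, scaled by a mood parameter. Here a sum of sines stands in for proper band-limited coherent noise; the function names and constants are my own for illustration.

```python
import math

def pseudo_noise(t: float) -> float:
    # Cheap stand-in for band-limited coherent noise, roughly in [-1, 1].
    return (math.sin(t) + math.sin(2.17 * t + 1.3) + math.sin(4.73 * t + 2.1)) / 3.0

def nervous_shader(joint_angles: dict, t: float, nervousness: float) -> dict:
    """Layer small, coherent jitter on top of the base pose.
    The base animation is preserved; the shader only nudges it."""
    out = {}
    for i, (name, angle) in enumerate(joint_angles.items()):
        # Offset each joint's noise in time so joints don't move in lockstep.
        jitter = 0.05 * nervousness * pseudo_noise(t + 10.0 * i)
        out[name] = angle + jitter
    return out
```

With `nervousness` at zero the shader is the identity, which is exactly the "without destroying the base moves" property described above.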

Stand-alone examples

You can play with the interactive demo that I showed at the course by clicking here. Note that I'm only granting you permission to play with the applet, not to copy it or to download or reuse it in any other way. As you'd expect, the specific technical details that make it work are legally protected property, which you can license from me if you'd like to use them commercially.

The rendering is all software and Java 1.0 compatible, so it will work in pretty much any browser. You should probably use a PC of at least 700 MHz.

You can follow the instructions on the left side of the demo to try things for yourself - drag the hands and feet with your mouse, and try the various sliders and buttons to modify body posture (click a menu bar twice to "lock" it open). If you click on various areas of the floor, the character will walk there.

Under the hood

The run-time loop consists of three phases:
  1. Get parameters, gaze direction and hand/feet positions;
  2. Run "shaders" to tweak joint angles;
  3. Make the body move so as to try to match those angles.
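The three phases above can be sketched as a single update function. This is a simplified stand-in, not the demo's actual code: joints are plain named angles, shaders are functions that successively tweak a target pose, and phase 3 is modeled as simple easing toward the targets.

```python
def runtime_step(angles, base_angles, shaders, params, dt, rate=5.0):
    """One pass of the run-time loop:
    1. take the input parameters (gaze, hand/feet targets, mood, ...);
    2. run the shaders to compute tweaked target joint angles;
    3. move the body so as to try to match those angles."""
    # Phase 2: each shader nudges the pose handed to it by the last one.
    targets = dict(base_angles)
    for shader in shaders:
        targets = shader(targets, params)
    # Phase 3: ease current angles toward the shader-produced targets.
    k = min(1.0, rate * dt)
    return {name: a + (targets[name] - a) * k for name, a in angles.items()}
```

Because the shaders only produce *targets*, the body never snaps; it always moves toward the tweaked pose at a bounded rate.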

I only use simple two-link inverse kinematics. My reasoning is that shoulder and pelvis positions are so psychologically and culturally informed that they should be considered content. Any automatic placement of them is bound to produce bad acting. Instead, I use procedural emotion shaders to nudge these things to where they should go. A complete description of my inverse kinematics solution is here. You can click the IK button in the demo to toggle inverse kinematics; that will show you which part is being played by the two-link IK.
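Two-link IK has the nice property that it needs no iteration: given an upper and lower limb length and an end-effector target, the joint angles follow in closed form from the law of cosines. The planar version below is a generic textbook sketch of that idea (I'm not claiming it matches the demo's particular formulation).

```python
import math

def two_link_ik(target_x, target_y, l1, l2):
    """Analytic planar two-link IK: return (shoulder_angle, elbow_angle)
    placing the end effector at the target, clamping unreachable targets
    to the nearest reachable distance."""
    d = math.hypot(target_x, target_y)
    d = max(abs(l1 - l2), min(l1 + l2, d))  # clamp to reachable annulus
    # Law of cosines gives the interior angle at the elbow.
    cos_interior = (l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_interior)))
    # Shoulder: aim at the target, then correct for the bent elbow.
    cos_inner = (l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow
```

The crucial limitation - and Perlin's point - is that this solves the limb *given* the shoulder or pelvis position; where that root goes is acting, and is left to the emotion shaders.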

In the talk, I showed the code for some specific moves, such as earthquake, hip gyration, jumping and sidling. I'm still in the process of transferring these over to these notes properly, so bear with me.

Filtering motion

Practically speaking, you want to be able to apply these motion shaders to existing animations within a system. For this reason, I have a version of things that just modifies existing animations. As I showed in the course, you can see it applied to a skeleton supplied by a game company that I've been working with. The skeleton has a different set of bones from the ones I use, but the system is robust enough to use just some joints for the filtering, and pass the others through intact. You can see that demo here. To control the external figure in the demo, click on the F2 button to select it.

In the rendering pipeline

When used as a filter, the system proceeds as follows:
  1. Animation -> Matrices
  2. Matrices + Parameters -> New Matrices
  3. New Matrices -> Renderer
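The middle stage is where the pass-through robustness described above lives: shaders rewrite only the joints they know about, and every other joint matrix flows through untouched. A minimal sketch of that stage (joint "matrices" reduced to placeholder values, names invented for illustration):

```python
def filter_matrices(matrices, parameters, shaders, known_joints):
    """Stage 2 of the filter pipeline: Matrices + Parameters -> New Matrices.
    Shaders may propose changes to any joint, but only joints the system
    knows about are modified; unknown joints pass through intact."""
    out = dict(matrices)
    for shader in shaders:
        for joint, m in shader(out, parameters).items():
            if joint in known_joints:
                out[joint] = m
    return out
```

Stage 1 just extracts these per-joint matrices from the incoming animation, and stage 3 hands the modified set straight to the renderer, so a foreign skeleton with extra bones still renders correctly.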

Why they are who they are

The characters in the demo have the personalities that you see because of a thin layer of A.I. that I hard-wired in for the demo. This A.I. basically tells the characters things like how to adjust their posture and attention when you grab their hands or feet, or what posture to assume and what to do when you click on the floor. It also tells them to try to avoid each other.
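One of those hard-wired rules - trying to avoid each other - amounts to a simple repulsion: if another character is too close, step away, more strongly the closer they are. The function below is my own toy reconstruction of such a rule, not the demo's code.

```python
def avoid_others(position, others, radius=1.0, push=0.5):
    """Toy avoidance rule: if another character is within `radius`,
    step directly away from it, scaled by how deep the intrusion is."""
    x, y = position
    for ox, oy in others:
        dx, dy = x - ox, y - oy
        d = (dx * dx + dy * dy) ** 0.5
        if 1e-9 < d < radius:
            s = push * (radius - d) / d
            x += dx * s
            y += dy * s
    return (x, y)
```

A layer of a few such rules - where to look, what posture to assume, where to step - is enough to make the characters read as intentional, which is all the demo's A.I. sets out to do.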

I tuned the parameters of the A.I. to make the characters seem somewhat endearing, as though they just got into the world and are fascinated by things like their hands and feet. That wasn't really a technological choice, but rather a content choice, since characters with this personality make for a more fun demo.