My facial animation system

 

 

Drawing on the above research into human facial anatomy and expressions, my project aim was to create a facial animation system using the tools provided in Maya. The system should produce facial expressions by mimicking the actions of the human facial muscles, and be operated through a custom user interface.

 

I would like to create a facial animation system that works by copying real muscle movements, as this is not the traditional way to animate a character's face. The more common ways to animate a face are through the use of morph targets (or blend shapes), or interpolation. I am also attempting this project for personal reasons: human anatomy interests me greatly, and this is an excellent way for me to study it further.

 

The human face is an extremely complex structure, consisting of layers of skin, bone, fat, muscle, nerves, cartilage, blood vessels and glands. To date, no facial animation system has been developed that is based on this level of detail, which leaves this area of rigging and animation open to new and innovative ideas. However, very complex systems have been developed that are based on a simplified version of the human face. PDI enhanced their facial animation software for use in the DreamWorks motion picture ‘Shrek’. With this system, the skulls of the characters are built in the computer and then covered with layers of computer-generated muscles. A program then layers the skin over them, which responds to the muscles’ movement just as human skin would. Another example of a system like this can be found at www.opart.org.

 

     A visual speech synthesizer – the Baldi system

 

The Baldi system (a computer-animated talking face) was developed at the UC Santa Cruz Perceptual Science Laboratory by Michael Cohen and associates. It is a piece of software that produces accurate, visible speech through an animated character, and it is also capable of general facial animation. The character is controlled through an interface, with the speech and facial animation driven by algorithms. This system produces very accurate results, has an easy-to-use interface and can be used as an educational tool. However, its development is very complex indeed, so it is not practical to create a system like this in Maya for a simple character animation.

 

The Baldi system

 
 

 

 


     MEDUSA

 

MEDUSA is a facial animation and modelling system created by Kolja Kähler and associates. Data for the head models is obtained by scanning real humans. A model of the human skull and major facial muscles is then matched to the mesh through transformation algorithms, and the muscles contract through the use of a further algorithm. This system produces excellent results and high-quality models. Any face mesh can be used with the system, which would save time when setting up many characters. However, processing the face data is very slow, and can lead to complex meshes that are difficult to work with.

 

     Cane-toad facial animation system

 

This is a bone-driven facial animation system created in Maya by David Clayton and Andrew Silke. It uses bones that slide in arcs to create the muscles’ movement. The bones are skinned to the character's face, and driven keys were set up to control all of the different expressions and phonemes that were needed. This system is a superb example of a simple yet effective facial animation system created in Maya: the animation can be tweaked whenever needed, and expressions can be created quickly and easily. The disadvantages, however, are that the initial set-up is much more complex than using traditional blend shapes, and it is more difficult to create skin deformations using bones.
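The core of a bone-driven setup like this is simply binding the face geometry to the facial joints. A minimal MEL sketch of that step is below; this is not the authors' actual code, and the node names (jawJnt, browJnt, faceMesh) are hypothetical:

```mel
// Hedged sketch: bind the face mesh to a set of facial bones.
// Select the joints and the geometry, then create a skinCluster.
select -r jawJnt browJnt faceMesh;
skinCluster -toSelectedBones -maximumInfluences 3;
```

Limiting the maximum influences per vertex keeps the painted weights manageable on a face rig with many closely spaced joints.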

 

 

                      

The cane-toad facial animation system

 
 

 

 


    The creation of my system

 

The aspects that my system must contain:

 

·        It must create facial expressions by mimicking the muscle movements of the human face

·        The ‘muscles’ must be arranged in an anatomically correct way

·        The expressions must be easy to create through the use of a simple user interface

·        It must be able to produce the expressions happy, sad, anger and surprise.

 

I wanted my system to be as realistic as possible, both in the modelling and in the creation of the expressions. A head model (‘Disco Dan’) was borrowed from Daniel Wood to demonstrate my system on. The model was very realistic, with quite a high-resolution mesh. I used a polygon model as opposed to one made from NURBS, as polygons are easy to work with and deform. I modelled a skull out of polygons to fit the mesh, using reference images as guides. The skull would be the basis onto which I could place the muscles that I would use to animate the face. Using my research on facial anatomy I modelled the twelve most important muscles of the face (frontalis, nasalis, levator labii superioris, zygomatic, risorius, depressor anguli oris, mentalis, orbicularis oculi, orbicularis oris, procerus, corrugator and depressor labii inferioris) using polygons. I modelled them in strips so that they could contract like real muscles.

I aimed to use these polygon muscles to deform the skin, and the only really practical way I could think of to achieve this in Maya was wrap skinning. I tested this method on a more simplified mesh and muscle structure, and using the paint set membership tool I could make each muscle affect the area of skin that I wanted. However, when I wrap-skinned the muscles to the Disco Dan mesh, it became impossible to work with: the paint set membership tool was so unresponsive, due to the high resolution of the mesh, that it was unusable. It was also very time-consuming to control the contractions of the muscles using the polygon models, as I would have needed either blend shapes or lattice deformers to make the muscles deform correctly.
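For reference, the wrap-skinning step described above can be set up from script as well as from the menus. A hedged MEL sketch, with hypothetical object names (faceMesh, muscleStrip):

```mel
// Hedged sketch: wrap-skin a polygon muscle to the face mesh.
// Select the surface to be deformed first, then add the
// influence object, and invoke Maya's CreateWrap command.
select -r faceMesh;       // geometry to be deformed
select -add muscleStrip;  // polygon muscle acting as the influence
CreateWrap;               // builds the wrap deformer
```

The set membership of the resulting deformer is what the paint set membership tool edits, which is why a very dense mesh makes that tool slow.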

 

 

 

A 3D study of a human head, complete with skull and facial muscles

 

 

The skull I created to fit inside the mesh, complete with eyes, teeth and tongue

 

 

      I decided that to resolve this problem I would change my system to use bones to represent the muscles, rather than actual models of them. This was a similar idea to the cane-toad system, but rather than having single joints moving in arcs I would use joints that scale along their length to represent the muscles contracting. I also decided to use a much lower-resolution mesh, due to the problems with the high-resolution Disco Dan model. I downloaded the model ‘Sara’ from www.ant-online.com. I used the joint tool to place joints where the facial muscles would be, using my previously modelled skull and muscles, along with my research, to help position them correctly. I also put joints in the teeth and tongue. I then skinned the entire skeleton to the face, teeth and tongue. Using Maya’s paint weights tool and the component editor I painted the weights of all the joints so that their movement was representative of their real-life counterparts. When the joints are scaled, the skin that is attached to them deforms appropriately, pulling the face into the relevant expression. To help me with this I consulted my research on facial muscle movement, and looked at pictures, videos and real-life examples of facial expression. Once all of the weights had been painted, I set up a series of driven keys. These animation controls drive each of the individual muscles in the face model, along with the jaw, the tongue, the phonemes and a set of posed expressions.
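A driven key of the kind described, where a custom attribute on the FACE_CONTROL node scales a muscle joint along its length, could be sketched in MEL as follows (FACE_CONTROL is named in the text; the attribute and joint names here are hypothetical):

```mel
// Hedged sketch: a "muscle" joint contracting by scaling along
// its length, driven by a custom attribute on the control node.
addAttr -longName "frontalis" -attributeType "double"
        -min 0 -max 10 -keyable true FACE_CONTROL;

// Key the relaxed pose at driver value 0...
setAttr "FACE_CONTROL.frontalis" 0;
setDrivenKeyframe -currentDriver "FACE_CONTROL.frontalis" "frontalisJnt.scaleX";

// ...then the fully contracted pose at driver value 10.
setAttr "FACE_CONTROL.frontalis" 10;
setAttr "frontalisJnt.scaleX" 0.8;  // shortened along the bone axis
setDrivenKeyframe -currentDriver "FACE_CONTROL.frontalis" "frontalisJnt.scaleX";
```

Each muscle, phoneme and posed expression would get its own driver attribute set up in the same way.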

 

These animation controls were available to the user through the channel box, but I thought it would be more user-friendly to have sliders controlling them instead. I wrote a simple script to create three windows containing all of the animation controls. This made the system a lot simpler and more fun to use, and blending different poses and muscle movements together can quickly create a variety of expressions.
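A minimal version of such a slider window can be built with Maya's attribute slider widgets. This is a hedged sketch, not my actual script; the attribute names on FACE_CONTROL are hypothetical:

```mel
// Hedged sketch: a window of sliders bound directly to the
// driver attributes on the FACE_CONTROL node.
if (`window -exists muscleWin`) deleteUI muscleWin;
window -title "Muscle Controls" muscleWin;
columnLayout -adjustableColumn true;
attrFieldSliderGrp -label "Frontalis" -min 0 -max 10
                   -attribute "FACE_CONTROL.frontalis";
attrFieldSliderGrp -label "Zygomatic" -min 0 -max 10
                   -attribute "FACE_CONTROL.zygomatic";
showWindow muscleWin;
```

Because each slider is bound to an attribute rather than a cached value, dragging it updates the driven keys, and hence the face, interactively.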

 

 

 

The joints of my facial animation system. Click on the ‘FACE_CONTROL’ node to access

the animation controls in the channel box

 

 

The different facial expressions can be controlled through these sliders

 

 

How successful is my animation system, and what are its limitations?

 

I think that my system is quite successful, as it has achieved all of the goals that were required of it. The joints used as muscles are arranged in an anatomically correct way within the face, and their movement mimics genuine muscle movement, sliding under the skin by being scaled along their length. I also created a simple user interface, which uses sliders to control all of the muscles. The expressions happy, sad, anger and surprise have already been set up in the system, along with some other frequently used expressions and phonemes.

 

      Although my system has achieved its goals, it still has its limitations:

 

·        The number of expressions that can be created is limited by the number of muscles (joints) used. My system utilises only the twelve major facial muscles. Incorporating more muscles into the system would increase the number of expressions that can be created, as well as increasing the realism.

·        Increasing the number of joints used to make up each muscle could further refine the movement of the muscles.

·        Using joints makes creating skin deformations, such as wrinkles, much more difficult, and the skin of my character does not deform in such a way. Blend shapes could perhaps be used alongside this system to create these skin effects.

·        I could have used my polygon muscles as influence objects to further improve the way the skin deforms.

·        My system is not portable to other models; the joints were made to fit this particular model. The teeth, tongue and eyes are generic, however, and could be used with any model. The system could be set up so that the joints would fit any model (allowances may have to be made between male and female models, as the shape of the head varies). A script could then be written to copy across the skin weights, saving having to re-skin every model. This would mean that all of your characters could be set up quickly using this system.

·        I could add buttons to my user interface so that keys can be set on expressions from the interface window.
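The weight-copying idea mentioned above maps onto an existing Maya command, so such a script could be quite short. A hedged sketch, with hypothetical mesh names, assuming both meshes are already bound to their skeletons:

```mel
// Hedged sketch: transfer painted skin weights from the original
// character's mesh to a newly fitted character's mesh.
select -r srcMesh dstMesh;   // source first, destination second
copySkinWeights -noMirror
                -surfaceAssociation "closestPoint"
                -influenceAssociation "closestJoint";
```

Matching influences by closest joint relies on the new model's joints being placed in roughly the same anatomical positions, which is exactly what fitting the joints to each head would ensure.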

 

 

 

A number of facial poses created using my facial animation system

 

 

 
