Sunday, April 11, 2010

MOCAP: AN INTERVIEW WITH ELOI CHAMPAGNE

Eloi CHAMPAGNE is a motion designer, a 2D and 3D animator and the President of STUDIOCRONOS, located in Montreal, Canada.

*

Q: How exactly does human motion get translated from the built-in wire channels and data points on the actors' black spandex suits into digital motion, and what role do animators have in the process? What do you start with on your computer screen when you begin a MOCAP scene: a stick figure, a CG model?

EC: I start with a file full of numbers, actually. MOCAP is not a visual process: it tracks data points on the actors' suits during the performance, and these data indicate the positions of specific points in 3D space along the X, Y and Z axes.
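
To make that "file full of numbers" concrete, here is a minimal sketch of what such data amounts to. The comma-separated layout and file name are invented for illustration (real capture systems export formats such as C3D or BVH), but the idea is the same: marker names and X, Y, Z positions sampled at every frame.

```python
# A toy illustration of raw MOCAP data: nothing visual, just marker names
# and X, Y, Z positions for every frame. The CSV layout and file name are
# hypothetical, made up for this example.

import csv

def load_markers(path):
    """Return {frame: {marker: (x, y, z)}} from a simplified CSV export."""
    frames = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            frame, marker = int(row[0]), row[1]
            x, y, z = (float(v) for v in row[2:5])
            frames.setdefault(frame, {})[marker] = (x, y, z)
    return frames

if __name__ == "__main__":
    data = load_markers("take_01.csv")    # hypothetical file name
    print(data[0]["left_wrist"])          # e.g. (12.4, 98.7, -3.1)
```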

Q: Do you import this file into a software programme?

EC: First I analyze and edit the file of MOCAP data in software like MotionBuilder, and when I'm happy with it, I import it into another computer programme, Softimage, which interprets the MOCAP data and builds a skeleton.
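
As a rough, hedged sketch of that "build a skeleton" step: each joint can be estimated from the markers placed around it, and joints are chained into a parent/child hierarchy. The marker and joint names below are made up; real solvers in MotionBuilder or Softimage also handle filtering, gap-filling, bone-length constraints and retargeting onto an existing rig.

```python
# Estimate joint positions from surrounding markers (one frame of data),
# then describe how the joints chain together. Purely illustrative names.

def midpoint(a, b):
    return tuple((p + q) / 2.0 for p, q in zip(a, b))

def solve_skeleton(markers):
    """markers: {name: (x, y, z)} for one frame -> {joint: position}."""
    return {
        "hips":    midpoint(markers["l_hip"], markers["r_hip"]),
        "chest":   midpoint(markers["l_shoulder"], markers["r_shoulder"]),
        "l_elbow": markers["l_elbow_out"],
        "l_wrist": midpoint(markers["l_wrist_in"], markers["l_wrist_out"]),
    }

# Parent/child relationships that turn the joints into a skeleton.
hierarchy = {"chest": "hips", "l_elbow": "chest", "l_wrist": "l_elbow"}
```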

Q: Can you put MOCAP on the screen without CGI animation?

EC: It is possible for MOCAP to go directly to the screen, with different degrees of success. There are 3 ways this can be done: 

1. The performance is recorded, the file is edited by a MOCAP specialist, then the cleaned-up file is imported to the animation software and applied to the previously created character. The resulting movement, a little stiff and wooden, could be put directly on the screen without going through an animator.  It's a workflow that we will see more and more in the gaming industry and possibly on some TV shows. 

2. A character, especially designed for real-time animation, can be moved directly by a performer in a MOCAP suit. I saw this done a few years back for a children’s TV show. The new generation of GPUs (Graphics Processing Units), combined with the computing power that we now have, makes it possible to render impressive characters in real time with fairly realistic lighting, texturing and physics simulation, although the action is still a bit stiff.

3. This is a kind of mix between the two preceding options. The actor’s performance is linked directly to the 3D animation software, which records the action as KEYFRAMES. I remember using this technique over 10 years ago in Softimage! The idea was to animate a saltshaker pouring salt on fries. Should’ve been easy, but it wasn’t. Solution: linking the directional channels X, Y, Z to the mouse, with a different percentage for each mouse movement. All I needed to do was to shake the mouse back and forth, as I would have done with the saltshaker, and voilà, basic MOCAP ready to render! (See the sketch below.)
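
Here is a hedged sketch of that saltshaker trick: sample the mouse, give each of the X, Y, Z channels its own percentage of the movement, and store every sample as a keyframe. The function names are illustrative stand-ins, not Softimage's API.

```python
# Drive an object's X, Y, Z channels from the mouse, with a different weight
# per channel, and record every sampled position as a keyframe.

import time

def get_mouse_delta():
    """Stub: in a real tool this would poll the mouse each sample."""
    return (0.0, 0.0)  # (dx, dy)

def record_keyframes(duration=2.0, fps=30, weights=(1.0, 0.3, 0.1)):
    """Sample the mouse and turn each sample into an (X, Y, Z) keyframe."""
    keys = []
    for frame in range(int(duration * fps)):
        dx, dy = get_mouse_delta()
        # Each channel gets its own percentage of the mouse movement.
        x = dx * weights[0]
        y = dy * weights[1]
        z = dx * weights[2]
        keys.append((frame, (x, y, z)))
        time.sleep(1.0 / fps)
    return keys
```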

Q: What do CG animators do to the MOCAP files?

EC: The work of animators often involves animating extra limbs or elements like a tail, hair, strange ears, clothing etc. and any secondary motion, like the bouncing of flesh, that’s not already controlled by the rig or the physics simulation of the animation software. 
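
As an illustration of what "secondary motion" means in practice, here is a tiny spring-damper follower of the kind an animator or the software's simulation might layer over the captured motion: an appendage (a tail, a belly) lags behind, overshoots and settles after the body. The values and names are made up for the example.

```python
# A simple spring-damper follower for one animation channel.

def follow(targets, stiffness=0.15, damping=0.8):
    """Given per-frame target positions, return the lagging follower positions."""
    pos, vel = targets[0], 0.0
    out = []
    for t in targets:
        vel = damping * (vel + stiffness * (t - pos))
        pos += vel
        out.append(pos)
    return out

body = [0, 0, 5, 5, 5, 0, 0]   # captured channel (toy values)
tail = follow(body)            # bounces and settles after the body
```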

Q: Can you do traditional animation tricks like stretch-and-squash, overlap, slow in-and-out in MOCAP animation?  

EC: One of the reasons why motion capture was so difficult to use a few years back was that it was so hard to edit. But now, with the creation of non-linear animation systems where the animation data can be treated like clips (layered, stretched, cropped, looped or modified), it is possible to adjust the characters’ actions to obtain a cushioned slow in-and-out effect as well as a squash-and-stretch effect.
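
A minimal sketch of that "animation data as clips" idea: a clip is just a list of channel values that can be time-stretched, looped or layered over another clip. Real non-linear animation systems work on full character tracks, but the operations are conceptually the same; the numbers here are toy values.

```python
# Treat animation data as clips that can be stretched, looped and layered.

def stretch(clip, factor):
    """Resample a clip so it plays 'factor' times slower."""
    n = int(len(clip) * factor)
    return [clip[min(int(i / factor), len(clip) - 1)] for i in range(n)]

def loop(clip, times):
    return clip * times

def layer(base, overlay, weight=0.5):
    """Blend an adjustment layer over a base clip, frame by frame."""
    return [b + weight * o for b, o in zip(base, overlay)]

walk = [0.0, 0.2, 0.5, 0.9, 1.0]                      # toy channel values
slow_walk = stretch(walk, 2.0)                        # slowed retiming
bouncy = layer(walk, [0.0, 0.1, 0.0, -0.1, 0.0])      # added accent layer
```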

Q: Do you think the work on BENJAMIN BUTTON was more advanced than AVATAR? 

EC:  No. BENJAMIN BUTTON shines in the superior quality of the 3D modeling, texturing, lighting and rendering of the heads in the various stages of the character's life and in the impressive compositing of the heads with the live actor. But the motion capture, as impressive as it was, was nothing new.

Q: What were the big technological advances in AVATAR?

EC: AVATAR really pushed the envelope with the use of a huge MOCAP stage, life-sized props and the simultaneous capture of both body movement and facial expression (PERFCAP), as well as the use of virtual cameras and Autodesk's innovative MotionBuilder software, which made AVATAR possible. To fully understand the technological impact of AVATAR, read Autodesk’s white paper, “The New Art of Virtual Moviemaking”. (This is a PDF file and will download.)

Q: James Cameron claims that AVATAR is not animation, but I'm told that what we see on screen is MOCAP "cleaned up" in CGI.

EC: Even though James Cameron used the most sophisticated motion capture tools on AVATAR, he still needed CG animators to perfect and refine the action, to animate tails, hair, clothing, tools and props etc., as well as the facial expressions. So AVATAR is MOCAP "cleaned up" or processed by CG animators. But it’s not animation, because the body action and facial expressions are generated by actors, not animators. It could be compared to rotoscoping, which entails filming an actor, then tracing each frame of the film by hand to create a 2D hand-drawn character. This is not to say that Max Fleischer, who invented rotoscoping, was not a real animator, or that Disney's SNOW WHITE, which uses it, is not real animation.

Q: Many CG animators have told me that they don’t stretch and squash their work because “humans don’t stretch and squash.” But humans do stretch and squash and most CG animation and MOCAP have no weight and feel lifeless because of this.  How to fix this problem?

EC: LOL! I know that this is your pet peeve! Well, you’re right, and I think the reason it’s been this way for so long is that it was incredibly difficult to build a character rig that would permit control of stretch-and-squash. And making a good character rig is still very difficult. I find it painful! But it can be rewarding, because if the rig is well done, then animating is much more fun and bouncy and lifelike. Take a look at this example of an excellent animation rig and the squishy plasticine-simulated animation made with that rig. Hard to believe it’s CGI, isn’t it?
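
For readers wondering what a stretch-and-squash control actually computes, here is the usual volume-preserving formula in isolation: scale the character along its length and compensate on the other two axes so the volume stays roughly constant. Wiring this into a full character rig is the hard part Eloi describes; the math itself is simple.

```python
# Volume-preserving squash and stretch: sx * sy * sz stays equal to 1.

import math

def squash_stretch(stretch_y):
    """Return (sx, sy, sz) scale factors that preserve volume."""
    side = 1.0 / math.sqrt(stretch_y)
    return (side, stretch_y, side)

print(squash_stretch(1.5))   # stretched: taller and thinner
print(squash_stretch(0.7))   # squashed: shorter and wider
```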

Q:  What do you see in the future for MOCAP and animation?

EC: First, some history. Real-time MOCAP animation was the goal when Montreal-based Kaydara Inc. developed Filmbox in 1996, later renamed MotionBuilder in 2002. Kaydara was acquired by Alias in 2004, then Alias was acquired by Autodesk in 2006. And now, after over 14 years of development, Autodesk’s MotionBuilder has become a really impressive piece of software that makes MOCAP a useful tool for filmmakers. Will this replace the work of skilled animators? No doubt it will be used increasingly in the gaming industry, and it will certainly create more realistic-looking aliens and monsters in the movies, but I really don't see why Pixar, Disney or smaller animation studios would have any use for MOCAP in their animated movies anytime soon, if ever. I think there will always be a place for pure 2D and 3D animation, rotoscoping, stop-motion or any other type of animation.

Thank you very much, Eloi, for giving us a clearer understanding of how MOCAP is done and the part animators play in it. Now we know that the actor’s motions are translated into a file that looks like this, that software transforms these numbers into this and, with the help of animators, it ends up looking like this:


Here's what Eloi had to say on YouTube about how this video was made: 
    May 16, 2010 - A short demo I quickly put together (quickly being a very relative term when working in animation and 3D...) for N.L. Lumière's blog post/interview about Mocap.
I did not want to invest too much time in it since it was not part of an internal project or a paying job. The idea was to show how easy it is to work with Mocap in a small project, so I voluntarily cut more than a corner or two. For instance, I did not spend as much time as I would normally to adjust the *envelope weights,* so the 3D model deformations are far from perfect. I did not make any effort to get proper muscle movement or flexing. In addition, other than what was animated via Mocap, I only animated the "tutu" (no facial expressions etc.). Finally, the texturing, rendering and compositing were kept as basic as possible just to make things go faster.
Nevertheless, I think it turned out to be a fun little informative video for people curious about 3D and mocap.
*
