The VES recently held a motion capture event where a range of motion capture experts presented a snapshot of the state of the art in the industry. David Stripinis reports a personal perspective, from someone who works with the technology daily.
Motion capture is one of the more polarizing technologies used in the world of visual effects. Some people love it, some people fear it. Personally, as an artist who has worked with motion capture for over ten years, I see it as just another tool. Perhaps out of ignorance, experience with unskilled artists, or impossible promises made to them in the past, many producers and directors have an inaccurate view of the technology. They either believe it is a cure-all that will let them get rid of all those pesky and expensive animators, or see it as an evil shortcut that will leave them with a movie full of zombie-eyed people.
Of course, neither is entirely accurate. To help educate producers, directors and the general visual effects community, the VES, in co-operation with the Motion Capture Society, hosted an event at Sony Pictures Imageworks in Culver City, California. Entitled “Demystifying Motion Capture Techniques”, it was a panel discussion with representatives from ten different companies, covering a good cross-section of technologies, techniques and business models.
The first of the night’s two major themes was an attempt to rebrand motion capture as performance capture. Though the two terms mean the same thing, performance capture sounds less clinical, making it more appealing to artistically minded filmmakers. The second overarching idea was an emphasis on realtime production, either in the capture itself or in keeping the technology’s footprint so small that filmmakers can more or less ignore it.
Each speaker was given approximately ten minutes, and a question and answer period followed.
Eyetronics
First up was Nick Tesi from Eyetronics, talking about their inertial-based capture system. The unique benefits they touted were an unlimited capture volume and, since no optics are involved, no problems with occlusion. They also showed off a facial capture system based on structured light projection that was used on “The League of Extraordinary Gentlemen”.
Motion Analysis
Next was industry veteran Dave Blackburn from Motion Analysis. Dave used his time to talk about using motion capture not only for the exact recreation of motion, but also for more artistic applications. To this end, he showed a feminine hygiene commercial from the UK market featuring a motion-captured dancer, which left many in the audience laughing nervously.
Henson Studios
The only female panelist followed: Kerry Shea from Henson Digital Puppetry Studios, part of the Jim Henson Company. Kerry discussed the decades-long legacy of Jim Henson and their dedication to the sanctity of the puppeteer’s performance. The video she brought showed the development of the Waldo and its evolution into the Henson Digital Puppet System. It was followed by the first public exhibition of their PBS show “Sid the Science Kid”, which showed capture performers in costumes providing the proportions of the characters, with giant screens displaying the capture results in realtime. Each performer was paired with a puppeteer who performed both the facial animation, via hand controls, and the voice. Doing all of this in realtime lets them have dailies, much like a regular show.
Vicon / House of Moves
Brian Rausch from Vicon / House of Moves followed. He showed a variety of work from film, television and video games, and used his time mainly to discuss Naughty Dog’s “Uncharted: Drake’s Fortune”. He stressed that technology should never get in the way of performance. He went slightly off on a tangent talking about animation and motion capture working together and burned up the remainder of his time, to the point where panel moderator Demian Gordon kept flashing a light at him until he wrapped up. It was all in good fun, as the two are good friends.
Sony Pictures Imageworks
John Meehan from Sony Pictures Imageworks spoke about his experiences as motion capture supervisor on “I Am Legend”. John is a good friend of mine, and both his humor and intelligence were on display. He spoke of the challenges of capturing 150 moves for their Massive motion tree in two days, and how those moves were enhanced by animation, either by adding moves that couldn’t be captured or by enhancing those that were. He stressed the partnership between animation and motion capture at Sony, and on “I Am Legend” in particular.
Giant Studios
Kevin Cushing from Giant Studios filled in for an ill Ryan Champney, who had to go to the hospital an hour before the presentation began. Kevin honored all the hard work Ryan had put into the presentation. Giant’s presentation stressed their ability to do complex capture, including retargeting, all in realtime. They showed footage from The Incredible Hulk, as well as Jon Favreau in a mocap suit for Iron Man. They also showed their ability to capture live on set, even during principal photography. The video they showed was actually filmed on our stages on Avatar, and I was in the background of some of the HD shots. Ryan is fine now, by the way.
ILM
Mike Sanders from Industrial Light & Magic spoke about both their proprietary iMocap system and the motion capture stage at their Presidio facility. One of the most interesting things about ILM’s methodology was that they never use motion capture as a 100% solution, simply as a starting point for animation. Their iMocap system, which is rooted in matchmoving, is very low impact on-set, completely relying on a post solution. Mike also mention that they keep the motion capture stage at ILM live all the time, so if an animator wants motion capture reference for a shot, they go down, put on a suit and perform it. Because they rely on a completely automated tracking system, not needing the fidelity of hand trackers for a 100% solve, the captured motion is usually waiting for the animator by the time they are back at their desks.
ICT
Research legend Paul Debevec wowed the crowd with his latest work on facial capture. Furthering the research that resulted in his Light Stage technology, he showed an amazing method for extracting complex deformations of the face based on image analysis of specular highlights as light is cast from multiple angles. A calibration pass has to be performed first, but the results were quite impressive. With only ten minutes, he obviously raced through the material (see fxguidetv for a more in-depth look at this work).
Image Metrics
Wrapping up the evening was Patrick Davenport from Image Metrics. Image Metrics’ technology is dedicated to markerless analysis of facial performances and generating data from them. They showed some impressive duplication of an actress brought in specifically for data acquisition. Even more impressive was an analysis and digital replacement of Marilyn Monroe, done in partnership with Double Negative. The results were seamless, producing an audible gasp from the audience.
Gordon opened the session to questions, starting it off with a simple “What has been your favorite project?” I’m proud to say Kevin Cushing named us on “Avatar” as his, garnering more than a few laughs from the producer- and director-heavy crowd when he mentioned the difficulty in pleasing the director. Very quickly, the discussion digressed into a conversation about the rights of actors over the data created from their performances. It’s an interesting philosophical and legal debate, and one that is ongoing. While the panelists did their best to answer the questions raised, there wasn’t enough time allotted to fully cover the topic.
All in all, it was an amazing evening that gathered many of the best mocap artists in LA and hopefully proved educational for the DGA and PGA members in attendance. With luck, more discussions and presentations in the same vein will continue to educate those in a position to make decisions that motion capture is neither an instant solution to a problem nor something evil to be afraid of.