IKinema used on Wrath of the Titans
Wednesday, 04 April 2012 19:00
Framestore Brings Wrath of the Titans an Eye for Detail
Press release from Framestore:
Framestore created some 300 VFX shots for two elaborate sequences in Wrath of the Titans, the Warner Bros. sequel to 2010’s Clash of the Titans. The film was produced by Basil Iwanyk and Polly Johnson, and directed by Jonathan Liebesman, with Nick Davis returning as production Visual Effects Supervisor.
Leading the Framestore team was Visual Effects Supervisor Jonathan Fawkner. “We were chiefly tasked with two sequences,” he says. “The first involved an encounter between Perseus’s team and a trio of Cyclops; the second (which follows the first closely within the film’s chronology) concerned Perseus’s group’s assault on The Labyrinth, a towering maze which provides access to Tartarus, wherein Zeus is imprisoned. So we had two very contrasting types of visual effects to deliver: photo-realistic near-humans, albeit giant ones, interacting with environments and human actors; and an impossibly vast, constantly moving architectural environment.”
“For the three Cyclops, we decided that performance capture was the only route from the start, but that we’d play it a little differently from usual,” explains Fawkner. “An initial session was recorded before the shoot to explore the behaviour of the Cyclops and to inform the cast and crew. The plates for the sequence were then shot on forest land in Dorking, England in April 2011, without mo-cap referenced directly in camera, relying instead on the tried-and-tested ‘tennis ball on a stick’ technique, which gave the director and cast the most flexibility.”
“By the time we came to the actual performance capture sessions at Shepperton Studios, we had a sequence in the can, cut and camera-tracked. We had an accurate scan of the set, and by carefully laying out proxy trees and other obstacles, and by matching the topology with a movable sloping deck, we were able to composite the Cyclops into the plate live, providing an incredibly intuitive and comprehensible tool.”
Former international rugby player Martin Bayfield was cast for the mo-cap shoot, not least because, at 6ft 10in tall, he had something of an edge when it came to playing giants. Involved from an early stage was Animation Supervisor Paul Chung, who says, “Since Martin Bayfield’s performances would be used for all three – very different – Cyclops characters, I did a lot of research into how each of them might move and perform. I wrote some biographical notes on each family member to give Martin material to inform his performances. The hot-headed, muscular younger brother; his fatter older sibling, who always has to rescue him from the scrapes he gets into; and their dad, of whom both are a bit scared – that sort of thing. Bayfield took all this on board and gave excellent physical performances.”
Bayfield would study the plate, the timing and the rhythm and, with Fawkner directing, attempt to hit his marks in each shot. The team would try as many variations as time allowed, and the various takes were delivered to the client to pore over and make selects from within hours of the capture, ready for editing in the normal fashion. Says Fawkner: “The massive benefit of doing the capture this way was that the animation was effectively blocked and locked very early on, leaving more time to finesse the details.”
Motion capture has a bit of a reputation – not unearned – for creating a slightly unnatural ‘mo-cap look’. Many different elements contribute to that, from the markers the actor wears and where they are placed, to how the data is solved, to how it is applied to the character. The data passes through perhaps a dozen stages, and detail can be lost at any one of them. So Framestore developed a new pipeline. They use several witness cameras to complement the mo-cap input, increasing its accuracy. With Wrath, they also became the first company in the world to forge a partnership with IKinema, a company that makes software for taking mo-cap and transferring it to a creature of a different scale. Framestore also built tools based on IKinema software to help quality-control the solving process, neatly comparing the solve with witness-camera footage. Not only does the solver prove incredibly accurate, but it also gives a newfound flexibility that streamlines the entire mo-cap solving and retargeting pipeline.
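The core retargeting problem described here – driving a giant with a human-scale performance – can be sketched in miniature. This is a hypothetical toy illustration, not IKinema's actual solver (which is a full-body IK system): it simply copies joint rotations across unchanged and scales the root translation by the ratio of the characters' hip heights, so the larger character covers proportionally more ground per step.

```python
# Toy mo-cap retargeting sketch (hypothetical; IKinema's real solver is a
# full-body IK system, not shown here). Joint rotations transfer directly
# between similarly proportioned skeletons; only the root translation
# needs rescaling to match the target character's size.

def retarget_frame(frame, actor_hip_height, target_hip_height):
    """Retarget one animation frame to a character of a different scale.

    frame: dict with 'root_pos' (an (x, y, z) tuple in metres) and
           'rotations' (per-joint rotation data, copied unchanged).
    """
    scale = target_hip_height / actor_hip_height
    x, y, z = frame["root_pos"]
    return {
        # Scale the hips' world-space path so stride length matches the giant.
        "root_pos": (x * scale, y * scale, z * scale),
        # Joint orientations carry over as-is.
        "rotations": dict(frame["rotations"]),
    }

# Example: an actor with a 1.04 m hip height driving a giant whose hips
# sit at 4.16 m (illustrative numbers only).
frame = {"root_pos": (0.5, 1.04, 2.0), "rotations": {"knee_l": (0.0, 0.0, 30.0)}}
out = retarget_frame(frame, actor_hip_height=1.04, target_hip_height=4.16)
print(out["root_pos"])  # (2.0, 4.16, 8.0) – the path is scaled 4x
```

In a production pipeline the scaling would be applied per-limb against a full skeleton definition, and an IK pass would then re-plant the feet to stop them sliding – which is exactly the kind of detail a dedicated solver handles.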
Nicolas Scapel, Head of Rigging, takes up the story. “So we got the action perfectly from Martin Bayfield the actor to Martin Bayfield the digital model. The key to great mo-cap is how you hand it over to animation. Many studios have a mo-cap department with a large motion-editing team – who cannot do technical animation – and they fiddle with the mo-cap to make it work in the shots; then it goes to Animation, who are often less than thrilled with what they get. We want to give the animators something as close as possible to the raw performance and let them work it up from there.”