Inside Industrial Light and Magic

By Darren Waters

Technology editor, BBC News website

ILM technology turned an on-set Bill Nighy into Davy Jones

Visual effects wizards Industrial Light and Magic (ILM) have helped redefine the film industry over the last 32 years. The BBC News website has been given unprecedented access to the talented staff and ground-breaking technology at the firm. Over the next three days we will be looking at the work of the firm and of parent company Lucas Film itself.

The Letterman Digital Arts Center nestles on one of the hills of San Francisco in the historic Presidio district of the city.

It is home to ILM, parts of Lucas Film, video game developers Lucas Arts and other arms of the entertainment multimedia empire built by Star Wars creator George Lucas.

Artists, animators, technicians and staff are housed in a state-of-the-art digital complex covering 850,000 sq ft on 23 acres of land, which overlooks the Golden Gate Bridge.

"That's actually a matte painting and we can change it," says Miles Perkins, director of marketing and communications at Lucas Film, who leads my tour.

"Sometimes it's the Eiffel Tower," he adds.

Optical illusion

He is, of course, joking, but it would not be beyond the powers of ILM to produce such an optical illusion, even on that scale.

We tend to anticipate what the future technologies are three to five years out.

  • Michael Sanders, Digital Supervisor

ILM has worked on more than 250 films since it was set up by Lucas to provide special effects for Star Wars, released 30 years ago next month.

"In each film you don't have a director who comes back to you and says 'Can you give me exactly what you gave someone else a year ago?'," explains Mr Perkins.

"They are always asking for more, for something no-one has ever seen before. If they wanted one wave a year ago, now they want two waves, or an entire ocean."

Oscar win

After a long hiatus, the firm recently picked up its 15th Oscar for visual effects, for the work done on Pirates of the Caribbean: Dead Man's Chest.

Visual effects have come a long way since the days of clay models and stop-frame animation, and recently ILM has been blurring the lines between live performance and digital effects.

Much of the ground-breaking work has been done by the motion capture team led by digital supervisor Michael Sanders.

Deep inside the center at the motion capture sound stage he is helping the firm seamlessly meld live action and computer generated material.

Creature integration

Michael Sanders in the ILM motion capture studio

"Every year at ILM we are able to outdo ourselves with the technology of motion capture and creature integration," he explains.

"A few years ago it was a post-production process; shooting it on our stage and then integrating it into the film.

"Over time we have realised that making it real time and making it integrated with live action gives more control to the key creatives."

On the most recent Pirates of the Caribbean film director Gore Verbinski had a specific request.

"He asked us if we could invent a technique which allowed us to film Bill Nighy on the Pirate's ship, in the harbour, during a storm or in any conditions without any calibration equipment and a bunch of intrusions on set.

Spectacular CG creations

"We said 'Sure. It's about time somebody asked for it because we wanted to build that kind of technology'."

Golden Gate Bridge

Matte painting or stunning view? ILM is based in San Francisco

Nighy's character Davy Jones is one of the most spectacular CG creations of recent years - dripping in barnacles and tentacles the character is one of the most memorable in the film.

"That's where the technology is going - invisible tools to provide for the director so they don't have to change their paradigm of shooting, whether it's a virtual element or a live element. It shouldn't matter with a low footprint and low impact on set.

"So we had to invent a technology where we could shoot the actors live in the set without any calibrated equipment; basically extracting their performance from the film camera."

'Computer pyjamas'

The solution was to have Nighy wear a set of head sensors on the film set - which the actor called his "funky computer pyjamas" - and which did not interfere with his performance nor the smooth filming of the production.

"As long as he is wearing this while performing, we can extract his 3D skeleton performance that matches his exact motion through the shot.

"We dump that into the digital character, add the claw simulator, tentacle simulator, facial animation, etc - this is where the animators and artists come into play.

"They are extracting Bill's performance and then embellishing. But it's Bill Nighy's eyes, antics; his performance."

Without this technology, separate filming would have had to be done on a motion capture stage, with elements then "filled in" during post-production.
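The pipeline Sanders describes - extract a skeleton performance from the actor, then drive a digital character with it, embellishing as needed - can be sketched in miniature. Everything below (the joint names, the rotation data, the per-joint exaggeration factors) is a hypothetical illustration of the idea of retargeting, not ILM's actual software.

```python
# Minimal sketch of performance retargeting: captured per-frame joint
# rotations from an actor drive the matching joints of a digital character.
# All names and numbers are invented for illustration; real pipelines solve
# full skeletons from camera images.

# One frame of captured motion: joint name -> rotation in degrees (x, y, z)
captured_frame = {
    "head":      (5.0, -2.0, 0.5),
    "jaw":       (12.0, 0.0, 0.0),
    "left_brow": (0.0, 0.0, 3.0),
}

# The creature rig maps the actor's joints onto its own, possibly with a
# scale factor (e.g. exaggerating the jaw on a tentacled face).
rig_mapping = {
    "head":      ("creature_head", 1.0),
    "jaw":       ("creature_jaw",  1.5),   # embellish the performance
    "left_brow": ("creature_brow", 1.0),
}

def retarget(frame, mapping):
    """Copy each captured joint rotation onto the creature's rig,
    applying any per-joint exaggeration factor."""
    pose = {}
    for joint, rotation in frame.items():
        target, scale = mapping[joint]
        pose[target] = tuple(axis * scale for axis in rotation)
    return pose

creature_pose = retarget(captured_frame, rig_mapping)
print(creature_pose["creature_jaw"])  # the actor's jaw motion, exaggerated
```

The point of the scale factor is the "extracting and then embellishing" Sanders mentions: the timing and nuance stay the actor's, while the animators shape how it reads on the creature.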

Image streams

"The future for us is to extract a high-fidelity actor's performance, from one or multiple image streams, plug it into the creature and put it right back in the shot for the director.

"So rather than composing for the actor in the set, they'll be composing for the creature in the set in real time, under any lighting condition, under any shooting condition."

The real true nuance and performance comes from the impromptu talent of an actor - all we are doing is replicating it in CG

  • Michael Sanders, Digital Supervisor

Sanders says his team is tasked with always thinking beyond the current needs of today's directors.

"We tend to anticipate what the future technologies are three to five years out. We sit on a concept until it's asked for, and then with a short development cycle we can do pretty much anything."

So far, says Sanders, a director has not tasked them with an effect or technique that they have not been able to do.

At the Letterman Center the team has a dedicated sound stage, with 40 state-of-the-art Vicon cameras which emit infrared light that bounces off markers on a performer's suit.

'Active space'

"In the computer we see the markers and they are associated to a skeleton and that's how we derive a motion from a performance.

"We can shoot a dozen performers in here and they can go anywhere in the room. It's all active space.

"We can also do real-time motion capture; virtual cinematography - you have a virtual space and virtual character in real-time.

"We use a camera with a tracking device as a digital virtual camera. As you move the camera through the space you are actually observing on the monitor a 3D perspective as if you were playing a video game.

"If there's an actor here you can also see your virtual creatures in real time in the 3D space."
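The "digital virtual camera" Sanders describes - a tracked real camera whose position drives a rendered 3D view on a monitor - comes down to projecting virtual geometry through the tracked camera pose, much as a game engine does. A toy version, with invented coordinates and an assumed focal length (the real system, of course, also tracks camera rotation and renders full scenes):

```python
# Toy virtual-camera sketch: a tracked camera pose (here just a position,
# looking along +z with no rotation, for simplicity) pinhole-projects the
# points of a virtual creature onto the monitor, video-game style.
# All coordinates and the focal length are invented for illustration.

FOCAL_LENGTH = 800.0  # in pixels; an assumed lens setting

def project(point, camera_position):
    """Project a 3D point into 2D screen space for a camera at
    camera_position looking down the +z axis."""
    x = point[0] - camera_position[0]
    y = point[1] - camera_position[1]
    z = point[2] - camera_position[2]
    if z <= 0:
        return None  # behind the camera, not visible
    return (FOCAL_LENGTH * x / z, FOCAL_LENGTH * y / z)

# A virtual creature reduced to a couple of points, two metres ahead
creature_points = [(0.0, 1.0, 2.0), (0.5, 1.0, 2.0)]
camera = (0.0, 0.0, 0.0)

screen = [project(p, camera) for p in creature_points]
# Moving the tracked camera changes every projected point, so the
# operator sees the creature shift in 3D perspective as they walk
# the real camera through the space.
print(screen)
```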

For the future Sanders and his team are working on removing all barriers between live performance and digital creations - but actors will never be replaced, he says.

"The real true nuance and performance comes from the impromptu talent of an actor - all we are doing is replicating it in CG.

"We're not going to be replacing actors. They are the ones with the talent. We are just capturing and/or embellishing their performance."