Behind the Magic is a wiki page that collects interviews, facts, and production notes about the development of the characters, the places, and the film itself, Rise of the Guardians.

Characters and Effects

As with most animated film production pipelines, everything after story and layout, except animation, falls within visual effects: character effects, simulation, modeling, rigging, surfacing, lighting, and rendering. For this show, the visual effects crew created unique skin textures for all the superheroes, feathers for the Tooth Fairy, Sandman’s sand, Jack’s frost, and Pitch’s shadowy nightmares. The team rigged North’s reindeer and controlled his elves, and dipped Bunny’s eggs in a painterly river.

“I’ve worked on a lot of live-action blockbusters,” says visual effects supervisor David Prescott, who joined DreamWorks Animation after 10 years as an effects and CG supervisor at Digital Domain on such films as Transformers and The Day After Tomorrow. “Every time I looked at animated features, it seemed like the effects were an afterthought. On this show, with Sandy, effects go further than helping tell the story. They are part of the story and part of the characters’ personalities.”

Head of Character Animation Gabe Hordos and Animator Alexis Wanneroy spent about a year developing the characters’ personalities before moving them into the pipeline. Hordos had been supervising animator for the character Toothless in How to Train Your Dragon; Wanneroy had animated on the Hiccup team and was a character lead for Fishlegs in that film. Rather than testing the characters on shots from the movie, Hordos and Wanneroy developed characters that would work no matter how the story twisted and turned.

“On other films, you typically get to touch the character after the rigs are done, usually quite late in the process,” Hordos says. “You do one or two shots, but you haven’t developed the character. You just show that the rig works. And then you spend time trying to nail down gorgeous animation. We would spend a whole month on one character.”

Later, Hordos would assign a supervising or lead animator for each character and give them sequences based primarily on the character most prevalent in the sequence. Dave Pate supervised Jack; Pierre Perifel, North; Philippe Le Brun, Bunny; Antony Gray, Sandman; Wanneroy, Tooth; Steven “Shaggy” Hornby, Pitch; and Bob Davies, Jamie.

“It was almost like inventing actors,” Hordos says of the development process. “Every film has such a different style, and we had three realms in this film: humor, serious dramatic acting, and action. But, at the end of the day, the theme of this film is ‘belief.’ So when I came on, I wanted to make the animation as believable as possible. So, the riggers put extra effort into making the shapes of the faces work well; the rigs maintained volume with every control. We had animators from Madagascar 3 take some funny parts and make them sing, but the film also gave them a chance to flex their acting muscles.”

New Skin

Tooth and Jack were the first characters to try on new skin developed by the lighting and shading teams for this film. “Because of my live-action background and sensibility, I wanted to build and photograph something for reference,” Prescott says. “So we went to Legacy Effects, where artists have created [practical] skin for 30 or 40 years. We showed them our artwork for the characters. We told them we wanted to see texture under Jack’s skin; we wanted him to look cold, but not dead.”

In all, Legacy provided 19 different types of silicone skin that the crew at DreamWorks photographed in different environments. Then, DreamWorks reverse-engineered the techniques the Legacy artists had used to create the skin. “They showed us how they put particular colors at various depths when they put in the veins,” Prescott says. “Then [Surfacing Supervisor] Andy Harbeck rewrote our skin shader using subsurface scattering, translucency, everything everyone does, but instead of painting on the surface, he had multiple layers beneath the skin.”

For example, following Legacy’s model, Harbeck painted North’s veins under the skin. “That allows light to hit the skin and rays to disperse before hitting the veins,” Prescott says. “It creates detail on the skin that isn’t painted on the skin. And, if you rotate the light, you have a completely different tangent to the vein underneath, which is what happens to us. The skin looks slightly warmer. I don’t know if the result is better or worse, but it’s different.”
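
The shader itself isn't published, but the principle Prescott describes can be sketched: detail such as veins lives in a layer beneath the surface, and how much of it shows through depends on how far light travels through the scattering medium, which changes as the key light rotates. The Python snippet below is a toy illustration written from that description; the function name, depths, and extinction coefficient are assumptions, not DreamWorks' shader code.

```python
# Toy sketch (not DreamWorks' shader) of detail stored *beneath* the surface:
# a vein layer at some depth is attenuated by how far light travels through
# the overlying medium, so the same vein reads differently as the light moves.
import numpy as np

def shade_layered_skin(surface_albedo, vein_albedo, vein_depth_mm,
                       light_dir, normal, sigma_t=0.8):
    """Blend a sub-surface vein layer into the skin response.

    surface_albedo, vein_albedo : RGB triples in [0, 1]
    vein_depth_mm               : depth of the vein layer below the surface
    light_dir, normal           : unit vectors
    sigma_t                     : toy extinction coefficient (1/mm), assumed
    """
    n_dot_l = max(np.dot(normal, light_dir), 1e-3)
    # Path length through the medium grows as the light grazes the surface.
    path_length = vein_depth_mm / n_dot_l
    transmission = np.exp(-sigma_t * path_length)
    # The deeper layer shows through only as much as the medium transmits.
    albedo = (1.0 - transmission) * np.asarray(surface_albedo) \
             + transmission * np.asarray(vein_albedo)
    return n_dot_l * albedo  # simple diffuse term, enough for the sketch

if __name__ == "__main__":
    skin = (0.85, 0.62, 0.55)
    vein = (0.45, 0.50, 0.70)
    normal = np.array([0.0, 0.0, 1.0])
    for elevation in (80.0, 45.0, 15.0):    # key light rotating toward grazing
        rad = np.radians(elevation)
        light = np.array([np.cos(rad), 0.0, np.sin(rad)])
        print(elevation, shade_layered_skin(skin, vein, 0.6, light, normal))
```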

The goal was not to imitate human skin. In fact, the characters don’t have skin pores. “We didn’t want to go into the uncanny valley,” Prescott says. “We wanted them to look like models or a maquette you could buy.”

Night Light

To light the nightmares, the team used only key lights. “We never lit them with the color of the environment,” Prescott says. “If we did, they lost the oily, iridescent, creepy feeling. So, we’d use a key light to get direction, but the color remained in their character. The hardest challenge was the aesthetic. You can make a nightmare look scary, but we wanted it to look beautiful at the same time.”

Pitch got a little color from the environment, but for the most part remained black and white and gray. “It made him feel creepier,” Prescott says. “As if he were from somewhere else.”

Throughout the film, Prescott aimed for a photographic look through the lighting. “I’ve felt that CG movies often seem a little gray,” he says, “as if they are afraid to bloom out to pure white or go to black. We put a full-blown color workflow in place to deal with that range of color and brightness. We wanted the [lighting and shading artists] to be more aggressive and graphic in lighting but still have a photochemical feel in the colors. We also used lens effects. In terms of technology, the biggest development in the lighting area was the modifications to our skin shader. The rest was an evolution of global illumination. And, we relied more heavily on [The Foundry’s] Nuke.”
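
The article doesn't detail that color workflow, but the complaint about CG films being afraid to “bloom out to pure white or go to black” is, in practice, an argument for a tone curve with a real shoulder and toe. The sketch below uses a textbook extended Reinhard operator purely to illustrate the idea; it is not the studio's pipeline, and the white point is an arbitrary assumption.

```python
# Generic filmic-style tone curve, shown only to illustrate the kind of range
# Prescott alludes to: scene-referred values roll smoothly toward pure white
# instead of flattening into gray. Textbook extended Reinhard, not the
# studio's actual color workflow; the white point of 16.0 is an assumption.

def extended_reinhard(luminance, white_point=16.0):
    """Map scene-referred luminance >= 0 so white_point lands exactly on 1.0."""
    return luminance * (1.0 + luminance / (white_point ** 2)) / (1.0 + luminance)

if __name__ == "__main__":
    for lum in (0.0, 0.18, 1.0, 4.0, 16.0):   # black, mid-gray, hot highlights
        print(f"{lum:6.2f} -> {extended_reinhard(lum):.3f}")
```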

The lighting artists also affected the film’s aesthetic in an unusual way: by controlling eye welling. “You know, when the bottom of the eye gets a little fluid before you cry,” Prescott explains. “I felt that emotional response was important. I don’t think anyone will say Jack has eye welling, but it’s such an emotional key, almost body language. We don’t use it a lot, but it’s very effective.”

Prescott and Hordos evaluated the idea together as the system evolved, but lighting artists rather than animators would control the effect. “Rigging did an amazing job creating a rig that could scale the amount of fluid,” Prescott says. “It was so different for rigging and lighting. Instead of rigging for character animators, the riggers built controls for eye welling in the shaders. And, instead of just putting lights in the sets, the lighting artists created this emotion.”
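
Neither quote spells out how the control was exposed, only that rigging built a scalable dial into the shaders for lighting artists rather than animators. The sketch below is a guess at that kind of interface; the class, parameter names, and blend math are illustrative assumptions, not the production rig.

```python
# Sketch of eye welling as a single dial exposed to lighting: one scalar
# drives both how high the fluid sits on the lower lid and how strongly it
# catches the key light. Names and math are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class EyeWellingControl:
    amount: float = 0.0          # 0 = dry eye, 1 = brimming

    def meniscus_height(self, max_height: float = 0.35) -> float:
        """Fluid height along the lower lid, in arbitrary rig units."""
        return max(0.0, min(self.amount, 1.0)) * max_height

    def specular_gain(self, base_gain: float = 1.0) -> float:
        """Extra highlight strength so the fluid reads under a key light."""
        return base_gain * (1.0 + 2.0 * max(0.0, min(self.amount, 1.0)))

if __name__ == "__main__":
    for a in (0.0, 0.25, 0.8):
        ctl = EyeWellingControl(amount=a)
        print(a, round(ctl.meniscus_height(), 3), round(ctl.specular_gain(), 2))
```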

Epic Environments

Crowds add texture and life to environments throughout the film—eggs, elves, yeti, mini teeth, nightmares, and so forth. To control them, the crew used a combination of Massive software and in-house systems. “Often, we needed to add 10 yeti when we populated North’s environment, or 10 nightmares,” Prescott says, “not hordes running across a plain, which is what you often think of using Massive for. We used Massive as a placement tool and the Massive brains as a great way to transition between animation cycles. But, animation was important. We have lots of shots with one or two elves running into each other in the background of a crowd scene, bumping into walls. These characters needed more acting ability.” In North’s factory, elves are mischief-makers; the yeti, big, lumbering beasts, make the toys.
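
Prescott describes two specific jobs for Massive here: scattering a handful of background characters and cross-fading between canned animation cycles. The sketch below is generic crowd logic written from that description; it is not Massive's API or DreamWorks' in-house tools, and every name in it is an assumption.

```python
# Illustrative crowd helpers: scatter a few agents with minimum spacing, and
# linearly cross-fade between two animation cycles. Written from the article's
# description, not from Massive or any studio pipeline.
import random

def place_agents(count, region, min_spacing, seed=0, max_tries=1000):
    """Scatter `count` agents in an axis-aligned region, keeping them apart."""
    (x0, y0), (x1, y1) = region
    rng = random.Random(seed)
    placed = []
    tries = 0
    while len(placed) < count and tries < max_tries:
        tries += 1
        p = (rng.uniform(x0, x1), rng.uniform(y0, y1))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_spacing ** 2
               for q in placed):
            placed.append(p)
    return placed

def blend_cycles(pose_a, pose_b, t):
    """Cross-fade between two poses (per-joint channel values) as t goes 0 -> 1."""
    t = max(0.0, min(t, 1.0))
    return [(1.0 - t) * a + t * b for a, b in zip(pose_a, pose_b)]

if __name__ == "__main__":
    elves = place_agents(10, ((0.0, 0.0), (20.0, 20.0)), min_spacing=1.5)
    walk, stumble = [0.0, 0.2, 0.4], [0.5, 0.1, 0.9]   # toy joint channels
    print(len(elves), blend_cycles(walk, stumble, 0.5))
```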

Like North, each of the Guardians except Sandy has his or her own environment: the North Pole and North’s toy factory, Tooth’s Fairy palace, Bunny’s warren. Even Pitch has a lair. Jack’s “environment” is the town.

“This isn’t a traveling film, but we go to different locations to tell different parts of the story, so we have over 30 sets and locations,” Prescott says. “I think this is becoming more common for feature animation. If you look at the first CG features, they had only a couple locations. Now, we have better pipelines, workflows, software, and experience, all those things we always talk about, so we can make more models and surfaces, and render them more efficiently.”
