
Hollywood Hair is Captured at Last: Details in SIGGRAPH 2008 Paper

Los Angeles – August 13, 2008: UC San Diego computer scientists presented a new method this week for accurately capturing the shape and appearance of a person’s hairstyle for use in animated films and video games.
The two left images demonstrate different aspects of a real hairstyle that the computer scientists captured. The third image from the left is the reference photograph of the real hairstyle. The new algorithms created the image on the right, which has photorealistic highlights and texture, even though no photographs were taken from that angle.

Imagine avatars of your favorite actors wandering through 3D virtual worlds with hair that looks almost exactly like it does in real life. Now imagine this hair blowing in the wind and shining in the sun. This level of realism for animated hairstyles is one step closer to the silver screen, thanks to new research being presented at ACM SIGGRAPH, one of the most competitive computer graphics conferences in the world.

The breakthrough is a collaboration between researchers at UC San Diego’s Jacobs School of Engineering, Adobe Systems Incorporated (Nasdaq: ADBE) and the Massachusetts Institute of Technology.

“We want to give movie and video game makers the tools necessary to animate actors and have their hair look and behave as it would in the real world,” said UC San Diego computer science professor Matthias Zwicker, an author on the SIGGRAPH paper.

The computer graphics researchers captured the shape and appearance of hairstyles of real people using multiple cameras, light sources and projectors. The computer scientists then created algorithms to automatically “fill in the blanks” and generate photo-realistic images of the hairstyles from new angles and new lighting situations.

Adobe researcher and SIGGRAPH paper author Sylvain Paris explained that replicating hairstyles for every possible angle and then getting individual strands of hair to realistically shine in the sun and blow in the wind would be extremely difficult and time consuming for digital artists to do manually.

Giving Hair “The Matrix Treatment”


Side-by-side comparisons of computer-generated hairstyles and actual photographs of the same hairstyle. For a fair comparison, the reference photographs were removed from the data set used by the authors’ image-based rendering method.

The makers of the movie The Matrix used digital face replacement to generate realistic images of human faces even though they had no photographs from these angles, explained Zwicker. “Our graphics group at UC San Diego helped to create computer graphics algorithms that do the same thing for hairstyles.”

If you had an infinite number of cameras and light sources, there would be no angles, views, or shots that needed to be computer generated. “But this is totally impractical,” said Zwicker.

Instead, for each of the hairstyles that received “The Matrix treatment,” the researchers captured about 2,500 real-world images using 16 cameras, 150 light sources and three projectors arranged in a dome setup. With all this data, the computer scientists determined the physical position and orientation of all visible strands of hair. The algorithms then generate complex hair models, producing on the order of 100,000 hair strands.
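
As a rough illustration of one part of this step (and not the authors’ actual pipeline), the short Python sketch below estimates the dominant hair direction at each pixel of a photograph by testing a small bank of oriented filters and keeping the angle that responds most strongly; the function name and filter settings are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): estimate per-pixel hair
# orientation by testing a bank of oriented derivative-of-Gaussian filters
# and keeping the angle with the strongest response.
import numpy as np
from scipy import ndimage

def orientation_map(gray, num_angles=18, sigma=2.0):
    """Return, for each pixel, the angle (radians) of the filter that
    responds most strongly -- a rough proxy for local strand direction."""
    h = int(3 * sigma)
    ys, xs = np.mgrid[-h:h + 1, -h:h + 1]
    best_response = np.full(gray.shape, -np.inf)
    best_angle = np.zeros(gray.shape)
    for angle in np.linspace(0.0, np.pi, num_angles, endpoint=False):
        # Rotate coordinates so the filter differentiates perpendicular
        # to the candidate strand direction.
        u = xs * np.cos(angle) + ys * np.sin(angle)
        v = -xs * np.sin(angle) + ys * np.cos(angle)
        kernel = -v * np.exp(-(u**2 + v**2) / (2 * sigma**2))
        response = np.abs(ndimage.convolve(gray, kernel, mode="reflect"))
        mask = response > best_response
        best_angle[mask] = angle
        best_response[mask] = response[mask]
    return best_angle

# Hypothetical usage on a grayscale photograph loaded as a float array:
# angles = orientation_map(photo_gray)
```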

From here, the computer scientists found a new way to precisely simulate how light would reflect off each strand of hair. The result is the ability to create photo-realistic images of the hairstyle from any angle. The automated system even creates realistic highlights. This process of creating new images based on data from related images is called interpolation.
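
To give a concrete sense of why a strand’s orientation governs how it reflects light, here is a minimal Python sketch of a classic fiber shading model (Kajiya-Kay), in which brightness depends on the hair’s tangent direction rather than on a surface normal. It is a textbook illustration of fiber-based shading in general, not the reflectance model used in the paper.

```python
# Minimal sketch of Kajiya-Kay-style fiber shading: a hair strand's
# brightness is driven by its tangent direction, not a surface normal.
# Illustrative only -- not the reflectance model from the paper.
import numpy as np

def kajiya_kay(tangent, light_dir, view_dir, kd=0.3, ks=0.5, shininess=60.0):
    """Return diffuse + specular intensity for one point on a strand."""
    t = tangent / np.linalg.norm(tangent)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    # Angles are measured against the fiber axis via dot products with t.
    t_dot_l = np.clip(np.dot(t, l), -1.0, 1.0)
    t_dot_v = np.clip(np.dot(t, v), -1.0, 1.0)
    sin_tl = np.sqrt(1.0 - t_dot_l**2)                      # diffuse term
    spec = sin_tl * np.sqrt(1.0 - t_dot_v**2) - t_dot_l * t_dot_v
    return kd * sin_tl + ks * max(spec, 0.0) ** shininess

# Example: the same light and camera, two differently oriented strands.
light = np.array([0.0, 1.0, 1.0])
view = np.array([0.0, 0.0, 1.0])
print(kajiya_kay(np.array([1.0, 0.0, 0.0]), light, view))   # bright highlight
print(kajiya_kay(np.array([0.0, 1.0, 0.0]), light, view))   # almost no highlight
```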

“To make hair that looks like hair, you have to think in terms of individual fibers,” explained Will Chang, the computer science Ph.D. candidate from UC San Diego’s Jacobs School of Engineering who did much of the algorithm development for this project.

By determining the orientations of individual hairs, the researchers can realistically estimate how the hairstyle will shine no matter what angle the light is coming from. “You can’t just blend the highlights from two different angles to get a realistic highlight for a point in between,” said Chang. “Instead of blending existing highlights, we create new ones.”
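
A toy example makes the point concrete. In the Python sketch below (an illustration, not the paper’s method), naively averaging two images lit from different angles leaves two dim highlights, while shading again with the in-between light direction produces the single, full-strength highlight a viewer would actually see.

```python
# Toy 1D illustration: blending two renderings lit from different angles
# gives two faint highlights; re-shading at the interpolated light
# direction gives one correct, full-strength highlight.
import numpy as np

angles = np.linspace(0.0, np.pi, 201)        # sample positions across a strand

def shade(light_angle, shininess=200.0):
    """Simple specular lobe that peaks where the view matches the light."""
    return np.maximum(np.cos(angles - light_angle), 0.0) ** shininess

def count_highlights(image, thresh=0.3):
    """Count strict local maxima brighter than the threshold."""
    inner = image[1:-1]
    peaks = (inner > image[:-2]) & (inner > image[2:]) & (inner > thresh)
    return int(np.sum(peaks))

image_a = shade(0.35 * np.pi)                # lit from angle A
image_b = shade(0.65 * np.pi)                # lit from angle B
blended = 0.5 * (image_a + image_b)          # naive pixel blend
relit = shade(0.50 * np.pi)                  # shade at the interpolated angle

print("highlights in naive blend:", count_highlights(blended), "peak:", blended.max())
print("highlights after re-shading:", count_highlights(relit), "peak:", relit.max())
```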

The new computational approach can be used for much more than generating images of a hairstyle based on what the style looks like from other angles.

One possible extension of this work: making an animated character’s hair realistically blow in the wind. This is possible because the researchers also developed a way to calculate what individual hair fibers are doing between the hairstyle surface and the scalp. They call this finding the “hidden geometry” of hair.

“Our method produces strands attached to the scalp that enable animation. In contrast, existing approaches retrieve only the visible hair layer,” the authors write in their SIGGRAPH 2008 paper. An animation of a hairstyle is available as a download from the “hair photobooth” Web site created by Sylvain Paris: http://people.csail.mit.edu/sparis/publi/2008/siggraphHair/

SIGGRAPH 2008 Paper citation: “Hair Photobooth: Geometric and Photometric Acquisition of Real Hairstyles,” by Sylvain Paris and Wojciech Matusik of Adobe Systems, Inc.; Will Chang, Wojciech Jarosz, and Matthias Zwicker of the University of California, San Diego; and Oleg I. Kozhushnyan and Frédo Durand of the Massachusetts Institute of Technology.
