(H)ITech, IAliens!

The hi-tech scanner that turns actors into aliens, warts and all
By Kaya Burgess, The Times

http://www.thetimes.co.uk/tto/multim...7__431629c.jpg

In the cinemas and video games of the near future, computer-generated characters will be so lifelike that individual skin cells and hair follicles will be visible, thanks to a new high-definition form of digital animation technology.

The distinction between real-life actors and computer graphics has already been blurred by films such as Beowulf and Avatar, which used computer-generated imagery (CGI) to create animated versions of leading actors, but until now those digital doubles have risked looking rather plastic.

Thanks to new super-high-resolution facial scanning, however, you will now be able to see every blemish and crease in Angelina Jolie’s virtual cheek or Zoe Saldana’s digital forehead.

Researchers at the University of Southern California and Imperial College London have developed techniques to scan centimetre-square patches of skin from the cheek, forehead, nose, chin and temple in such high resolution that a single skin cell covers three pixels on the screen.
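As a rough check of what that resolution implies (reading “covers three pixels” as about three pixels across, and assuming a typical epidermal cell diameter of roughly 30 micrometres, a figure not given in the article), the arithmetic works out to about 10 micrometres per pixel and roughly a megapixel for each one-centimetre patch:

    # Rough check of the quoted resolution. The ~30-micrometre skin-cell
    # diameter is an assumption (typical epidermal cell size), not a figure
    # from the article.
    cell_size_um = 30.0        # assumed skin-cell diameter, micrometres
    pixels_per_cell = 3.0      # figure quoted in the article
    patch_size_um = 10_000.0   # one edge of a 1 cm patch, in micrometres

    um_per_pixel = cell_size_um / pixels_per_cell      # ~10 um per pixel
    pixels_per_side = patch_size_um / um_per_pixel     # ~1,000 pixels
    megapixels = pixels_per_side ** 2 / 1e6            # ~1 Mpx per patch

    print(f"{um_per_pixel:.0f} um/px, {pixels_per_side:.0f} px per side, "
          f"{megapixels:.1f} Mpx per square-centimetre patch")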

The team has also polarised the light source used during the scanning to pick up not only the light reflecting off the skin’s surface but also light that penetrates below the epidermis and scatters back, providing greater depth and tone to the final image.
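In spirit this is the familiar cross-polarisation trick used in appearance capture: with the light source polarised, an image taken through a crossed polariser blocks the mirror-like surface reflection and keeps mostly the light that scattered back from under the skin, while a parallel-polarised image keeps both. A minimal NumPy sketch of that separation follows; it is an illustration of the general idea, not the researchers’ actual pipeline, and the function and variable names are invented for the example.

    import numpy as np

    def separate_reflection(parallel_pol: np.ndarray, cross_pol: np.ndarray):
        """Split skin appearance into surface and subsurface components.

        parallel_pol: image taken with the camera polariser aligned with the
                      polarised light source (surface reflection plus
                      subsurface scattering).
        cross_pol:    image taken with the polariser crossed, which blocks
                      the mirror-like surface reflection and keeps mostly
                      light that penetrated below the epidermis and
                      scattered back.
        """
        subsurface = cross_pol
        surface = np.clip(parallel_pol - cross_pol, 0.0, None)
        return surface, subsurface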

The scanning, which uses high-resolution stills cameras in a laboratory, also captures how the skin behaves under different types of light and during different facial expressions. The scanned patches can then be mapped on to a 3-D image of the actor, created with motion-capture technology.
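The article does not say how the patches are applied, but conceptually they end up as micro-detail textures laid over the face model’s UV space. A heavily simplified sketch of that idea is below, assuming a single patch tiled across a displacement map; a real pipeline would blend the separate cheek, forehead, nose, chin and temple patches and hide the seams, and the function name here is hypothetical.

    import numpy as np

    def tile_patch_over_uv(patch: np.ndarray, uv_height: int, uv_width: int) -> np.ndarray:
        """Repeat one scanned micro-geometry patch across a UV-space
        displacement map covering the whole face."""
        reps_y = -(-uv_height // patch.shape[0])   # ceiling division
        reps_x = -(-uv_width // patch.shape[1])
        tiled = np.tile(patch, (reps_y, reps_x))
        return tiled[:uv_height, :uv_width]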

As a result, computer-generated characters will no longer be so “plastic-looking”, according to Paul Debevec, the associate director of graphics research at USC, whose earlier techniques were used on James Cameron’s Avatar. “The bumpiness of the surface of the skin, at the micron scale, actually affects how light reflects off the surface,” Professor Debevec explained.
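That link between micron-scale bumpiness and reflection is what standard microfacet reflectance models encode: a single roughness parameter controls whether a highlight looks tight and oily or broad and chalky. The sketch below shows the widely used GGX (Trowbridge-Reitz) distribution purely to illustrate the point Professor Debevec is making; it is not presented as the researchers’ own model.

    import numpy as np

    def ggx_microfacet_distribution(cos_theta_h: np.ndarray, roughness: float) -> np.ndarray:
        """GGX (Trowbridge-Reitz) microfacet distribution.

        cos_theta_h: cosine of the angle between the surface normal and the
                     half-vector between the light and view directions.
        roughness:   perceived micro-scale roughness in [0, 1]; low values
                     give the tight highlight of oily skin, high values the
                     broad, flat response of a chalky or pasty surface.
        """
        a2 = roughness ** 4                         # alpha = roughness^2, then squared
        denom = cos_theta_h ** 2 * (a2 - 1.0) + 1.0
        return a2 / (np.pi * denom ** 2)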

“That’s what makes it look healthy or oily or pasty or chalky. It makes someone look like a human being made out of organic material and not like a computer-generated zombie.” To make Avatar, artists had to go back over the computer-generated imagery of the blue-skinned Na’vi characters and add blemishes, such as moles or creases, by hand. This vastly increased the man-hours and expense of the film, which was nearly 60 per cent computer-generated and cost more than £150 million.

The process will now be much cheaper, Professor Debevec said, and video game developers at Activision have already created mathematical algorithms that can mimic many of the effects of the high-definition scanning, greatly reducing the time, expense and processing power needed.
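The article gives no detail of Activision’s algorithms. As a loose illustration of the general idea of replacing stored scan data with a cheap procedural approximation, here is a minimal sketch in which random micro-bumpiness is generated on the fly; the function name, block size and amplitude are invented for the example.

    import numpy as np

    def procedural_micro_bump(height: int, width: int,
                              amplitude_um: float = 5.0, seed: int = 0) -> np.ndarray:
        """Generate synthetic micro-bumpiness instead of storing a full scan.

        Produces a random micro-displacement map (in micrometres) at a fixed
        spatial scale, trading some fidelity for far less memory and
        processing than raw scan data would need.
        """
        rng = np.random.default_rng(seed)
        coarse = rng.standard_normal((-(-height // 4), -(-width // 4)))
        detail = np.kron(coarse, np.ones((4, 4)))[:height, :width]  # nearest-neighbour upsample
        detail = (detail - detail.mean()) / (detail.std() + 1e-8)   # normalise
        return amplitude_um * detail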

This will allow hyper-realistic CGI characters to appear on video game consoles and could allow film directors to create CGI scenes in real time.

Professor Debevec said: “In the future it might be the less expensive movies that use CGI technology, while big-budget movies will be the only ones that can still afford to go out on location and shoot in Paris or Bermuda and take up actors’ time.”

Abhijeet Ghosh, from the computing department at Imperial College London, helped to develop the “facial microgeometry scanning” process and was approached by the cosmetics company Avon to help it analyse the effects of make-up on the skin.

He predicted that cosmetics customers may be able to use apps in future to see how their faces would look with different types of foundation. “When you start scanning skin at that scale, it could also have medical or dermatological applications,” Dr Ghosh said.

http://www.thetimes.co.uk/tto/arts/f...cle3815697.ece