Professor’s research contributed to iPhone X


Emily Smith | Daily Trojan

Upon its debut, facial recognition was among the iPhone X’s most talked-about features. USC assistant professor of computer science Hao Li was one of the researchers who brought the idea to life: his work on facial motion capture later contributed to the iPhone X’s animated emoji, or “Animoji,” feature.

Li’s contribution began in 2015, when Apple acquired Faceshift, a tech startup he co-founded in 2011. Faceshift’s facial motion capture software uses the Microsoft Kinect’s depth-sensing technology to capture facial movements in real time.

“One of the most appealing features [of Faceshift] was markerless facial performance capture,” Li said.

Two components were key to the technology behind Animoji: facial tracking using structured-light 3-D scanners, which project a pattern of light onto the face and compute depth from how that pattern deforms, and non-rigid registration.

Non-rigid registration uses an algorithm to deform a 3-D face model until it matches each depth scan, smoothing out noise in the scans so the model follows the person’s movement as closely as possible.
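
To make the idea concrete, here is a minimal sketch of non-rigid registration in the spirit Li describes: a template point set is pulled toward its closest points in a target depth scan, while a smoothness term keeps neighboring points moving together. The function name, parameters and simple update rule are illustrative assumptions for this sketch, not Faceshift’s actual algorithm, which solves a more sophisticated optimization.

import numpy as np
from scipy.spatial import cKDTree

def nonrigid_register(template, target, neighbors, lam=1.0, iters=20):
    # Fit the (N, 3) `template` points to the (M, 3) `target` scan by
    # moving each point individually (hence "non-rigid").
    # `neighbors` lists each point's mesh neighbors; `lam` weights smoothness.
    deformed = template.copy()
    tree = cKDTree(target)
    for _ in range(iters):
        # Data term: pull each point toward its closest point in the scan.
        _, idx = tree.query(deformed)
        pull = target[idx] - deformed
        # Smoothness term: pull each point toward its neighbors' average,
        # so nearby points deform together instead of tearing apart.
        smooth = np.zeros_like(deformed)
        for i, nbrs in enumerate(neighbors):
            smooth[i] = deformed[nbrs].mean(axis=0) - deformed[i]
        # Take a damped step that balances the two terms.
        deformed += 0.5 * (pull + lam * smooth) / (1.0 + lam)
    return deformed

# Example: recover a small rigid shift of a random point cloud.
rng = np.random.default_rng(0)
template = rng.standard_normal((200, 3))
target = template + np.array([0.1, 0.0, 0.0])
knn = cKDTree(template)
neighbors = [knn.query(template[i], k=5)[1][1:].tolist() for i in range(200)]
fitted = nonrigid_register(template, target, neighbors)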

“These two components were sort of key,” Li said. “They evolved to all real-time facial things that we see nowadays — for sure the features on the iPhone X.”

Li joined USC three years ago as an assistant professor to start a research lab. He said a university setting was his best choice for creating something “foreign and long-term” because there was no pressure to turn his research into a product.

Li also prefers working with graduate students because of their passion and energy for getting things done.

Li began researching facial models and markerless performance capture seven or eight years ago, while earning his doctorate in computer science in Zurich.

In addition to being a professor, Li is also the director of the Vision and Graphics Lab at the USC Institute of Creative Technologies.

The aim of his early research was to revolutionize traditional motion capture methods that used dots or markers.

“The thing with the face is that if you put dots, you can only get very sparse motion so everything looks very rigid and you can’t get the subtleties of the face,” Li said.

By pairing Faceshift’s software with the $100 Kinect depth sensor, Li demonstrated that human digitization could be accessible to anyone.

“That’s something super powerful because what it means is that everyone can generate interesting content,” Li said. “You don’t need an animator.”

“In movies, that has always been the Holy Grail, like ‘How can we create a real-looking visual character?’” Li said. “Anything that advances that is really interesting to me.”

His technology has been used in films such as Captain America: The Winter Soldier, The Hobbit: The Battle of the Five Armies, Star Trek Into Darkness and Furious 7, in which facial tracking was used to animate and map Cody Walker’s face onto a digital Paul Walker.

Li is now focused on his tech startup Pinscreen, which aims to change online communication via user-generated and personalized 3-D avatars.

“We are trying to develop an algorithm using [artificial intelligence] and deep learning where we can create a complete copy of a person with just a single reference picture,” Li said.
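
For a sense of what such an algorithm might look like, here is a toy sketch of the single-image idea: a convolutional network regresses the coefficients of a 3-D face model from one photo, and a renderer would then turn those coefficients into a personalized avatar. The architecture, layer sizes and parameter count below are illustrative assumptions, not Pinscreen’s actual system.

import torch
import torch.nn as nn

class AvatarRegressor(nn.Module):
    # Maps one RGB photo to a vector of face-model coefficients
    # (e.g., identity and expression weights for a 3-D morphable model).
    def __init__(self, n_params=100):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_params)

    def forward(self, image):  # image: (batch, 3, height, width)
        x = self.features(image).flatten(1)
        return self.head(x)    # predicted face-model coefficients

# One 256x256 photo in, one coefficient vector out.
model = AvatarRegressor()
photo = torch.rand(1, 3, 256, 256)
coeffs = model(photo)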