Two unlikely partners, the Viterbi School of Engineering and the School of Theatre, are teaming up to study human behavior in an innovative way.
In 2008, the National Science Foundation gave Viterbi a grant to fund a project that uses innovative motion-capture technology to study and model human interaction.
The project aims both to analyze how people encode emotions and to provide mathematical models of interactions. To do this, Viterbi researchers have groups of actors from the School of Theatre improvise scenes in a lab equipped with cameras. The actors don special black suits equipped with sensors and move around in the lab.
The semester's fifth and final staging was held Friday. Researchers are now hoping to use the data collected so far in their next published report, said Jeremy Lee, a Ph.D. engineering student who is working on the project with Angeliki Metallinou, also a Ph.D. engineering student.
“What we want to study is when you [put] people in various situations of improvisation, we want to see how they express themselves,” said Shrikanth Narayanan, Viterbi professor of electrical engineering. “Their use of voice and body communicates intent, emotions and expressions.”
In the digitized space, researchers map the actors’ body movements on a computer, tracking their motion together with their accompanying speech.
“We want to see how their specific intentions are expressed through their body language,” said Lee.
The actors are given two sentences and two verbs, then improvise an interaction within the filmed space.
“We fit this data in a fairly sophisticated mathematical model. In what we call machine learning, you learn specific movement and voice patterns,” Lee said.
Alex Lubischer, who graduated from the School of Theatre in December, said participating in the stagings is a unique experience.
“You feel like a superhero,” Lubischer said after putting on the suit.
Lee said that actors are integral to the project’s success because they provide insight into predictable human behavior.
“Humans express a lot of noise because they’re hiding their emotions. But with actors, we script out that noise so we can see more clearly what humans do when they truly express emotion,” Lee said.
The actors, in turn, benefit from the engineers’ research by gaining hands-on experience with a technology typically accessible only to professionals.
“It’s a new thing to use [motion capture] in films,” said Sharon Carnicke, a professor of theatre and co-principal investigator on the project. “It’s been done primarily for effect. We’re really interested in improving the quality of acting within that field.”
Lubischer said he thinks knowing the technology will give him a competitive advantage.
“It’s a chance to work on something a lot of actors haven’t yet,” he said.
Studying human behavior is not a new idea, Narayanan said, but the way his team is quantifying the data is. He said the research could be applied to fields including autism, therapy sessions and addiction interventions.
The technology, Lee said, is similar to that used in the film Avatar but takes a different approach.
“The hardware you see is pretty much the same [as Avatar], but we’re studying it in a mathematical-engineering way and they did it in a production way,” Lee said.
In the future, the researchers said, their data could help make video game characters more human or give developers a foundation for adding lifelike behavior to their technology. They noted, however, that it is still too early to make concrete plans.
“A lot of work has to be done on the analysis part; it is more about understanding right now than production,” Lee said.
Although there will be no more recordings for this academic year, the grant is funded through 2012.