Theatre and engineering schools collaborate to study interactions
Posted April 6, 2010 at 9:55 pm in News
Two unlikely partners, the Viterbi School of Engineering and the School of Theatre, are teaming up to study human behavior in an innovative way.
In 2008, the National Science Foundation gave Viterbi a grant to fund a project that uses innovative motion-capture technology to study and model human interaction.
The project aims both to analyze how people encode emotions and to provide mathematical models of interactions. To do this, Viterbi researchers have groups of actors from the School of Theatre improvise scenes in a lab equipped with cameras. The actors don special black suits fitted with sensors and move around in the lab.
The fifth and final staging of this semester was held Friday. Now, researchers are hoping to use the data collected so far in their next published report, said Jeremy Lee, a Ph.D. engineering student who is working on the project with Angeliki Metallinou, also a Ph.D. engineering student.
"What we want to study is when you [put] people in various situations of improvisation, we want to see how they express themselves," said Shrikanth Narayanan, Viterbi professor of electrical engineering. "Their use of voice and body communicates intent, emotions and expressions."
In the digitized space, researchers map the actors' body movements on a computer, tracking their motion along with the accompanying speech.
"We want to see how their specific intentions are expressed through their body language," said Lee.
The actors are given two sentences and two verbs, then improvise a scene within the filmed space.
"We fit this data in a fairly sophisticated mathematical model. In what we call machine learning, you learn specific movement and voice patterns," Lee said.
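The idea of "learning specific movement and voice patterns" can be illustrated with a toy sketch. Everything below is hypothetical, not drawn from the actual study: the features (mean joint velocity, mean voice pitch), the labels and the numbers are invented, and a simple nearest-centroid rule stands in for the researchers' far more sophisticated models.

```python
# Hypothetical sketch of learning emotion patterns from motion and
# voice features with a nearest-centroid classifier.  All feature
# names, labels and values are invented for illustration.
from math import dist

# Toy training data: (mean joint velocity, mean voice pitch in Hz)
# labeled with the emotion the actor was asked to portray.
training = {
    "angry": [(0.9, 220.0), (0.8, 210.0)],
    "calm":  [(0.2, 120.0), (0.3, 130.0)],
}

# "Learn" one pattern per emotion by averaging its examples.
centroids = {
    label: tuple(sum(col) / len(col) for col in zip(*examples))
    for label, examples in training.items()
}

def predict(features):
    """Return the emotion whose learned pattern is closest."""
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# A fast, high-pitched sample lands nearest the "angry" pattern.
print(predict((0.85, 215.0)))  # -> angry
```

The point of the sketch is only the shape of the pipeline the article describes: labeled recordings go in, a per-emotion pattern is learned, and new movement can then be matched against those patterns.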
Alex Lubischer, who graduated from the School of Theatre in December, said participating in the stagings is a unique experience.
"You feel like a superhero," Lubischer said after putting on the suit.
Lee said that actors are integral to the project's success because they provide insight into predictable human behavior.
"Humans express a lot of noise because they're hiding their emotions. But with actors, we script out that noise so we can see more clearly what humans do when they truly express emotion," Lee said.
The actors benefit from the engineers' research by partaking in a new technology typically accessed only by professionals.
"It's a new thing to use [motion capture] in films," said Sharon Carnicke, a professor of theatre and co-principal investigator of the project. "It's been done primarily for effect. We're really interested in improving the quality of acting within that field."
Lubischer said he thinks knowing the technology will give him a competitive advantage.
"It's a chance to work on something a lot of actors haven't yet," he said.
Studying human behavior is not a new idea, Narayanan said, but the way his team is quantifying the data is. He said the research could be applied in areas such as autism research, therapy sessions and addiction interventions.
The technology, Lee said, is similar to that used in the film Avatar but takes a different approach.
"The hardware you see is pretty much the same [as Avatar], but we're studying it in a mathematical-engineering way and they did it in a production way," Lee said.
In the future, the researchers said, their data could help humanize the gaming industry or inform developers looking to add life to their technology. They noted, however, that it is still too early to make concrete plans.
"A lot of work has to be done on the analysis part; it is more about understanding right now than production," Lee said.
Although there will be no more recordings for this academic year, the grant is funded through 2012.