Using Video Game Platforms to Understand Thinking Styles of People Engaged in Collaboration

From The Theme

What if we could automatically detect how well members of a group are “synching” on a collaborative project?


We set out to better understand the correlation between the degree of group “sync” in collaborative projects on one hand, and team creativity and productivity on the other. We leveraged machine learning and video game technology to analyze the behavior and interactions of two-person teams (dyads) performing a creative task.

We sought to assess – and then predict – collaborative innovation with bottom-up, statistical methods for pattern identification. We used commercial video game technology and machine learning to track participants’ physical movements and nonverbal behavior, identify patterns, and associate them with creativity. We also examined measures of synchrony in the participants’ movement data and explored the effectiveness of top-down, hypothesis-based predictions.
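The bottom-up approach described above can be sketched in miniature: reduce each dyad's tracked movement to a few summary features, then let a classifier associate those features with a creativity label. The feature set (mean and variance of frame-to-frame speed), the synthetic data, and the nearest-centroid classifier here are all illustrative assumptions, not the study's actual pipeline.

```python
def movement_features(positions):
    """Mean and variance of frame-to-frame speed for one participant's 1-D track."""
    speeds = [abs(b - a) for a, b in zip(positions, positions[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return [mean, var]

def dyad_features(p1, p2):
    # Concatenate both participants' features; the study found that using
    # data from both dyad members was more predictive than from one alone.
    return movement_features(p1) + movement_features(p2)

def nearest_centroid_predict(train, labels, sample):
    """Tiny stand-in classifier: assign the label of the closest class centroid."""
    centroids = {}
    for lbl in set(labels):
        rows = [f for f, l in zip(train, labels) if l == lbl]
        centroids[lbl] = [sum(col) / len(rows) for col in zip(*rows)]
    dist = lambda u, v: sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], sample))

# Toy data: suppose "high-creativity" dyads happened to move more.
low = dyad_features([0, 1, 0, 1, 0, 1], [1, 0, 1, 0, 1, 0])
high = dyad_features([0, 5, 0, 5, 0, 5], [5, 0, 5, 0, 5, 0])
pred = nearest_centroid_predict([low, high], ["low", "high"],
                                dyad_features([0, 4, 0, 4], [4, 0, 4, 0]))
```

In the actual research, the input would be full-body tracking data from the video game sensor and the model would be trained on labeled dyad sessions; the structure of the pipeline, features in and a label out, is the same.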

Based on automatically detected measures of nonverbal behavior, our models predicted creativity with a success rate as high as 96%. Using movement data from both participants was more predictive than using data from just one. In addition, we found evidence that synchrony declined as the time lag between the two dyad members’ nonverbal behavior increased.
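One simple way to see how synchrony varies with time lag, sketched here under assumptions of my own (the study's actual synchrony measure may differ), is to correlate one participant's movement signal with a time-shifted copy of the partner's and watch the correlation at each lag:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / ((vx * vy) ** 0.5)

def lagged_synchrony(a, b, max_lag):
    """Correlation of a with b shifted forward by 0..max_lag samples."""
    return [pearson(a[: len(a) - lag], b[lag:]) for lag in range(max_lag + 1)]

# Synthetic example: participant B exactly mirrors participant A, so
# synchrony peaks at lag 0 and weakens as the sequences are offset.
a = [math.sin(t / 3.0) for t in range(200)]
b = list(a)
sync = lagged_synchrony(a, b, max_lag=5)
```

With real tracking data, `a` and `b` would be per-frame movement magnitudes for the two dyad members, and a peak near lag 0 that decays with increasing lag would be the signature of the synchrony effect described above.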

mediaX Research Project Update, Fall 2013

mediaX Research Theme Update, Spring 2013

Watch Videos About This Project

VHIL Research: Automatic Detection of Nonverbal Behavior Predicts Learning in Dyadic Interactions

Jeremy Bailenson is the founding director of Stanford University’s Virtual Human Interaction Lab, the Thomas More Storke Professor in the Department of Communication at Stanford, and a Senior Fellow at the Woods Institute for the Environment. He designs and studies virtual reality systems that allow physically remote individuals to meet in virtual space, and explores how these systems change the nature of verbal and nonverbal interaction. In particular, he explores how virtual reality can change the way people think about education, environmental behavior, and health.

Andrea Stevenson Won is Assistant Professor of Communication at Cornell University, where she directs the Virtual Embodiment Lab. At the time of the project, she was a PhD Candidate in the Department of Communication at Stanford University, researching the capture and expression of nonverbal behavior and the physical and psychological effects of mediated embodiment.

Wenqing Dai, Graduate Student, Computer Science

Le Yu, Graduate Student, Computer Science