Real-Time Tracking, Detection, and Correction of Motion

From The Theme

What if a virtual reality tracking & rendering system could automatically detect when your physical motion deviated from the optimal path, and show you exactly what you did wrong: replaying your motion on an avatar that looks just like you?

We proposed a three-part plan: (1) create a technological system that learns to compare a physical motion with an optimal motion; (2) incorporate a feedback mechanism into that system that alerts users to deviations by showing them, in the third person, both their actual motion and the optimal motion; and (3) run an experiment testing whether users learn the physical motion from the system.
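The comparison step in part (1) can be illustrated with a minimal sketch. The code below is hypothetical and not the system's actual algorithm: it assumes each motion is a sequence of frames, each frame a list of 3D joint positions, and it flags frames whose mean joint deviation from the reference "optimal" motion exceeds a threshold (the function names and the 5 cm threshold are illustrative assumptions).

```python
import math

def frame_deviation(actual_frame, optimal_frame):
    """Mean Euclidean distance between corresponding joints in one frame.

    Each frame is a list of (x, y, z) joint positions; units are assumed
    to be metres. Illustrative only, not the original system's metric.
    """
    dists = [math.dist(a, o) for a, o in zip(actual_frame, optimal_frame)]
    return sum(dists) / len(dists)

def flag_deviations(actual_motion, optimal_motion, threshold=0.05):
    """Return indices of frames deviating more than `threshold` metres.

    Assumes the two motions are already time-aligned frame by frame.
    """
    return [
        i
        for i, (a, o) in enumerate(zip(actual_motion, optimal_motion))
        if frame_deviation(a, o) > threshold
    ]
```

A feedback mechanism like the one in part (2) could then replay only the flagged frames on the user's avatar alongside the reference motion. A real system would also need temporal alignment (e.g., dynamic time warping) before comparing frames, which this sketch omits.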

We developed a state-of-the-art, image-based tele-immersive system, capable of tracking and rendering many degrees of freedom of human motion in real time. In Experiment 1, participants learned better in VR than in a video learning condition according to self-report measures, and the cause of the advantage was seeing one’s avatar stereoscopically in the third person. In Experiment 2, we added a virtual mirror in the learning environment to further leverage the ability to see oneself from novel angles in real time. Participants learned better in VR than in video according to objective performance measures.

Bailenson, J. N., Patel, K., Nielsen, A., Bajcsy, R., Jung, S., & Kurillo, G. (2008). The effect of interactivity on learning physical actions in virtual reality. Media Psychology, 11, 354–376.

Fox, J., & Bailenson, J. N. (2009). Virtual self-modeling: The effects of vicarious reinforcement and identification on exercise behaviors. Media Psychology, 12, 1-25.

Fox, J., & Bailenson, J. N. (2010). The use of doppelgängers to promote health and behavior change. Cybertherapy & Rehabilitation, 3(2), 16-17.

Fox, J. (2010). The use of virtual self models to promote self-efficacy and exercise (Doctoral dissertation, Stanford University).

Jeremy Bailenson is founding director of Stanford University’s Virtual Human Interaction Lab, the Thomas More Storke Professor in the Department of Communication at Stanford, and a Senior Fellow at the Woods Institute for the Environment. He designs and studies virtual reality systems that allow physically remote individuals to meet in virtual space, and explores the manner in which these systems change the nature of verbal and nonverbal interaction. In particular, he explores how virtual reality can change the way people think about education and the environment.

Jesse Fox is an Assistant Professor in the School of Communication at Ohio State University, where she runs the VECTOR (Virtual Environment, Communication Technology, and Online Research) Lab. Dr. Fox’s research is concerned with the effects and implications of new media technologies, including social networking sites, virtual worlds, video games, websites, blogs, and mobile applications.

HCI, Presence, Communication, Learning, Technology, Sensors, Coaching, Feedback