Sensing and Tracking for 3D Narratives

October 24, 2016

Event Description:

Sensing and Tracking for 3D Narratives:
Put ME in the Story!


Great storytellers track the attention and responsiveness of their listeners, adapting the story – its context, plot, pace, voice, and imagery – to win and hold audience engagement. Now that technology allows it, audiences want to be in the story – included, immersed, and interactive. Each person in the audience increasingly controls her own media experiences – selecting content, creating content, pausing at will – making her own stories. Media experiences are personalized, sharing is globalized, and sensory experiences are mediatized.
 
Tracking and sensing technologies open new opportunities to create stories that not only include the audience but adapt to it. Augmented reality, virtual reality, machine learning, and data-driven algorithms offer new ways of creating and experiencing narratives. Immersive experiences for business, art, education, and entertainment leverage insights from Stanford research to forge partnerships between storytellers and their audiences.

Join us on October 24th from 1 pm to 5 pm in Tresidder Memorial Union, Oak Lounge West, to hear about the research behind these insights and glimpse new horizons in keynote talks by Jeremy Bailenson and Chris Chafe. You'll also hear additional research presentations and experience exhibits and demos.

Presenters:

Jeremy Bailenson is founding director of Stanford University’s Virtual Human Interaction Lab, Thomas More Storke Professor in the Department of Communication, a Senior Fellow at the Woods Institute for the Environment, Faculty Director of Stanford’s Digital Learning Forum, and a Faculty Leader at Stanford’s Center for Longevity. Bailenson studies the psychology of Virtual Reality (VR), in particular how virtual experiences lead to changes in perceptions of self and others. His lab builds and studies systems that allow people to meet in virtual space, and explores how such systems change the nature of social interaction. His most recent research focuses on how VR can transform education, environmental conservation, empathy, and health.

Chris Chafe is a composer, improviser, and cellist, developing much of his music alongside computer-based research. He is Director of Stanford University's Center for Computer Research in Music and Acoustics (CCRMA). At IRCAM (Paris) and The Banff Centre (Alberta), he pursued methods for digital synthesis, music performance, and real-time internet collaboration. CCRMA's SoundWIRE project involves live concertizing with musicians the world over. An active performer both online and in person, he reaches audiences in dozens of countries, sometimes at novel venues.

Julia Sourikoff heads up the virtual reality division of the award-winning commercial production company Tool of North America. In her role she oversees 360° live-action film and game-engine-rendered experiences for rotational and room-scale VR. Her expertise includes cross-platform distribution strategy, emerging production technologies, and creative approaches to storytelling in VR. Before joining Tool, Julia helped launch the Future of StoryTelling (FoST) in New York and was integral to growing the company from a small start-up to an oversubscribed thought-leadership summit and online community of over half a million followers. At FoST, Julia specialized in programming and curation, exhibition production, and partnership development, working with world-class brands like Adobe, American Express, General Electric, Google, Microsoft, and Time Warner.