Human Machine Interaction and Sensing

Human-AI Handshake

What insights about people and technology are needed to better understand the use of sensors in human-machine interaction?

To help answer this question, mediaX sponsored six research projects, led by Stanford faculty, exploring new insights into human-machine interaction with a focus on detecting or sensing human comprehension, emotional states, gesture, and touch. Launched in the spring of 2006, the projects spanned a wide range of topics, including affective computing, smart home technologies, design, communication and collaboration, teaching and learning, training, physical therapy and injury prevention, and augmented reality.

The research initiative highlighted projects exploring technologies for emotion detection, real-time video capture, gesture recognition, vision-based reasoning, machine learning, biofeedback, and augmented reality.

Research Initiatives

Jeremy Bailenson: Detection of Comprehension and Emotion from Real-time Video Capture of Facial Expressions During Learning

Andrea Goldsmith: Smart Home Care Network Using Distributed Vision-Based Reasoning

Scott Klemmer: Designing Sensor Based Interactions by Example

Amy Ladd & Jessica Rose: Human Machine Interaction and Sensing of the Golf Swing

The late Cliff Nass: Revealing and Using Emotion Detection

Andrew Ng: Gestures, Speech and Vision – Towards a Multi-Modal Augmented Reality Human-Robot Interface