Interacting with Integrated Information


WHAT IF
What if we could determine the requirements for supporting user interaction with systems that integrate information from multiple sources?


WHAT WE SET OUT TO DO
We set out to investigate user responses to content that filters, summarizes, or synthesizes information from disparate sources. Our exploration included the relationship between the effects of multiple product filters (such as search engines and recommendation systems) and the type of product attribute being evaluated. We were also interested in the effects and trade-offs, for information providers, of presenting content via multiple perceived sources and information filters, including the tension between specialization and uniformity or consistency in voice interfaces.

WHAT WE FOUND
We conducted a series of lab-based and Web-based experimental studies to assess the social and psychological aspects of user interaction with integrated information. Across these studies, we systematically varied a set of content manipulation, presentation, and personalization dimensions, and observed the effects on users’ perceptions of quality and provenance, and on their information usage.

We found that results from multiple search engines are perceived as more heterogeneous and more varied in quality than results gathered via a single system. Our experiments also found that the point at which a product attribute can be verified (i.e., before or after purchase) may be an important moderating variable, affecting whether participants used one or multiple search engines or filters. In addition, our research into specialization in voice interfaces highlights a critical trade-off in user interface consistency. On the one hand, there is a clear advantage to specializing voice content: users respond to this illusion of differentiation and perceive speakers to be of higher quality. On the other hand, this increase in perceived quality comes at the cost of decreased perceived reliability: users perceive the content to be less uniform in quality.

LEARN MORE
Rieh, S.Y. & Danielson, D.R. (2007). Credibility: A multidisciplinary framework. In B. Cronin (Ed.), Annual Review of Information Science and Technology, Vol. 41.

Danielson, D.R. (2005). Web credibility. In C. Ghaoui (Ed.), Encyclopedia of Human-Computer Interaction. Hershey, PA: Idea Group, Inc., 713-721.

PEOPLE BEHIND THE PROJECT
Cliff Nass
The late Cliff Nass was the Thomas M. Storke Professor of Communication at Stanford University and held courtesy appointments in Computer Science, Education, Law, and Sociology. He was also affiliated with the programs in Symbolic Systems and in Science, Technology, and Society. Nass consulted on the design of over 250 interactive products and services for leading technology and consumer-electronics companies.

David R. Danielson
At the time of the project, David R. Danielson was a PhD student in the Department of Communication at Stanford University and a researcher at the Persuasive Technology Lab. After Stanford, Dr. Danielson went on to work with Oracle Corporation and Seagate Technology.