How do expert researchers go about assessing the credibility of information on the internet? Not as skillfully as you might guess – and those who are most effective use a tactic that others tend to overlook, according to scholars at Stanford Graduate School of Education.
A new report released recently by the Stanford History Education Group (SHEG) shows how three different groups of “expert” readers – fact checkers, historians and Stanford undergraduates – fared when tasked with evaluating information online.
The fact checkers proved to be fastest and most accurate, while historians and students were easily deceived by unreliable sources.
“Historians sleuth for a living,” said Professor Sam Wineburg, founder of SHEG, who co-authored the report with doctoral student Sarah McGrew. “Evaluating sources is absolutely essential to their professional practice. And Stanford students are our digital future. We expected them to be experts.”
Wineburg and McGrew observed that even historians and students who did read laterally (leaving a website to see what other sources say about it) did not necessarily probe effectively: they failed to use quotation marks when searching for exact phrases, for instance, or clicked indiscriminately on links that ranked high in search results, not understanding how that order is influenced by search engine optimization. Fact checkers, by contrast, showed what the researchers called click restraint, scanning search results more carefully before deciding where to click.
In California’s 2016 election alone, Wineburg noted, voters were confronted with 17 ballot initiatives to consider. “If people spent 10 minutes researching each one, that would be an act of incredible civic duty,” he said. “The question is, how do we make those 10 minutes count?”
Read the entire Stanford News story by Carrie Spector.