A new report compares how effectively three different groups of “expert” readers—fact checkers, historians, and undergraduate students—evaluate the credibility of information online.
Of the three groups, the fact checkers proved to be the fastest and most accurate, while the historians and students were easily deceived by unreliable sources.
“Historians sleuth for a living,” says coauthor Sam Wineburg, founder of the Stanford History Education Group (SHEG) at Stanford University, which released the report. “Evaluating sources is absolutely essential to their professional practice. And Stanford students are our digital future. We expected them to be experts.”
The report’s authors identify an approach to online scrutiny that fact checkers used consistently but historians and college students did not: The fact checkers read laterally, meaning they would quickly scan a website in question but then open a series of additional browser tabs, seeking context and perspective from other sites.
In contrast, the authors write, historians and students read vertically, meaning they would stay within the original website to evaluate its reliability. These readers were often taken in by superficial markers of credibility, such as a professional-looking name and logo, an array of scholarly references, or a .org web address suggesting a nonprofit.
When it comes to judging the credibility of information on the internet, Wineburg says, skepticism may be more useful than knowledge or old-fashioned research skills.
“Very intelligent people were bamboozled by the ruses that are part of the toolkit of digital deception today,” he says.
The new report builds on research that SHEG released last year, which found that students from middle school through college were easily misled by unreliable information online. In that study, SHEG scholars administered age-appropriate tests to 7,804 students from diverse economic and geographic backgrounds.
For the new report, the authors set out to identify the tactics of “skilled”—rather than typical—users. They recruited participants they expected to be skilled at evaluating information: professional fact checkers at highly regarded news outlets, PhD historians with full-time faculty positions at universities in California and Washington state, and Stanford undergraduates.
“It’s the opposite of a random sample,” Wineburg says. “We purposely sought out people who are experts, and we assumed that all three categories would be proficient.”
The study sample consisted of 10 historians, 10 fact checkers, and 25 students. Each participant engaged in a variety of online searches while SHEG researchers observed and recorded what they did on-screen.
In one test, participants were asked to assess the reliability of information about bullying from the websites of two different groups: the American Academy of Pediatrics (AAP), the largest professional organization of pediatricians in the world, and the American College of Pediatricians (ACPeds), a much smaller advocacy group that characterizes homosexuality as a harmful lifestyle choice.
“It was extremely easy to see what [ACPeds] stood for,” Wineburg says—noting, for example, a blog post on the group’s site that called for adding the letter P for pedophile to the acronym LGBT. Study participants were asked to evaluate an article on the ACPeds website indicating that programs designed to reduce bullying against LGBT youth “amount to special treatment” and may “validat[e] individuals displaying temporary behaviors or orientations.”
Fact checkers easily identified the group’s position. Historians, however, largely concluded that both organizations’ sites were reliable sources of information. Students overwhelmingly judged ACPeds’ site the more reliable of the two.
In another task, participants were asked to perform an open web search to determine who paid the legal fees on behalf of a group of students who sued the state of California over teacher tenure policies in Vergara v. California, a case that cost more than $1 million to prosecute. (A Silicon Valley entrepreneur financed the legal team, a fact not always mentioned in news reports about the lawsuit.) Again, the fact checkers came out well ahead of the historians and students, searching online sources more selectively and thoroughly than the others.
The tasks transcended partisan politics, Wineburg says, pointing out that advocates across the political spectrum promulgate questionable information online.
“These are tasks of modern citizenship,” he says. “If we’re interested in the future of democracy in our country, we have to be aware of who’s behind the information we’re consuming.”
Read laterally, not vertically
The fact checkers’ tactic of reading laterally is similar to the idea of “taking bearings,” a concept associated with navigation. Applied to the world of internet research, it involves cautiously approaching the unfamiliar and looking around for a sense of direction. The fact checkers “understood the web as a maze filled with trap doors and blind alleys,” the authors write, “where things are not always what they seem.”
Wineburg and report coauthor Sarah McGrew observed that even the historians and students who did read laterally did not necessarily probe effectively: they failed to use quotation marks when searching for exact phrases (for example, “Vergara v. California”), or they clicked indiscriminately on links that ranked high in search results, not understanding how that ordering is influenced by search engine optimization. The fact checkers showed what the researchers called click restraint, reviewing search results carefully before deciding where to click.
The report’s authors say their findings point to the need for updated guidelines that teach users of all ages how to assess credibility on the internet. Many schools and libraries offer checklists and other educational materials built on largely outdated criteria, Wineburg says. “Their approaches fit the web circa 2001.”
In January, SHEG will begin piloting new lesson plans at the college level in California, incorporating internet research strategies drawn from the fact checkers’ tactics. Wineburg sees it as one step toward updating a general education curriculum to reflect a new media landscape and the demands of civic engagement.
In the state’s 2016 election alone, he notes, voters faced 17 ballot initiatives. “If people spent 10 minutes researching each one, that would be an act of incredible civic duty,” he says. “The question is, how do we make those 10 minutes count?”
Source: Carrie Spector for Stanford University