There are games (still) afoot!
Starting around 2015/16, the term "fake news" was everywhere—to the point that it became meaningless. Since then, terminology has shifted towards "misinformation" (that is, misleading or false information), and there has been a deluge of media coverage, studies, trainings, paradigms, and curricula for all ages to learn skills to stop the spread of misinformation (DPL among them). However, only about six months ago, a poll by the Associated Press and NORC found that 95% of Americans still think the spread of misinformation is a problem, and 81% identified it as a major problem.
I agree, and that's why I'm passionate about using my skills as a Reference Librarian to address this problem. I'm not always a fan of placing the onus of combating systemic problems on the shoulders of individuals. But I do believe that there are things individuals can do to stem the spread of misinformation, and I believe those skills are important to learn. Misinformation has long been a part of our world and it is not going away anytime soon. We can each do our part by becoming misinformation detectives. First, a bit of background before we put on our detective hats (deerstalker, fedora, or otherwise).
The Story So Far
There are a lot of anti-misinformation tools out there, many of which are aimed at students, and rightly so. The Stanford History Education Group (SHEG) published a study in 2016 showing that students struggled to identify credible information online despite being so-called "digital natives." (Thankfully, new research shows that even a relatively small amount of instruction may help these skills develop.)
I am a public librarian who works primarily with adults. I don't have dedicated time to work with program participants repeatedly on teaching and applying information literacy skills. I may see them for one hour one time ever (or not at all if they're reading my blog posts!).
But we adults don't learn to spot misinformation on our own, even if we are highly educated. SHEG published another study that compared how three groups judged the credibility of online information: Stanford undergraduates, historians with PhDs, and professional fact-checkers. The fact-checkers blew the other two groups out of the water, and historians and undergrads alike were easily fooled by website appearances.
SHEG used what they learned from these studies to develop their Civic Online Reasoning (COR) curriculum, which is designed for classroom teachers, but also translates well to adults because, at its core, it revolves around three common-sense questions that will always be relevant:
- Who's behind the information?
- What's the evidence?
- What do other sources say?
Professor Mike Caulfield developed a similarly flexible paradigm called SIFT, which stands for Stop, Investigate the source, Find trusted coverage, and Trace claims to their origin. Together, these are the skills we can use to transform ourselves from information consumers to information detectives.
First: a Gut Check
Anyone can be fooled by misinformation, and some of its creators take great care to hide nefarious content. It's also hard to take time to evaluate every piece of information you come across. One easy way to spot a misinformation red flag is to notice how a piece of information makes you feel. A lot of misinformation is designed to bypass your rational thinking by provoking strong emotions. (You can read more on this in a past blog.)
Even if you're not feeling outraged, elated, or disgusted, it's a good idea to take a brief moment to ask whether you know what you're looking at and to gauge your familiarity with the source putting out the information. It may not be worth engaging with the information at all, and deciding that early can save you time and anguish.
Lateral Reading: a Background Check
If you decide to engage with a piece of information after pausing, it's time to move from gut check to background check. Both the COR and SIFT frameworks center on the fact that information is not created in a vacuum. All information has a creator, and every creator has some kind of bias or even a specific agenda. What set the fact-checkers apart from the historians and undergrads in the SHEG study was that their first step was not to study the information in depth. They skimmed the content and then left the original source to learn about its creator from outside sources.
This act (leaving the page to investigate who published the information) is called "lateral reading" or "reading across," meaning the fact-checkers sought outside information by opening a new browser tab. The historians and students read vertically: they let the original source speak for itself, and they gave outsized importance to surface-level elements, such as graphic design or whether a website used a .org domain, when evaluating credibility. It may seem counterintuitive, but think about it: is any content creator going to intentionally make themselves look bad? If they are trying to be persuasive, are they going to be upfront about why they might not be trustworthy?
One of the examples SHEG has used in their studies is the website co2science.org. On its surface it has a lot going for it: it has "science" in its name, it uses a .org domain, it's a nonprofit organization, it has science-y accoutrements like graphs and databases, its copyright is up-to-date, and it's published by the official-sounding Center for the Study of Carbon Dioxide and Global Change. Seems legit, right?
[puts on detective hat] Or is it?
If I peel that official-sounding name off this webpage, open a new tab, and do a Google search, I find it's not that simple. Wikipedia is usually a good starting place for lateral reading, and it is what fact-checkers tend to use as a jumping-off point. Wikipedia is not perfect, but it is widely used, it has incredible breadth, and ideally every fact is linked to an external reference source. I can view these external sources by hovering my cursor over the citation links (the superscript, bracketed numbers embedded throughout the article) or by going straight to the References section near the bottom.
The Wikipedia entry for the Center for the Study of Carbon Dioxide and Global Change notes that news outlets including The Guardian and Mother Jones have identified its ties to the fossil fuel industry, and that others (including the Seattle Post-Intelligencer and USA Today) have noted that its major funders include the ExxonMobil Foundation and Peabody Energy (the country's largest coal mining company). If I weren't familiar with these news sources, I could open another tab to learn more about them. But even if I go no further, I have learned something that makes me view CO2 Science in a different light: I'm not sure I can trust claims about climate change from an organization that gets large donations from major energy companies with a vested interest in the continued use of non-renewable energy.
I could have read CO2 Science's "About Us" page and tried to interpret the organization's trustworthiness from its own words, which are probably written to make it look infallible. I could have wasted time wading into graphs and statistics I don't understand. But based on only a couple of minutes spent looking into the creator, I can now examine CO2 Science in more depth with a skeptical eye, or stop looking at it completely if it won't suit my informational purpose.
You can learn more about these methods by watching Mike Caulfield's walkthrough of SIFT or SHEG's playlist of COR videos, including Intro to Lateral Reading. And stay tuned in the coming weeks for more blog posts on these misinformation sleuthing skills!
If you'd like to learn more about these and other skills to help you spot and stop misinformation, please register for the next session of our online program "How to Spot Misinformation," which will be held the evening of May 18. In the meantime, if you need to get in touch with a librarian on this or any other topic, just ask!