On April 19, 2017, I participated on a panel of presenters at the Carmel City Library (Indiana) dealing with the topic of “Fake News.” The other panelists included an assistant professor at the Indiana University Media School and President of the Indiana Coalition for Open Government; an investigative reporter for a local television station; and an assistant news director for another local television station.

Fake News is of great interest currently, presumably because of the recent Presidential election, which brought Donald Trump to the White House. The concern goes beyond the US, however. A prominent story in the Wall Street Journal of April 15, 2017, was headlined “Fake-News Flow Puts Facebook to Test in France.” Facebook reported that it had vetted over 30,000 accounts in France alone the week before national elections there. Similar concerns arose in preparation for elections in Germany as well. A Google search for “Fake News” brings up hundreds if not thousands of sites. It is a hot topic at the moment.

The journalists on the panel expressed frustration with the difficulties they experience in trying to combat this wave of disinformation. As a professor of communication and author of a book on communication ethics, I am quite worried also.

Certainly disinformation and falsified reports have been a part of history for some time. One need only think of documents such as the “Protocols of the Elders of Zion,” first published in Russian in 1903 (and reproduced many times in the 20th century). There was also the impact of so-called “yellow journalism,” including the reporting on the explosion of the US warship Maine in Havana harbor, which helped lead to the Spanish-American War of 1898. The current phenomenon seems more widespread and dangerous for three reasons. First, there is the effect of social media, which accounts for the rapid dissemination of fake stories. Second, there is the polarization of the present political climate, which motivates people to latch on to reports that could put the other side in a bad light. Third, there are the 24/7/365 demands of the cable news networks for a continuous supply of scandalous or embarrassing scoops. Think of the fake news story surrounding the Comet Ping-Pong pizza shop in Washington, D.C. The story sped around social media networks, claiming that Hillary Clinton and other high-ranking officials in the Obama Administration were running a child pornography ring out of the pizza place (“Comet Ping-Pong” shares its initials with “child pornography,” C. P.; an order for “cheese pizza” was supposed to be code as well). One man eventually showed up at the store with a rifle, fired some shots, and stated that he had come to “self-investigate” the so-called ring.

The tack that I took on the panel was to focus on the receiving side of the communication process. Fake News seems more dangerous now than in the past because of laypeople’s difficulty in critically analyzing the messages bombarding them from social and traditional media. Many psychologists and philosophers challenge the notion that we are in fact as rational as we like to think. Joseph Heath, a Canadian philosopher, suggests in his 2014 book Enlightenment 2.0 that we are neither rational nor logical most of the time. Critical thinking abilities are derailed depending upon how we process incoming information. Among the many examples in his book, Heath points to our need to see patterns, even where none exist. Daniel Kahneman, the Nobel Prize winner in economics, stresses this point in his best seller Thinking, Fast and Slow. The two modes of thinking have been labeled “System 1” and “System 2.” Most of the time we get by with the quick (and dirty) process of System 1: we accept the first plausible (or even implausible) pattern we perceive without much thinking.

System 1 thinking (fast thinking) relies on ready-to-hand rules of thumb that allow us to function amid all the events and messages we face every day. These rules are termed “heuristics,” from a classical Greek word originally meaning learning tools. For example, Heath reports this case. He was presented with the following problem: a friend is thinking of a rule that will generate a series of numbers, such as 2, 4, 6. Heath was to suggest more series, and the friend could answer only yes or no as to whether each new series fit the rule. So Heath tried 14, 16, 18. “Yes,” said the friend, that fits the rule. So he tried 118, 120, 122, thinking he had already discovered the rule. “Yes,” the friend said again. “Aha,” said Heath, who described the rule as ascending, consecutive even numbers. “No,” said the friend. Heath had quickly jumped to a conclusion based on what he saw as a pattern. The actual rule was any ascending numbers, consecutive or not, even or not, so 12, 37, 134 also fit the rule. The mistake Heath made, one of which we are all guilty, was failing to ask enough questions: our fast process steers us away from asking questions that would disconfirm our original hypothesis of ascending even numbers. The way to proceed would have been for Heath to try a series that was not composed of ascending consecutive even numbers. If 12, 37, 134 fit the rule, then he would know his first assumption was incorrect and be led to a new formulation that would probably be correct. This is the basis of the scientific method: to test a hypothesis by trying to find ways to disprove it (see the philosopher Karl Popper, who made the attempt to falsify a hypothesis or theory the basis of the scientific method).
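Heath’s anecdote follows the shape of Peter Wason’s classic “2-4-6” experiment, and the logic of it can be made concrete in a few lines of code. The sketch below is purely illustrative (the function names and the specific test series are assumptions for the example, not anything from Heath’s book): it shows why series that merely confirm the guessed rule can never expose the error, while a single series chosen to disconfirm it settles the matter.

```python
def hidden_rule(seq):
    """The friend's actual rule: any strictly ascending numbers."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def guessed_rule(seq):
    """Heath's premature hypothesis: ascending consecutive even numbers."""
    return (all(n % 2 == 0 for n in seq)
            and all(b - a == 2 for a, b in zip(seq, seq[1:])))

# Confirming tests: every series that fits the guess also fits the hidden
# rule, so a string of "yes" answers can never reveal the mistake.
for trial in [(14, 16, 18), (118, 120, 122)]:
    assert hidden_rule(trial) and guessed_rule(trial)

# A disconfirming test: propose a series the guess predicts should fail.
probe = (12, 37, 134)
assert not guessed_rule(probe)  # the hypothesis says "no" ...
assert hidden_rule(probe)       # ... but the friend says "yes": guess refuted
```

The point of the sketch is Popper’s: only the probe that the hypothesis forbids carries any information about whether the hypothesis is wrong.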

In fast thinking (System 1), we latch onto the first plausible pattern we think we see. This “confirmation bias” is the basis for many conspiracy theories, in which people think they see a pattern where none exists. It is also the basis for stereotyping. We think we see a pattern showing that “those people” are prone to a certain way of acting, and so treat all members of that group accordingly. These perceived patterns are usually resistant to change. As Ian Mitroff, an adjunct professor at UC Berkeley, points out in a blog post, “Reality Wars: The Battles over Truth and Reality” (posted June 19, 2015): “Ideally, in science, an idea is accepted as ‘provisionally true’ if and only if it survives repeated attempts by scientists to prove by ‘hard data’ that it’s false.” He continues in the next paragraph: “In contrast, in everyday life, people hold onto their ideas for as long as possible. Indeed, the more an idea is at the core of a person’s belief, the more he or she tries to protect it.” Think of the venerable theory of cognitive dissonance. And, as Jonathan Haidt reminds us, “Belief comes first, logical reasoning comes second.” Smart people can believe false things because they are smart enough to devise rationalizations for cherished beliefs.

What to do? Perhaps our ethical responsibility, as consumers of potentially Fake News, is to be aware of these error-inducing tendencies. We need to exercise argumentative self-control, making the effort to consider alternatives to our own reasoning and evidence. Ethical argumentation lies in our commitment to maintain vigilance, especially when we feel strongly committed to our positions.

William W. Neher
Bill Neher is professor emeritus of communication studies at Butler University, where he taught for 42 years. Over those years he served as Dean of the University College, Director of the Honors Program, Head of the Department of Communication Studies, Chair of faculty governance, and most recently as the first (interim) Dean of the new College of Communication, begun in June 2010. He is the author of several books dealing with organizational and professional communication, ethics, and African studies, plus several public speaking and communication textbooks.
