“Can’t we just be rational about this?” Probably you have heard or uttered this admonition before, perhaps more than once. There is an assumption (in fact, the basis for this site and this blog) that we can and ought to be rational, especially when making arguments affecting people in public arenas or dealing with significant issues of public policy.
But can we always rise to the challenge of applying faultless reasoning in our public decisions and arguments? Our first inclination is to say, “Of course.” After all, we are rational human beings: Homo sapiens, the “wise man.” Some anthropologists even classify modern humans in the sub-species Homo sapiens sapiens, wiser still, it would seem.
Still, many psychologists and philosophers challenge the notion that we are in fact as wise or as rational as we like to think. The Canadian philosopher Joseph Heath, in his 2014 book Enlightenment 2.0, suggests that we are not logical or rational most of the time. At the beginning of the book, he offers this thought puzzle (p. 27):
The Marriage Problem:
Bill is looking at Nancy, while Nancy is looking at Craig.
Bill is married. Craig is unmarried.
Is a married person looking at an unmarried person?
Answer: A) yes, B) no, C) cannot be determined.
What answer do you choose? Most people say the puzzle seems easy to figure out. Do you agree? What if I reveal that the answer is not C; does that change the way you think about the puzzle? (Hint: it is not necessary to know which married person is looking at which unmarried person.)
The reason the marriage problem is more difficult than it first appears has to do with the different thinking styles or processes we apply when faced with real-life problems. Daniel Kahneman, a Nobel prize-winner in economics (although his academic field is psychology), describes these styles as “thinking fast” and “thinking slow.” Most of the time we are just fine employing “fast thinking”; it serves well for most of the quick decisions we have to make every day. “Slow thinking,” however, takes more effort and does not come to us as naturally or easily as “fast thinking.” Kahneman and his colleague, Amos Tversky, conducted a well-known series of studies over several years demonstrating that how issues are “framed” (that is, how they are presented) goes a long way toward determining how we think about them. The result is that we are often misled and fall into various kinds of reasoning fallacies. These fallacies result from our use of mental shortcuts called “heuristics.”
As an example, Kahneman and Heath both discuss the “confirmation bias”: when we seek only confirming evidence for a favored hypothesis, we are often misled. Take this case, also from Heath’s book. He was challenged by a friend to state the rule behind the formation of a series of three numbers. The friend first gave him an example that fit the rule: 2, 4, 6. Heath was then to make three guesses of series that fit the rule; his friend would confirm or deny whether each series fit. His first guess was 6, 8, 10. Yes, that fit the rule. Then 32, 34, 36, and again that fit the rule. So Heath tried 118, 120, and 122 (or something like that), and again this fit the rule. Heath concluded that the rule was ascending, consecutive even numbers. No, he was told, that is incorrect. Can you see where Heath went wrong? The rule actually was any three numbers in ascending order, so that 17, 94, and 243 also fit. What happened was that Heath, like most of us, assumed he saw a pattern right away and stayed with it. It did not occur to him to try to “falsify” his conjecture by testing a completely different pattern (such as a series that was not all even numbers). That is the confirmation bias at work: when we think we see a pattern, we tend to assume it is the real situation and look only for confirming evidence. This bias is the root of many tendencies, such as seeing conspiracies (a false pattern) where none may actually exist.
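The logic of Heath’s guessing strategy, and of a genuinely falsifying test, can be sketched in a few lines of code. The two rules are the ones stated in the story; the function names are mine:

```python
def friends_rule(a, b, c):
    # The actual rule: any three numbers in ascending order.
    return a < b < c

def heaths_conjecture(a, b, c):
    # Heath's guess: ascending, consecutive even numbers.
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Heath's three guesses all fit his own conjecture, so each "yes"
# from the friend could never have refuted it.
guesses = [(6, 8, 10), (32, 34, 36), (118, 120, 122)]
print(all(friends_rule(*g) and heaths_conjecture(*g) for g in guesses))  # True

# A falsifying probe deliberately violates the conjecture. The friend's
# "yes, that fits" refutes "consecutive even numbers" on the spot.
probe = (17, 94, 243)
print(friends_rule(*probe), heaths_conjecture(*probe))  # True False
```

The point of the sketch is that every guess satisfying your own conjecture is an uninformative test; only a guess that your conjecture forbids can ever surprise you.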
To overcome the natural pull of the confirmation bias, the philosopher of science Karl Popper argued for the procedure of “falsification.” A truly scientific proposition is one that could, in principle, be shown to be false. Scientists therefore look to falsify or disconfirm their hypotheses and theories. If it is not possible to specify how one would disconfirm a theory, it is not scientific (it may still be true, just not science). At the end of World War II, Popper, who had escaped from Austria at the time of the Nazi takeover of that country, published a major work on the application of rational argument to political matters, The Open Society and Its Enemies.
Our perception of patterns can mislead us in other ways. A famous example from Kahneman and Tversky describes a woman named Linda: 31 years old, single, outspoken, very bright, a former philosophy major who in college was concerned about issues of discrimination and social justice. The subjects in their experiment are asked to choose the “most probable” of two responses: 1) Linda is a bank teller; or 2) Linda is a bank teller and active in the feminist movement. Most people chose number 2, even though two events occurring together is never more probable than either event by itself. Option two seemed more “representative” of Linda than option one, even though mathematically option one is the more probable. You may think it more likely that she is a feminist than a bank teller, but that was not one of the choices.
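The arithmetic behind the Linda problem is the conjunction rule: a conjunction is never more probable than either of its parts. The probabilities below are made-up placeholders; the inequality holds no matter what values you plug in:

```python
# Hypothetical probabilities, chosen purely for illustration.
p_teller = 0.05                 # P(Linda is a bank teller)
p_feminist_given_teller = 0.6   # P(Linda is a feminist, given she is a teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_both = p_teller * p_feminist_given_teller

print(round(p_both, 2))      # 0.03
print(p_both <= p_teller)    # True, for ANY choice of probabilities
```

Since a conditional probability can never exceed 1, multiplying by it can only shrink (or at best preserve) the probability of the first event.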
Of course, seeing patterns and being subject to “false positives” (seeing a pattern when none actually exists) once had survival value. If you treat that rustling in the bush as a threat and take evasive action, you are more likely to survive in the long run, even though you are wrong most of the time. You needed to ignore the perceived pattern only once, at the wrong time, to become the victim of a snake or other predator.
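A back-of-the-envelope expected-cost calculation shows why a bias toward false positives can pay off. All of the numbers here are invented purely for illustration:

```python
# Invented payoffs: a false alarm is cheap, a missed predator is catastrophic.
p_predator = 0.01         # the rustling is a real threat 1% of the time
cost_false_alarm = 1      # energy wasted fleeing from nothing
cost_missed_threat = 1000 # being caught by the predator

cost_if_always_flee = cost_false_alarm                    # pay 1 every time
cost_if_always_ignore = p_predator * cost_missed_threat   # 10 on average

print(cost_if_always_flee < cost_if_always_ignore)  # True
```

As long as a miss is far costlier than a false alarm, fleeing every rustle beats ignoring them, which is why “fast thinking” evolved to over-detect patterns.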
Heath, Kahneman, Tversky, Popper, and others conclude that “slow thinking,” applying deeply rational thought, requires several conditions. It takes more time, it is usually more difficult than quick decision-making, and it requires thinking abstractly and working things out in language. One clear conclusion from the psychological work in this area is that slow thinking is not necessarily related to intelligence. As Heath and others point out, there is not a correlation between intelligence and the use of the various biases and heuristics in thinking.
So, do we know if a married person is looking at one who is unmarried? The answer is A) Yes. Nancy can be either married or unmarried (no other possibility exists). If unmarried, then Bill is looking at her. If she is married, she is looking at Craig. Therefore, at least one married person is looking at an unmarried person, regardless of Nancy’s actual marital status.
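That case analysis can be checked mechanically by enumerating Nancy’s two possible states. This is a small sketch: the setup comes from the puzzle, the function name is mine:

```python
def married_person_looking_at_unmarried(nancy_is_married):
    # Marital status of each person; only Nancy's is unknown.
    married = {"Bill": True, "Nancy": nancy_is_married, "Craig": False}
    # Who is looking at whom, per the puzzle.
    looking_at = [("Bill", "Nancy"), ("Nancy", "Craig")]
    return any(married[a] and not married[b] for a, b in looking_at)

# In both possible worlds, a married person is looking at an unmarried one.
print(married_person_looking_at_unmarried(True))   # True (Nancy -> Craig)
print(married_person_looking_at_unmarried(False))  # True (Bill -> Nancy)
```

Exhausting every unknown case like this is exactly the kind of deliberate “slow thinking” the puzzle demands, which is why the intuitive answer C feels so plausible.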
William W. Neher
Bill Neher is professor emeritus of communication studies at Butler University, where he taught for 42 years. Over those years he served as Dean of the University College, Director of the Honors Program, Head of the Department of Communication Studies, chair of faculty governance, and most recently as the first (interim) Dean of the new College of Communication, begun in June 2010. He is the author of several books dealing with organizational and professional communication, ethics, and African studies, plus several public speaking and communication textbooks.
The Conference on Ethics and Public Argumentation, housed in the Butler University College of Communication, serves as CCOM’s academic hub for promoting the ethical use of reasoning and rationality in public deliberation.