There is enormous diversity among Christians in what we believe and in how we understand various passages in the Bible. This is a simple and straightforward observation; at least on this, we should be able to agree. But what causes it? Is it something about the Bible? Or is it something in us, humans? I believe it is both.
It’s the Bible
This is not the subject of this issue, so I merely mention this in passing. If God intended the Bible to be a clear-cut and unmistakable instruction manual for human life and faith, we would have to conclude that it has failed its purpose. It is therefore more likely that God had something different in mind.
I do not at all mean to say that the Bible is not a guide and foundation for faith and life, just that it does not fulfil this function in an easy, simple, and straightforward manner. Part of the problem is the nature of the Bible; it is not an easy book.
Still, given 2000 years of concentrated study and thought, why can we not agree about so many issues?
It’s Us
I am looking beyond the (very real) possibility of sin, rebellion, and wilful self-deception interfering with interpretation. I am assuming the best here. Even committed and reasonably well-informed Christians with a heart for God and for truth disagree with each other. What else, beyond sin, is involved?
The hard truth is that we are not as good at thinking as we think we are.
Here are some things social scientists have discovered about how humans think, as summarized in Daniel Kahneman’s book Thinking, Fast and Slow.
Two Systems
Kahneman argues there are two systems at work in our minds. System 1 is fast and almost automatic. It recognizes patterns. Based on experience, it notices similarities. It comes up with associations (this reminds me of that). It also infers relationships of cause and effect. It provides us with an explanation or view of what is going on around us.
This view is always (more or less) coherent but not necessarily right or accurate. System 1 makes sense of things; this is the good news. It often does this by “jumping” to conclusions; this is the bad news. To put this in different, more critical terms: “System 1 runs ahead of the facts in constructing a rich image on the basis of scraps of evidence” (Kahneman 2011:114).
System 1 is how intuition works. We “know” something to be true or right (or not) before we can give reasons for why this would be so.
2 + 2 = 5
You did not have to think about this to recognize it as false (I hope).
But what is the square root of 900? That is a little harder. Which number multiplied by itself gives us 900? You probably have to think about this. You are now using system 2. (The answer, by the way, is 30.)
System 2 is slow and deliberate. It is much more thorough than system 1 (and better at things like analysis and logic), but it is also lazy. Much of the time, it lets system 1 do the work. If it decides to go into action, we tend to find this hard work; it takes a lot of energy to think this way.
Let’s apply this to reading the Bible. Much of the time we simply accept, unthinkingly, the suggestions offered by system 1 as to what the words and sentences mean. This kind of reading is easy and enjoyable. Equally unthinkingly, we follow the beliefs and interpretations we have picked up in the past. Most of the time, it makes sense to read this way; we would not get much reading done if we worried about the meaning of every single word.
It is only when we encounter surprise or dissonance that we stop to really think about what we just read. Something unexpected shows up or something does not seem to fit; now system 2 wakes up. Or we are confronted with alternative views on an issue. In the latter case, our first, semi-automatic response is most likely defence, not truth-seeking. We look for reasons why our view is right and the other view is wrong. To do otherwise would require even more energy, and our default mode is to take the path of least resistance.
This is just one of many flaws and weaknesses in our thought processes. System 1 is right much of the time, but it is susceptible to systematic error (bias) and it can be naïve. System 2 has its faults as well. Let’s look at a few.
Systematic Errors
Confirmation bias. We pick up the verses that confirm our beliefs and overlook those that conflict with them. When it comes to the Bible, we call this proof texting, but it is not a uniquely Christian problem; it is deeply ingrained in how our brains work. This is called confirmation bias: when considering a claim or hypothesis, we tend to look for evidence and information that confirms it, not for information that speaks against it. The more we are emotionally attached to an idea, the harder it becomes to honestly consider evidence against it.
In SBS terms, our default mode is deductive reasoning in order to prove preconceived ideas.
Substituting an easier question. If system 1 cannot answer a question, it will look for an easier one to answer instead. This makes good sense if a quick and rough assessment will do, which in daily life is often (but not always) the case. Let’s apply this to interpretation.
Whether an interpretation is true or right is often a difficult question. We may unconsciously settle for the answer to an easier question, such as: Do I like this interpretation? Does it match what I have heard in the past? Is it held by people I trust? Does it sound biblical (whatever that means)?
Anchoring. The first thing we hear, the first answer given, the first explanation we read has a disproportionate influence on the interpretation we embrace in the end.
In experiments, even a completely random and unrelated number had significant influence over numerical guesses made by participants. In one case, people were shown either the number 10 or 65 and were then asked what percentage of nations in the United Nations are African. The average answer of group “10” was 25%; that of group “65” was 45% (Kahneman 2011:119). Clearly, the unrelated number pushed results up or down.
This anchoring bias is a good reason never to start with a Bible commentary, but always to study the text for yourself first.
Ignoring ambiguity. The job of system 1 is to provide a coherent interpretation of the input it receives. It will tend to ignore ambiguity and uncertainty as much – and as long – as possible. As Kahneman (ibid.:114) puts it: we favour certainty over doubt. As a result, we remain unaware of the weak spots in our interpretations. This is one factor that leads to overconfidence (more below).
WYSIATI: What You See Is All There Is (ibid.:85). Unlike system 2, system 1 does not consider alternatives, and it does not notice when important information is missing, even if that information would be crucial for a reliable conclusion. It is as if system 1 assumes that the information it has is complete and sufficient; anything that is unknown is deemed irrelevant.
It is easy to see how this can distort interpretation and lead different interpreters, both working with limited but different data sets, down different paths.
The understanding illusion. It is almost impossible for system 1 not to give an explanation. System 1 is driven to make sense of things. But “sense” is evaluated by system 1 based on coherence: is this a good story or a good explanation (if so, we have understanding, even if it may be an illusory one)? System 1 does not ask: is it true? (a far more difficult question). Combined with the previous bias, WYSIATI, this makes for a formidable obstacle to true understanding:
It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern. (Ibid.:87)
The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgment is missing—what we see is all there is. (Ibid.:87)
The implications for Bible interpretation are obvious. We don’t easily identify information that is missing and we tend to be confident that our interpretations are sound, leading to:
Overconfidence. System 1 is satisfied when it can give a coherent explanation. The less we know about something, the easier it is for system 1 to come up with a story – and the more confident we tend to be about what we think we know.
Affect bias. We decide what is true based on whether we like it: what we like feels true. This works hand in hand with substituting easier questions: we may not know what is true or right, but we know which answer we prefer.
In addition, our acceptance of ideas is greatly influenced by our (dis-)like of the person arguing them. This is a well-known factor in politics. Whatever our preferred party or candidate proposes, we tend to uncritically like and accept; whatever the other side claims is easily rejected. Unfortunately, the same bias is at work in theology and biblical interpretation. If you like N.T. Wright, you may accept an interpretation by him that does not hold water; if you dislike Calvin, you may reject interpretations and ideas that are actually quite good, simply because they are (supposedly) Calvinism.
A different term for this bias is the halo effect: if we are favourably inclined toward someone or to a point of view, we tend to like everything associated with them, and vice versa. In reality, as thinking strategies, “Calvin said it, so I believe it” and “Calvin said it, so I oppose it” are equally poor. (If Calvin leaves you cold, fill in any name other than Jesus and God, and the statement still holds.)
Another consequence of affect bias: since we are obviously favourably inclined toward the Bible, due to the halo effect, anything that claims to be “biblical” already has one foot in the door.
Familiarity. Repetition makes us more willing to accept an interpretation or claim regardless of its merit: what is familiar feels true.
Cognitive ease. This is similar to the previous two: what is easy to understand feels true.
Unwarranted pattern recognition. Our brains are wired to recognize patterns – even where they are imagined. Think of recognizing shapes in the clouds sailing by. That’s cute and innocent, but reading imagined patterns into Scripture is not. The fact that something appears in the Bible more than once (say, people speaking in tongues when the Holy Spirit comes upon them, or the way women are treated in ancient Near Eastern and Roman culture) does not necessarily make it normative.
Remedies?
These biases are not easy to overcome, since they are part of the way our brains work. Jumping to conclusions is mostly a question of economics: it is fast and saves time and effort, and most of the time it suffices. So what can we do when interpreting Scripture, where jumping to conclusions will not do?
- Beware, especially of confirmation bias. Ask: what might speak against a point of view? Why do I lean toward this interpretation (is it because of ease, familiarity, or preference, or because everyone around me believes it)?
- Make an effort to consider alternatives: how else could this be understood?
- Consult others (for instance through commentaries and other reference works) to deliberately expose yourself to different viewpoints.
- Don’t unfriend people on Facebook who express an opinion you don’t like.
- Ask more questions.
- Ask: what is unclear or ambiguous?
- Ask: what information is missing? What else might be relevant? What do I not know?
- Learn to live with open questions and tensions rather than force a decision.
- Beware. Beware. Beware.
“Biblical”?
I end with one more example from Kahneman’s book:
“How many animals of each kind did Moses take into the ark?” The number of people who detect what is wrong with this question is so small that it has been dubbed the “Moses illusion” (ibid.:73).
Did you get it? Not many people would have been fooled if the question had been: How many animals did Donald Trump take into the ark? It is the combination of ideas and names that are all biblical, and therefore associated with each other, that makes the error difficult to detect.
This example involves only a simple error of fact. How much harder it is to catch errors in interpretation when loosely connected biblical ideas, passages, and background information are thrown together. After all, it would sound biblical, so can it possibly be wrong?
The answer, of course, is: yes.
Attribution
Dennis Jarvis, Deep in Thought… https://www.flickr.com/photos/archer10/11713640484 (CC BY-SA 2.0)
Lightbulb: https://pixabay.com/de/birne-licht-idee-strom-gl%C3%BChlampen-40701/ (CC0)
References
Daniel Kahneman (2011), Thinking, Fast and Slow (New York: Farrar, Straus and Giroux)
Disclosure of Material Connection: Some of the links in the post above are “affiliate links.” This means if you click on the link and purchase the item, I will receive an affiliate commission.
If you buy something through these links, you help me cover the costs of Create a Learning Site.