"Smart people (like smart lawyers) can come up with very good explanations for mistaken points of view."

- Richard P. Feynman, Physicist

"There is a danger in clarity, the danger of overlooking the subtleties of truth."

- Alfred North Whitehead

January 31, 2011

Irrational Persistence of Belief

Humans have a tremendous capacity to resolve conflicting and competing thoughts and evidence. When watching individuals confess to crimes and then later attempt to explain away both the evidence against them and the confession, I can’t help but be reminded of what Nietzsche once wrote: “I have done that, says my memory. I cannot have done that, says my pride, and remains adamant. At last, memory yields.”

Human beings have the ability to distort their views of reality in the direction that fulfills their desires. The tendency not to look for evidence that disfavors one’s preconceived ideas, and the tendency, upon finding such evidence, to find a way to disregard it, are both part of confirmation bias, as discussed in an earlier blog entry.

Researchers have shown that people maintain incorrect beliefs about things despite overwhelming evidence showing that their beliefs are incorrect. This tendency is called irrational persistence of belief.

Researchers have stated: “The irrational persistence of belief is one of the major sources of human folly, as many have noted. We tend to hold to our beliefs without sufficient regard to the evidence against them or the lack of evidence in their favor.”[i]

We have all explained the evidence and arguments against someone’s position, and then have heard the person say, “Whatever, I still believe my position is correct.” Irrational persistence of belief is combated by remaining open to counter-evidence and criticism of one’s ideas, including being able and willing to do self-critiques of one’s ideas—and then changing one’s position, if warranted. Rigid defensiveness about one’s ideas should signal a possible irrational belief.[ii]

One of the determinants of irrational persistence of belief relates to how people think about how they should think. Some beliefs about thinking lead to poor decision making, such as the belief that changing one’s mind is a symptom of weakness.[iii] Flip-flopping on an issue because it is politically expedient to do so is one thing. Changing one’s mind in light of new evidence, or in light of a better understanding of the evidence, is quite another. Maintaining a steadfast opinion, no matter how wrong it is, is not a virtue but a sign of poor thinking.

Decision makers should attempt to be open to all sides of an argument and to be aware of the tendency to continue believing in already-formed opinions, regardless of the evidence against them. We need to fight our desire not to be confused by the facts.



[i] Baron, Jonathan, 2008, Thinking and Deciding, Cambridge University Press, New York, New York, p. 203.

[ii] Ibid. p. 203.

[iii] Ibid. p. 213.

The views expressed in this blog are solely the views of the author(s) and do not represent the views of any other public official or organization.

January 24, 2011

Critical Thinking - The Similarity/Uniqueness Paradox

To “think like a lawyer,” law students are taught to “reason by analogy.” Reasoning by analogy is a form of inductive reasoning in which one compares the facts associated with one phenomenon with the facts associated with another phenomenon and then comes to a reasoned conclusion. In law, the phenomena being compared are usually the case before the court and a published appellate case.

The legal process uses reasoning by analogy to compare the set of facts in the case at hand with the set of facts in a published appellate case, and then to reach a conclusion on whether the cases are similar enough for the appellate case to serve as controlling precedent. Reasoning by analogy also comes into play in evidentiary questions, such as whether “other-acts” evidence is relevant to the issue for which it is offered. We compare the circumstances of the case before us with the circumstances of the “other acts.”

Reasoning by analogy involves a potential source of thought error called the similarity-uniqueness paradox.[i] This paradox states that all things are both similar and different. One can identify an infinite number of similarities between two things, and at the same time identify an infinite number of differences. The error arises: “…first, by allowing genuine differences to be obscured by similarities, and second, by allowing genuine similarities to be obscured by differences.”[ii]

For example, the first error is committed when we stereotype groups of people to come to a conclusion about a single individual. We see similarities and allow these similarities to obscure any differences.

I see this error also occurring in the context of sentencing, when the prosecution argues that the defendant is “pure evil.” I have not yet met the person about whom I could not find something good (or bad).

The second error is committed when we see only the differences and allow them to obscure the similarities between two groups. For example, one group of citizens may be comprised of Republicans and the other of Democrats, but both are groups of citizens of the United States, and both are groups of human beings.

The antidote to this error is first to identify the criteria that will be used for evaluating whether an item is similar or different. The reasons for the criteria must bear on a condition that is probative of an issue in the case. For example, why should we care, when comparing cases, that one car was red and the other white, or that both were white? Is the color of the car probative of an issue in the case? Why is the criterion probative?

After probative dimensions are selected, we must ask what similarities and what differences exist between the items being compared. We should be aware of “points of critical distinction,” the points at which two similar items start to differentiate.[iii]

Part of the job of an accomplished advocate is to take all cases that are arguably precedential and then, through arguing similarities and differences, show the Court, in a credible manner, how the cases support the client’s position. The Court must first decide whether the criteria for identifying the similarities and differences are probative of the issue, and then consider all the probative similarities and differences to determine the precedential mandate of those cases on the case before the court. Some analyses are easier than others.



[i] Levy, David A., 1997, Tools of Critical Thinking: Metathoughts for Psychology, Waveland Press, Inc., Long Grove, Ill., p. 34.

[ii] Ibid. p. 41.

[iii] Ibid. p. 43.


January 17, 2011

Critical Thinking - Confirmatory Bias

As a prosecutor, I was concerned about the accuracy of decisions in the criminal justice system. I did not want to charge, much less convict, a person who was innocent of the identified crime. Further, I wanted to find and convict those that had actually committed the crime. One of the areas that I identified for professional development for those in law enforcement was critical thinking, decision making, and judgment.

After becoming a judge, I thought it would be wise to further my reading about critical thinking, decision making, and judgment. I wanted, at the least, to be aware of the situations in which one is at heightened risk of making an erroneous decision. Much empirical research on these topics also exists. In the next several blog entries, I will address some of this research.

The first topic I want to discuss is one that is common in the criminal justice system: confirmatory bias. Confirmatory bias is the tendency to selectively collect and interpret information to confirm prior beliefs and to avoid information that may contradict those beliefs.[i]

Confirmatory bias can occur under three different scenarios.[ii] The first involves situations where people must interpret ambiguous evidence. The research shows a strong tendency for people to interpret ambiguous evidence in line with their initial beliefs. This type of bias includes “a propensity to remember the strengths of confirming evidence but the weaknesses of the disconfirming evidence, to judge confirming evidence as relevant and reliable but disconfirming evidence as irrelevant and unreliable, and to accept confirming evidence while scrutinizing disconfirming evidence.”[iii]

I asked law enforcement officers to assume that a suspect is innocent, and then consider the evidence. Does the presence of an item of evidence make you change your conclusion that the suspect is innocent? Can the evidence be interpreted in favor of the suspect? Much evidence can be interpreted as either supporting or refuting a claim. Anyone who has watched a trial understands that.

Research has shown that different areas of the brain are used when people consider information that does or does not favor a predetermined conclusion. “Neural information processing related to motivated reasoning appears to be qualitatively different from reasoning in the absence of a strong emotional stake in the conclusions reached.”[iv] One need only observe the stark differences in the interpretation of the same facts by strong political partisans of different persuasions.

A second situation involving confirmatory bias occurs when people find a correlation between phenomena separated in time when none exists, or fail to recognize a correlation when one does exist. People have a difficult time interpreting information about correlations between two things. This has been identified as one of the weakest parts of human reasoning.[v]

This bias underlies the power of using statistics rather than just human judgment to determine the efficacy of treatment modalities in the criminal justice system. If one hears someone extol the efficacy of anything, one should want to see the statistical evidence. Most of the time, the statistical evidence doesn’t exist at all, or it is so blatantly biased, in collection and interpretation, that it is useless.
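The point about statistics can be made concrete with a small, hypothetical example (all counts below are invented for illustration, not drawn from any study). Judging whether a treatment works requires comparing improvement rates across groups, not counting success stories in the treated group alone:

```python
# Hypothetical 2x2 outcome counts for a treatment program. People who
# focus only on the "treated and improved" cell see 80 success stories
# and perceive a correlation; the full comparison shows there is none.

def improvement_rate(improved, total):
    """Fraction of a group that improved."""
    return improved / total

# Invented numbers, chosen only for illustration.
treated_improved, treated_total = 80, 100   # looks impressive in isolation
control_improved, control_total = 40, 50    # untreated group, same outcome

treated_rate = improvement_rate(treated_improved, treated_total)
control_rate = improvement_rate(control_improved, control_total)

print(f"treated: {treated_rate:.0%} improved")   # 80% improved
print(f"control: {control_rate:.0%} improved")   # 80% improved

# Identical rates: these data show no association between the treatment
# and improvement, despite the 80 success stories in the treated group.
```

This comparison, run on real counts, is exactly what a proper statistical test of efficacy formalizes, and it is the comparison that unaided human judgment tends to skip.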

The third situation that involves confirmatory bias is a tendency to selectively collect only that evidence that confirms prior held beliefs. This can be a big problem, as many times, only a short window of opportunity exists to collect evidence of a crime. If confirmatory bias prevents the collection of other evidence, then that evidence can be lost.

Here, one should attempt to ask a question to disconfirm what one believes. For example, if one believes Joe committed the crime, one should not merely ask what would prove that Joe committed the crime, but should ask what evidence would prove that Joe did not commit the crime. Does Joe have an alibi that made it impossible for him to commit the crime? Is Joe’s explanation for what happened physically possible? Of course, one must be very careful in how this evidence is collected, so that confirmatory bias itself does not result in a misinterpretation of ambiguous evidence.

Confirmatory bias can feed itself. If one starts with the belief that a suspect is guilty, and then erroneously interprets ambiguous evidence in a way that supports that belief, the result can be a stronger belief in the defendant’s guilt and a greater tendency to engage in confirmatory bias. Confirmatory bias is a danger to truth seeking in the criminal justice system. We should be aware of it when making important decisions.



[i] Budowle, Bruce, et al., 2009, “A Perspective on Errors, Bias, and Interpretation in the Forensic Sciences and Direction for Continuing Advancement,” J. Forensic Sci., Vol. 54, No. 4, pp. 798-809, at 803.

[ii] Rabin, M., and J.L. Schrag, 1999, “First Impressions Matter: A Model of Confirmatory Bias,” Quarterly Journal of Economics, Vol. 114, No. 1, pp. 37-82.

[iii] Lord, C.G., L. Ross, and M.R. Lepper, 1979, “Biased Assimilation and Attitude Polarization: The Effects of Prior Theories on Subsequently Considered Evidence,” Journal of Personality and Social Psychology, Vol. 37, pp. 2098-2109.

[iv] Westen, Drew, et al., 2006, “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election,” Journal of Cognitive Neuroscience, Vol. 18, No. 11, pp. 1947-1958.

[v] Nisbett, R.E., and L. Ross, 1980, Human Inference: Strategies and Shortcomings of Social Judgment, Prentice Hall.


January 10, 2011

Detecting Lies Part 2

The first time I confronted the unreliability of a lie detection device was when I was a defense attorney. A law enforcement officer informed me that my client was guilty because he had flunked a voice-stress analysis. He said the machine showed my client had forged a lottery ticket (the ticket was a small winner), and the machine didn’t lie. Luckily for my client, I didn’t trust the machine, and I located witnesses who said that the ticket was forged by others as a practical joke at work. The charges were ultimately dismissed.

When I was a prosecutor, the detection of lies was an important part of what I did. People were charged, or not charged, based on a credibility determination of the witnesses. Sex crimes were often involved. Was the sex consensual or not? Was there sexual contact or not? Under certain circumstances, I would consider the results of polygraph examinations, along with all other evidence, in making charging decisions. The evidence regarding the efficacy of polygraphs in detecting lies was mixed, but evidence existed that a polygraph result indicating truthfulness had some reliability.[i]

Polygraph evidence is not admissible in Wisconsin Courts. State v. Dean, 103 Wis. 2d 228, 279, 307 N.W.2d 628 (1981). Anything that a defendant says during a polygraph examination is also not admissible. State v. Schlise, 86 Wis. 2d 26, 42-44, 271 N.W.2d 619, 627 (1978). Statements that a defendant makes after the polygraph examination is over may be admissible. State v. Johnson, 193 Wis. 2d 382, 388, 535 N.W.2d 441, 442-443 (Ct. App. 1995); State v. Greer, 2003 WI App 112, ¶. Also, an offer to take a polygraph test can be relevant to a defendant’s credibility and may be admissible at trial for that purpose. State v. Pfaff, 2004 WI App 31, ¶26.

I also have experience with computerized layered voice analysis, a more sophisticated form of voice stress analysis that law enforcement also used when I was a prosecutor. My opinion is that it is about as accurate as a voice stress test.

The problem with all lie-detecting devices is that “[t]here is no evidence that any pattern of physiological reactions is unique to deception.”[ii] Based on my experience, another error can arise from biased interpretation of the results through confirmation bias. The interpreters of the results often find the results that they were looking for in the data.

Newer techniques have been developed with the advent of brain imaging. These techniques are based on the theory that unique circuits in the brain may be activated when one lies. A recent study examined one of these techniques. The researchers concluded that no single area of the brain could be identified as helpful in detecting deception.[iii] They correctly identified 71% of lying subjects. The researchers stated that functional MRI is a more direct measurement of what is happening in one’s brain; however, the same problem of identifying a unique pattern of brain activity associated with lying remains.[iv]
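To see why a 71% detection rate is less decisive than it sounds, one can run the number through Bayes' rule. The sketch below is illustrative only: the 71% sensitivity is the figure reported in the study discussed above, but the false-positive rate and the base rate of lying are assumptions invented for the example.

```python
# Bayes' rule applied to an fMRI-style lie detector. Sensitivity (71%)
# comes from the study discussed above; the false-positive rate and the
# base rate of lying are assumed values for illustration only.

def posterior_lying(sensitivity, false_positive_rate, base_rate):
    """P(subject is lying | test flags the subject as lying)."""
    true_positives = sensitivity * base_rate
    false_positives = false_positive_rate * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

p = posterior_lying(sensitivity=0.71,
                    false_positive_rate=0.30,  # assumed
                    base_rate=0.10)            # assumed: 1 in 10 subjects lies

print(f"P(lying | flagged) = {p:.0%}")  # about 21%

# Even with 71% sensitivity, a flagged subject is probably telling the
# truth under these assumptions, because truthful subjects outnumber
# liars and some of the truthful ones are falsely flagged.
```

The exact posterior moves with the assumed rates, but the lesson is general: a detection rate means little for any individual result until it is weighed against the false-positive rate and how common lying is among those tested.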

If anyone is interested in further research regarding the detection of lies, and a discussion of the research regarding polygraphs, I can recommend a book written by one of the preeminent scholars in the field, Paul Ekman, entitled Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. Ekman discusses, in detail, the challenges in detecting lies. We, as well as jurors, probably have an erroneous confidence in our ability to detect lies.



[i] Ekman, Paul, 2001, Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage, W.W. Norton & Company, New York, pp. 215-223.

[ii] “The Truth About Lie Detectors (aka Polygraph Tests)”, 2004, American Psychological Association.

[iii] Monteleone, et al., 2009, “Detection of deception using fMRI: Better than chance, but well below perfection,” Social Neuroscience, Vol. 4, No. 6, pp. 528-538.

[iv] Ibid.


January 3, 2011

Detecting Lies

Most people involved in the criminal justice system, or for that matter any adversarial process, are often engaged in attempting to detect lies. When I was a prosecutor, the people I worked with used all manner of techniques to detect lies, from determining whether or not someone “looked them in the eye” to polygraphs and computerized layered voice analysis. None of these techniques is known to be foolproof. Let’s examine what the research says about lie detection.

Researchers recently completed a meta-analysis of 247 samples to determine the differences in people’s ability to detect lies and in their ability to lie without being detected.[i] First, it is well accepted that people are not accurate at detecting lies. The research literature confirms that an individual’s ability to detect a lie by merely watching a person tell it is little better than chance.[ii] In the real world, outside a research laboratory, such as in a courtroom, people attempt to detect lies by relying on motivational information about the person (does he have a reason to lie?), other physical evidence, or other people’s testimony.[iii] If a statement conflicts with what the perceiver believes to be more credible evidence, the perceiver may decide someone is lying.

However, sometimes we have nothing other than the statement of the sender (the person making the statement) on which to judge whether they are lying. Bond et al. (2008) attempted to measure whether there were differences in individuals’ ability to detect deception. Their answer was no. Individuals do not appear to vary in their ability to detect deception. They state: “These data provide no evidence that the best lie detection performances in this research literature reflect any extraordinary ability. The highest detection rates are no higher than chance would produce.”[iv]

These researchers also looked at differences in perceivers’ credulity, “the general predisposition to regard others’ statements as truthful.”[v] The research showed that perceivers differ in their credulity. Some people are more likely to believe a statement is true regardless of whether it is, and others are more likely to think a statement is a lie regardless of whether it is. However, neither type of perceiver was any better than the other at detecting lies in a laboratory setting.[vi]

Much larger differences were found among individuals’ ability to lie without being detected, or their detectability. Some liars are more often detected than others; they are bad liars. The research shows that some people are much better liars than others; they are good at lying. (Not good liars!)[vii]

These researchers also identified another interesting differing trait among people—their natural appearance of credibility. This trait is the largest determinant of a judgment of deception. Some people appear truthful, regardless of whether or not they are telling the truth. Other people appear untruthful, regardless of whether or not they are lying. These researchers hypothesize that this difference is related to facial anatomy. “Some infants are anatomically gifted with an honest-looking face; others are facially disadvantaged. The gifted have baby faces and the disadvantaged look mature.” This trait carries forward throughout their lives.[viii]

The bottom line of this research is that we are all poor at detecting lies from observation of the declarant alone. Some people are inherently biased toward believing that people are telling the truth, while others are inherently biased toward believing that people are lying. Both are equally wrong. Some liars are better than others. The most important determinant of whether people believe someone is telling the truth is the speaker’s appearance of credibility. We believe people are telling the truth, regardless of whether they are, if they look like people who we believe tell the truth. We disbelieve people, regardless of whether they are lying, if they look like people who we believe are liars.



[i] Bond, Charles F., Jr., and Bella M. DePaulo, 2008, “Individual Differences in Judging Deception: Accuracy and Bias,” Psychological Bulletin, Vol. 134, No. 4, pp. 477-492.

[ii] Ibid. p. 477.

[iii] Ibid. p. 488.

[iv] Ibid. p. 483.

[v] Ibid. p. 478.

[vi] Ibid. p. 483.

[vii] Ibid. p. 484.

[viii] Ibid. p. 487.
