Judgment is the criminal lawyer’s stock-in-trade. The ability to assess the risks of a situation, and choose the better course of action, is the value that lawyers bring to the criminal justice system. It doesn’t matter if they’re defense attorneys negotiating a deal or fighting it out at trial, or if they’re prosecutors deciding whether and what to charge — their value is their judgment. The better the judgment, the better the lawyer.
It’s therefore critical that criminal lawyers have some understanding of how and why people take risks. In advising a client inclined to take a bad risk, the lawyer can’t really change the client’s perception of that risk without knowing what’s causing it. And such an understanding also helps one spot one’s own inclinations to error before it’s too late.
This is not common sense. (In fact, common sense is usually the enemy here.) It’s insight. The ability to see how people act, and realize — aha! — why.
Fortunately for the rest of us, there are amazingly smart people out there who do that all day. When you find one with real insights about why people take the risks they do, you’re probably gonna want to listen.
That’s why we’re taking a moment to point you to Danny Kahneman.
Who is Danny Kahneman, you ask. You’re not alone. If you’re not an economist, you can be forgiven for not knowing he won the Nobel Prize for basically inventing the field of Behavioral Economics. If you’re not a psychologist, you can be forgiven for not knowing he’s considered “one of the most influential psychologists in history, and certainly the most important psychologist alive today.” If you’re not a foreign-policy wonk, you can be forgiven for not knowing of his significant ideas on the evaluation of risks in wartime. He’s one of the most insightful and relevant people nobody’s ever heard of.
As it happens, a lot of his insights are directly relevant to the practice of criminal law. Trying to decide the likely outcome of that trial? You’re probably sizing up the particulars of your case and comparing them to your own past experience. Maybe you’re applying some anecdotal generalizations about this courthouse, based on things you’ve heard over the years. And you could be way off base — your assessment is more likely to be accurate if you ignore the particulars and look instead at the base rate: the statistical distribution of real-life results in cases like yours.
Oh, your courthouse doesn’t keep those kinds of statistics? Then at least understand that, no matter how realistic you think you’re being, cognitive biases not unlike wishful thinking are throwing you off base without your realizing it.
Avoiding such cognitive biases is one of the main reasons for hiring a lawyer in the first place — why we say someone who represents himself has a fool for a client. The lawyer is an agent, and so should be less emotionally invested in the outcome and better able to assess things accurately. And experienced lawyers at least have some data beyond the particulars of this one case on which to base their judgment. So ask yourself to what extent your desire to win, or your emotional connection to the client or the issue, may be clouding your judgment. Better yet, just assume that it is, and adjust your outlook accordingly.
That’s just one example, loosely based on his ideas on reference-class forecasting and the planning fallacy. You should also check out his Prospect Theory. Maybe you’re a defense attorney, trying to help a client decide whether to take a plea or go to trial. You’re convinced that the odds of success at trial are great, and your client agrees, but he still wants to accept a fairly harsh plea. Or maybe it’s the other way around — your client wants to roll the dice with a jury, even though the case is a slam dunk and there’s a sweet offer on the table. Either way, your client isn’t being rational, and you can’t figure out why. If you don’t do something, he’s going to screw himself.
Well, Prospect Theory shows that people aren’t rational. When choosing between options framed as “gains,” they’re more risk-averse than they ought to be. When options are framed as “losses,” they’re prone to take more risks. And losses loom larger than gains: a year added to a sentence weighs far more heavily on a client than a year whittled off, though both consist objectively of the same 365 days.
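That asymmetry can be made concrete with Kahneman and Tversky’s value function. Here’s a minimal sketch, using the median parameters they estimated in their 1992 paper (α ≈ 0.88 for diminishing sensitivity, λ ≈ 2.25 for loss aversion); applying those parameters to sentencing years is this writer’s illustration, not anything from the research itself:

```python
# Prospect Theory value function (Tversky & Kahneman, 1992), with their
# median estimated parameters. Outcomes are measured relative to a
# reference point, where 0 is the status quo.
ALPHA = 0.88   # diminishing sensitivity to the size of gains/losses
LAMBDA = 2.25  # loss aversion: losses loom about 2.25x larger than gains

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

# A year shaved off the sentence vs. a year added on:
print(value(1))    # → 1.0   (subjective value of gaining a year)
print(value(-1))   # → -2.25 (losing a year feels 2.25x as big)
```

Same 365 days either way, but the added year is felt more than twice as strongly as the subtracted one — which is exactly the lopsided weighing the client in the example is doing.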
Your client’s looking at 10 years after trial, but you figure a 70% chance of acquittal? Rationally, going to trial carries an expected sentence of 30% × 10 = 3 years. If the offer is 7, the rational decision is to go to trial. But your client doesn’t see it that way. His starting reference point is that 7-year offer. He just sees the extra 3 years if he loses at trial, and they far outweigh the 4 years he saves, in expectation, by going to trial — and they even outweigh the 7 years he saves if, as is probable, he wins.
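The arithmetic here is a plain expected-value comparison. A quick sketch with the hypothetical numbers above (10-year exposure, 70% chance of acquittal, 7-year offer):

```python
# Expected-value comparison of trial vs. plea, using the hypothetical
# numbers from the example above.
def expected_sentence(years_if_convicted, p_acquittal):
    """Expected years served if the case goes to trial."""
    return (1 - p_acquittal) * years_if_convicted

trial = expected_sentence(10, 0.70)  # 30% chance of 10 years
offer = 7

print(f"Expected sentence at trial: {trial:.1f} years")  # 3.0
print(f"Plea offer: {offer} years")
print("Rational choice:", "trial" if trial < offer else "plea")
```

The rational actor compares 3 expected years against 7 certain ones; the client anchored to the offer compares 0 against a possible 10.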
The other client’s looking at 10 years after a trial he is almost certain to lose. The offer is 6 years. But this client’s reference point is not the offer; it’s walking free. He’s not comparing a gain and a loss, but two losses. The rational actor would take the 6 at once. But the irrational, normal human being is going to say “fuck it, let’s roll the dice,” even though that’s probably going to slam him with 10.
What do you do about each one? Maybe couch the discussion in different terms. Get the client to at least look from a different reference point, one that helps his perception more closely match (what you believe to be) objectivity. Make the more objectively reasonable outcome stand out (we tend to choose whatever is most prominent; think of lineups). At least give the client a chance to reach a different conclusion, if you’re convinced he’s being irrational.
Prospect Theory also helps explain why a prosecutor will continue on with a case, well after he should have folded, because he’s already invested so much time and resources into it. It’s like the gambler who keeps throwing good money after bad, and equally irrational. It only makes the loss worse. The trick is to persuade the gambler that sunk costs are indeed sunk, and that it makes sense to just move on. It’s easier, actually, with a gambler who’s risking his own money (though it’s still damn hard to do). A prosecutor whose only risk is a loss at trial (after all, the budget isn’t coming out of his pocket) isn’t really risking much if he keeps going.
By the way, the fact that people are not rational actors has been solidly proven by experiment and empirical data. It’s not “theory” as in a mere supposition, but “Theory” with a capital “T,” which means “as close to God’s own truth as is humanly possible.”
This has HUGE implications for criminal law, beyond the tactics and strategy of a particular case. Our entire jurisprudence is built on the presumption that people, in general, are rational actors.
Our sentencing — both its severity and its modern history of reform — is intended to be rational. Severity, especially for nonviolent crimes, is intended to deter people, to make them think twice before succumbing to temptation. The US Sentencing Guidelines and similar restraints on discretion are intended to have the same effect by making outcomes predictable.
Meanwhile, our jury instructions are presumed to be applied rationally and correctly. Police are expected to objectively try to catch the right person. Judges are expected to apply the law objectively.
What really happens? Nobody is deterred by the severity or certainty of a specific sentence. If people are deterred at all (and usually they aren’t), it is by the risk of punishment in general — the actual potential sentence is irrelevant — or by social disapprobation. No offender sits down beforehand and says to himself, even roughly, “hmm, if I steal this million dollars, there’s a 20% chance that I’ll be caught and sentenced to 10 years. That’s two years for a million dollars. Is two years of my life worth a million dollars?”
Similarly, jurors often try to follow what portion of the instructions they remember, but they usually wind up voting for the outcome they believe to be “right,” rather than the one dictated by the cold, unfeeling law. Emotion plays a large role in this ostensibly rational undertaking. Guilty people walk even when they were actually proven guilty, because the jury just didn’t want to convict (usually explaining it by saying “we thought he did it, but we just needed more” — ignoring that they were already convinced beyond a reasonable doubt). People who should have walked go to jail because jurors threw the prosecution a bone, letting it win what they thought was a throwaway count. People go to jail because the jurors are angered by the crime and want to punish someone, and the defendant happens to be the one sitting in that chair.
Police who should rationally focus on catching the right guy get all emotionally invested in the one guy they happen to be targeting. Confirmation bias rears its ugly head, where evidence of innocence is disregarded because it doesn’t fit the “truth,” and facts whose meaning is open to interpretation are only seen as confirming that “truth.” This is not a conscious, purposeful attempt to screw over an innocent person; the police who do this (and the prosecutors who keep it going afterwards) sincerely believe it. They just focus more and more on the innocent guy, the actual offender remains free to strike again, and the Innocence Project gets another case.
Our society keeps making strides towards greater justice, and the past decade has seen many. Perhaps the time is ripe for our jurisprudence to start recognizing that irrational behavior is the norm, and adapt our procedures to take that into account. Start to do so, anyway. Meanwhile, we lawyers and judges acting within the system would do well to acknowledge that which science now deems well-settled.
Well, that’s a digression and a half. Getting back to the main topic, go read yourself some Kahneman. There are articles by and about him all over; just Google his name. A good place to start is the series of excerpts from a Master Class he taught a few years back called “A Short Course in Thinking about Thinking.” Then go check out some of the scholarly journals, if you’re so inclined. (Applying some of my own confirmation bias, I’ll recommend Richard McAdams & Thomas Ulen’s 2008 piece at Chicago Law School, “Behavioral Criminal Law and Economics,” which makes some of the same points I just made above, and is therefore True with a capital T.)