A couple of weeks ago, we taught another CLE course for the good folks at West Legal Ed Center, in our “Hope for Hopeless Cases” series. This one was on ways to defend cases where the government is going to use DNA evidence to prove your client’s guilt. (Here’s a link.)
DNA evidence can be just devastating. The science is good, after all. And to a lot of potential jurors (and judges and lawyers, unfortunately), “science” is another word for “magic.” Which is another word for “I don’t have to understand how it works, all I know is that it must be so.”
This can often be a wonderful thing, when the science is used correctly, and for the limited purposes to which it is suited. When used correctly, DNA evidence can free the innocent, and help ensure that we really are only punishing the guilty.
The problem is, DNA evidence is all too often used wrong.
And when that happens, the wrong people can get convicted.
And today we read a good article in the latest Washington Monthly called “DNA’s Dirty Little Secret: A forensic tool renowned for exonerating the innocent may actually be putting them in prison.” (Link here.)
It’s a good article, about the case of John Puckett, who was convicted in 2008 of a murder committed back in 1972. It was a brutal rape and murder, with about 20 suspects at the beginning, but the case went cold. Then in 2003 the police tested the DNA recovered from the evidence. The DNA was old and degraded, and it was also a mixture of multiple people’s DNA. The results were compared against California’s DNA database, and there was a possible match with Mr. Puckett. He hadn’t been a suspect in 1972, but based on this apparent match, and on nothing else, he was prosecuted and ultimately convicted. Jurors have since said that they convicted because of the statistical odds quoted to them at trial, and that if they had known the odds of a false positive (about one in three), they never would have trusted the government’s numbers.
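That one-in-three figure is worth unpacking. Here is a minimal sketch of the arithmetic, assuming a random-match probability of about 1 in 1.1 million and a database of roughly 338,000 profiles (figures widely reported in coverage of the case, used here purely as assumptions):

```python
def chance_of_coincidental_hit(match_prob: float, database_size: int) -> float:
    """Probability that at least one profile in the database matches
    by pure coincidence, treating each comparison as independent."""
    return 1 - (1 - match_prob) ** database_size

# Assumed figures for illustration: random-match probability of
# ~1 in 1.1 million, searched against ~338,000 database profiles.
p = chance_of_coincidental_hit(1 / 1_100_000, 338_000)
print(f"{p:.2f}")  # about 0.26 -- the same order as the one-in-three figure
```

The point isn’t the exact number; it’s that a per-comparison error rate that sounds astronomical stops being astronomical once you run hundreds of thousands of comparisons.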
The article highlights the fact that DNA evidence may be based on good science, but by the time it gets to a jury it can be seriously flawed. Contrary to popular belief, DNA evidence is not objective. It involves a huge amount of subjective interpretation and judgment calls. And whenever human beings have to interpret data and make judgment calls, there is a lot of room for reasonable doubt.
Contamination, of course, can be a huge issue. Cops screw up when they collect biological evidence, when they stick it in evidence bags, and when they ship it off to the lab. There are all kinds of opportunities in the real world for a suspect’s DNA to get mixed up with the evidence sample. We’ve worked on at least two cases in the last six months where that is exactly what seems to have happened.
But leaving aside contamination, there are all kinds of ways that experts can look at DNA evidence results and draw the wrong conclusion.
There are technical errors in the lab, for one thing. Sometimes they analyze the wrong evidence. Sometimes the machines doing the analysis aren’t working properly. Sometimes the lab doesn’t run control samples and negative controls to check whether the machines are working right and whether they’re giving false positives. They almost never do double-blind analysis. And often, analysts will manually adjust the results, adding or deleting data (!) when it doesn’t look right to them.
There are analytical errors all the time, too. The analyst is comparing two results: the evidence itself, and the exemplar from the suspect. They never match perfectly, not ever. So errors arise in comparing the two samples, and judgment calls must be made that leave room for reasonable disagreement (a/k/a “reasonable doubt”), especially when the evidence is a mixture of more than one person’s DNA. Two people can leave four peaks at each locus, and there are 6 different ways to interpret those four peaks. Three people in the mix jump that up to 15 different interpretations. At each locus. That’s a lot of room for judgment and doubt.
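Here’s one way to see where those counts could come from, on the assumption that the question being asked is “which two of the peaks at this locus belong to the contributor of interest?” It’s just choosing 2 peaks out of however many are present:

```python
from math import comb

def candidate_genotypes(num_peaks: int) -> int:
    """Number of ways to pick the two peaks at a locus that could be
    one contributor's genotype, out of num_peaks distinct peaks."""
    return comb(num_peaks, 2)

print(candidate_genotypes(4))   # 6  -- two contributors, four peaks
print(candidate_genotypes(6))   # 15 -- three contributors, six peaks

# The ambiguity compounds across loci: a 13-locus profile where each
# locus allows 6 readings has 6**13 (over 13 billion) combinations.
```

And that’s before accounting for shared alleles, homozygous contributors, or degraded peaks, all of which add still more possible readings.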
There’s another thing called the “observer effect,” which is rampant in labs serving law enforcement. The analysts are often told by the police exactly what they think the evidence will show, or why they need to prove your client did it. And the tests are never double-blind, so the analyst’s pre-existing ideas become the filter through which the evidence is interpreted. Even the plainest evidence that doesn’t match the preconceived notion just gets explained away. It happens all the time. It’s normal human behavior. That’s why double-blind testing exists in the first place.
Lots of things can be mistaken for other things. Machine errors can be read as actual data, and can hide other data. And often enough the test isn’t run a second time to see what the result would have been without that particular error. Same goes for degraded DNA samples and PCR errors.
Beyond analytical concerns, of which there are far more than we’ve mentioned here, there can be significant problems with the way the statistics are handled. The underlying stats, just like the underlying science, are perfectly valid. The problems come up when the stats are used incorrectly, to imply something they really don’t mean. The Puckett case involves the false-positive fallacy: the odds of a false positive can be very high, even for a test with an astonishing accuracy rate, once that test is run against hundreds of thousands of profiles. There’s also the birthday paradox, whereby the odds that some two people in a large real-world population match each other approach a coin toss, even when the odds of any particular pair matching are astronomical. And don’t forget the “prosecutor’s fallacy,” named for the unfortunate tendency to present the match statistic as if it were the odds that the defendant isn’t the right person. A good defense attorney can often show the jury that the numbers don’t necessarily mean what the prosecutor says they mean.
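The prosecutor’s fallacy in particular yields to simple arithmetic. A hedged sketch: if innocent people match with probability p and there are N plausible alternative sources, then a match leaves the defendant competing with roughly p times N expected coincidental matches, so the chance he’s really the source is nowhere near 1 minus p. The candidate pool of one million below is an assumption chosen purely for illustration:

```python
def prob_defendant_is_source(match_prob: float, candidate_pool: int) -> float:
    """P(defendant is the true source | DNA match), assuming the true
    source is one of candidate_pool equally likely people and each
    innocent person matches independently with probability match_prob."""
    expected_innocent_matches = match_prob * (candidate_pool - 1)
    return 1 / (1 + expected_innocent_matches)

# A 1-in-1.1-million match probability sounds like near-certain guilt,
# but with a candidate pool of a million people it implies only ~52%.
print(f"{prob_defendant_is_source(1 / 1_100_000, 1_000_000):.2f}")
```

In other words, the “1 in 1.1 million” statistic is the chance that a random innocent person matches; it is not the chance that a matching person is innocent, and conflating the two is exactly the fallacy.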
DNA evidence, at best, can only tell you whose DNA you’re probably looking at. It cannot tell you how it got there, what happened, or who did it. And even if the DNA probably is your client’s, that doesn’t mean he necessarily committed the crime. Just as with a fingerprint, there needs to be more, a lot more, to tie him to the commission of the actual crime. People forget this. And all the DNA witnesses can do is evaluate their data; they cannot evaluate the case itself.
So good lawyers shouldn’t let the DNA evidence become the case. It’s just a tool, like a fingerprint. Nothing more. It’s not magic, it’s not infallible. There’s plenty of room for error.
This can be room for reasonable doubt, but it can also be room for convicting the innocent. Hopefully, the more lawyers and judges learn about the downside of DNA evidence, the fewer wrongful convictions we’ll see.
But we’re not holding our breath in the meantime.