How should judges evaluate lawyers’ alleged mishandling of forensic science evidence when the challenge is brought years after the trial? One recent United States Supreme Court decision grapples with this question; this article contextualizes the holding, analyzes its weaknesses, and suggests factors for judges to weigh.
In criminal cases, the importance of science (and of understanding the limits of science) cannot be gainsaid. The statistics are clear: in a review of homicide cases in Cleveland, Ohio, the clearance rate was higher for cases with probative forensic results, either matches or exclusions (63.1%), than for cases without such evidence (56.3%), and the average sentence imposed was higher in the former category.
Yet there is a confounding problem: the consumers of forensic evidence have little or no scientific training, either at the college level or ‘on the job.’ Perhaps 5% of lawyers (and judges) have studied science, a figure presented in research papers and confirmed repeatedly by polling attendees at legal education conferences. And the consequences are severe.
The scientific illiteracy of lawyers was highlighted in the National Academy of Sciences’ 2009 report Strengthening Forensic Science in the United States: A Path Forward, which concluded that “[t]he judicial system is encumbered by, among other things, judges and lawyers who generally lack the scientific expertise necessary to comprehend and evaluate forensic evidence in an informed manner…”
This was brought home directly in a recent judgment overturning a Pennsylvania capital conviction based on DNA evidence. As the prosecution argued at trial:
“That hat that was left at the scene in the middle of the street has [the defendant’s] sweat on it and [the victim’s] blood on it. DNA is a witness. It is a silent, unflappable witness.”
There was only one problem: the blood was on the victim’s hat, not the hat of the accused. Prosecutors misread the report; defense lawyers never caught the error. And this was not the only science-related error made at trial. The prosecutor argued to the jury that because the defendant was the major contributor of the DNA on the hat, he must have been the person who wore it most recently, a claim contrary to science. The defense made no objection.
So with this background, one would hope that, when confronted with a claim that counsel erred by not researching the science being used against a client, the United States Supreme Court would emphasize that the lawyer’s first duty is to learn: ask questions, conduct basic research, and consult an expert before trial. But the exceptional deference paid to lawyers’ judgment calls, especially when viewed through a retrospective prism that emphasizes finality, dominated the Court’s analysis.
The case is Maryland v. Kulbicki, 136 S. Ct. 2 (2015) (per curiam), and its story is worth telling. Kulbicki, a police officer, had an extramarital affair; the woman gave birth to a child, and Kulbicki became the subject of a paternity and support action.
In 1993 the woman was murdered, shot in the head. Two days later, when Kulbicki’s home and vehicles were searched, a bullet fragment was found in his truck. Metallurgical analysis showed that the fragment had the same elemental composition as a bullet fragment recovered from the victim’s brain. The technique is called Comparative Bullet-Lead Analysis (CBLA). So far, so good.
The problem came when FBI analyst Peele testified at the 1995 trial about the significance of this finding.
Peele testified that the bullet fragments from the victim’s brain and Kulbicki’s truck exhibited “the same amounts of each and every element… detected,” and were thus “analytically indistinguishable.” Peele added that the results were “what you’d expect if you were examining two pieces of the same bullet, they are that close, two pieces of the same source.”
Kulbicki v. State, 53 A.3d 361, 368 (Md. Ct. Spec. App. 2012).
Time marched on, and so did science: CBLA evidence was discredited in the decade after trial. The flaws in CBLA were substantial:
[T]he problem with CBLA is…that it is unreliable to conclude that a CBLA “match” supports further specific factual assertions put forth at trial. Most often, these assertions are that matching bullets came from the same box, the same manufacturer, were related in time or geography, or generally linked the defendant to the crime in some unspecified manner. Crucially, these conclusions rested on assumptions unsupported by scientific and statistical testing of the general bullet manufacturing process… [T]here was no generally reliable evidence that a CBLA match corresponded to a match among any other type of “source,” such as a specific manufacturer, box, time, location, etc. Thus, it remained in many cases a distinct possibility that while bullets from the same “source” match each other, they also match bullets from any number of “sources.”
Kulbicki v. State, 53 A.3d at 377.
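To make the quoted flaw concrete, here is a minimal, hypothetical sketch (in Python) of how a CBLA-style “match” might be declared. The seven-element list reflects the trace elements the FBI compared in bullet lead, but the interval-overlap rule, the concentration ranges, and the measurement uncertainties below are invented for illustration; this is not the Bureau’s actual protocol.

```python
# Toy sketch of a CBLA-style comparison. The seven trace elements mirror
# those the FBI measured; everything else (concentration ranges, measurement
# uncertainty, the interval-overlap rule) is hypothetical.
import random

ELEMENTS = ["As", "Sb", "Sn", "Cu", "Bi", "Ag", "Cd"]
SIGMA = {e: 5.0 for e in ELEMENTS}  # assumed measurement uncertainty (ppm)

def indistinguishable(a, b, k=2.0):
    """'Analytically indistinguishable': every element's k-sigma intervals overlap."""
    return all(abs(a[e] - b[e]) <= 2 * k * SIGMA[e] for e in ELEMENTS)

def make_melt():
    """One 'source': a melt of lead with its own trace-element profile (ppm)."""
    return {e: random.uniform(80.0, 120.0) for e in ELEMENTS}

def sample_fragment(melt):
    """A measured fragment from a melt, with measurement noise."""
    return {e: random.gauss(melt[e], SIGMA[e]) for e in ELEMENTS}

random.seed(1)
melts = [make_melt() for _ in range(500)]    # 500 independent sources
crime_scene = sample_fragment(melts[0])      # fragment from the victim
suspect = sample_fragment(melts[0])          # fragment from the suspect's truck

# Fragments from the same melt should almost always "match" ...
print("same-source match:", indistinguishable(crime_scene, suspect))

# ... but how many fragments from *unrelated* melts also pass the test?
false_matches = sum(
    indistinguishable(crime_scene, sample_fragment(m)) for m in melts[1:]
)
print(f"unrelated melts that also 'match': {false_matches} of {len(melts) - 1}")
```

With these toy numbers, a substantial share of unrelated melts can also produce fragments that pass the match test, which is precisely the import of Peele’s own 1991 finding, discussed below, that bullets manufactured many months apart can be compositionally identical.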
Attacking his conviction more than a decade after CBLA evidence was shown to be unreliable, Kulbicki faced an uphill battle, as lawyers need not be prescient and predict future developments. His lawyer was to be judged by what reasonable lawyers would have done in 1995.
And that is where the United States Supreme Court gave Kulbicki’s lawyer, and all lawyers handling cases involving forensic science, an unwarranted pass. The Maryland Court of Appeals, then the state’s highest court, decided that a new trial was appropriate because the lawyer never looked for, found, or read a 1991 report by Peele that contradicted, or at least called into question, Peele’s fundamental claim at trial: that each batch of metal melted to create bullets is compositionally unique. In that 1991 report, Peele had found bullets manufactured 15 months apart to have identical compositions.
The Maryland high court granted relief based on a simple articulation of the duty of a lawyer: lawyers are effective only when they make decisions based upon “adequate investigation[,]” and the “failure to investigate the forensic evidence is not what a competent lawyer would do.”
And what did the United States Supreme Court say in response? In a per curiam decision issued without oral argument, the Court first misstated the Maryland court’s holding as being that “Kulbicki’s defense attorneys were constitutionally required to predict the demise of CBLA.” In this mischaracterization, and in the remainder of the opinion, it ignored the duty of lawyers to ask questions and educate themselves.
According to the Court, “counsel did not perform deficiently by dedicating their time and focus to elements of the defense that did not involve poking methodological holes in a then-uncontroversial mode of ballistics analysis.” The Court concluded that because the trial predated internet search engines, the Peele report would not have been easily located.
Restated, this seems to say that ‘if it looks good, don’t bother to read up on it. And don’t ask for the expert’s curriculum vitae, which would lead to the expert’s own research.’
The message of “you don’t have to ask questions if the science looks good at the time of trial” has already been echoed by one court. Pulido v. Grounds, 2015 U.S. Dist. LEXIS 141313, *56-57 (E.D. Cal. Oct. 13, 2015). And that notion defies a basic precept of science, that “we accumulate scientific knowledge like clockwork, with the result that facts are overturned at regular intervals in our quest to better understand the world.” Samuel Arbesman, The Half-Life of Facts. In a world where, over the past decade, we have seen the impact of flawed or erroneous forensic testimony, the lesson from the Court should be simple and clear: lawyers’ first duty in a case with scientific evidence is to ask questions and learn the limitations of the science.
Post-Kulbicki, how does a judge respond to new post-conviction challenges that look back and say “the lawyer didn’t investigate the science”? If the jurisdiction has more stringent standards under its own constitution or statutes, then Kulbicki is of no moment. But where the jurisdiction applies federal law, a searching inquiry is still warranted. The Kulbicki Court emphasized that the trial was pre-internet (describing the case as arising in “an era of card catalogues, not a worldwide web”), when research was not as easy; and, although unmentioned, in 1995 we did not have the history of forensic errors and the awareness of the limitations of many forensic disciplines that we have now (and have had since 2009, if not earlier).
So even after Kulbicki, as time and knowledge evolve, it may be that even under what that Court called the “rule of contemporary assessment of counsel’s conduct,” i.e., judging reasonableness as of the time of counsel’s conduct, there are good grounds to apply the standard the Maryland court did: that “failure to investigate the forensic evidence is not what a competent lawyer would do.”
This blog post originally appeared on The Judicial Edge, The National Judicial College’s monthly email newsletter.