The False Promise of DNA Testing
http://www.theatlantic.com/magazine/archive/2016/06/a-reasonable-doubt/480747/
The subject of the segment, an investigative report aired by the Houston television station KHOU 11, was the Houston Police Department Crime Laboratory, among the largest public forensic centers in Texas. By one estimate, the lab handled DNA evidence from at least 500 cases a year—mostly rapes and murders, but occasionally burglaries and armed robberies. Acting on a tip from a whistle-blower, the station had obtained dozens of DNA profiles processed by the lab and sent them to independent experts for analysis. The results, William Thompson, an attorney and a criminology professor at the University of California at Irvine, told a KHOU 11 reporter, were terrifying: It appeared that Houston police technicians were routinely misinterpreting even the most basic samples.
Carol Batie watched the entire segment, rapt. As soon as it ended, she e-mailed KHOU 11. “My son is named Josiah Sutton,” she began, “and he has been falsely accused of a crime.” Four years earlier, Batie explained, Josiah, then 16, and his neighbor Gregory Adams, 19, had been arrested for the rape of a 41-year-old Houston woman, who told police that two young men had abducted her from the parking lot of her apartment complex and taken turns assaulting her as they drove around the city in her Ford Expedition.
A few days after reporting the crime, the woman spotted Sutton and Adams walking down a street in southwest Houston. She flagged down a passing patrol car and told the officers inside that she had seen her rapists. Police detained the boys and brought them to a nearby station for questioning. From the beginning, Sutton and Adams denied any involvement. They both had alibis, and neither of them matched the profile from the victim’s original account: She’d described her assailants as short and skinny. Adams was 5 foot 11 and 180 pounds. Sutton was three inches taller and 25 pounds heavier, the captain of his high-school football team.
The DNA evidence was harder to refute. Having seen enough prime-time TV to believe that a DNA test would vindicate them, Sutton and Adams had agreed, while in custody, to provide the police with blood samples. The blood had been sent to the Houston crime lab, where an analyst named Christy Kim extracted and amplified DNA from the samples until the distinct genetic markers that swim in every human cell were visible, on test strips, as a staggered line of blue dots.
Kim then compared those results with DNA obtained from the victim’s body and clothing and from a semen stain found in the back of the Expedition. A vaginal swab contained a complex mixture of genetic material from at least three contributors, including the victim herself. Kim had to determine whether Sutton’s or Adams’s genetic markers could be found anywhere in the pattern of dots. Her report, delivered to police and prosecutors, didn’t implicate Adams, but concluded that Sutton’s DNA was “consistent” with the mixture from the vaginal swab.
Sutton was convicted and sent to prison, and Batie was starting to think her son would never be freed. But the KHOU 11 segment, the first of a multipart investigative series on the Houston crime lab, encouraged her. Shortly after e-mailing the station, she received a call from David Raziq, a veteran television producer in charge of KHOU 11’s investigative unit. In the course of their work on the series, Raziq and his team had uncovered a couple of close calls with wrongful conviction—in one case, a man had been falsely accused, on the basis of improperly analyzed DNA evidence, of raping his stepdaughter. But in those instances, attorneys had managed to demonstrate the problems before their clients were sent to prison.
Batie hand-delivered the files from her son’s case to Raziq, who forwarded them on to William Thompson, the UC Irvine professor. Thompson had been studying forensic science for decades. He’d begun writing about DNA evidence from a critical perspective in the mid-1980s, as a doctoral candidate at Stanford, and had staked out what he describes as a “lonely” position as a forensic-DNA skeptic. “The technology had been accepted by the public as a silver bullet,” Thompson told me this winter. “I happened to believe that it wasn’t.”
Together with his wife, also an attorney, Thompson unpacked the two boxes containing the files from Sutton’s trial and spread them out across their kitchen table. His wife took the transcripts, and Thompson took the DNA tests. Almost immediately, he found an obvious error: In creating a DNA profile for the victim, Kim had typed three separate samples, two from blood and another from saliva. The resulting DNA profiles, which should have been identical, varied substantially. This alone was cause for serious concern—if the tech couldn’t be trusted to get a consistent DNA profile from a single person, how could she be expected to make sense of a complex mixture like the one from the vaginal swab?
“It was exculpatory evidence,” Thompson told me. “And the jury never heard it.”
KHOU 11 flew a reporter out to Irvine and taped a new interview with Thompson. Sutton’s case was taken up by Robert Wicoff, a defense attorney in Houston, who persuaded a Texas judge to have the DNA evidence reprocessed by a private testing facility. As Thompson had predicted, the results confirmed that Sutton was not a match. In the spring of 2003, more than four years after his arrest, Sutton was released from prison. His mother was waiting for him at the gates, her eyes bright with tears. “Going to prison, for me, was like seeing my death before it happens,” Sutton later told a local newspaper reporter.
In 2006, a cold hit in the FBI’s Combined DNA Index System, or CODIS, would lead police to Donnie Lamon Young, a convicted felon. Young confessed that in 1998, he and an accomplice had raped a Houston woman in her Ford Expedition. In January 2007, Young pleaded guilty to the crime.
Christy Kim was fired from the Houston crime lab, but reinstated after her lawyer argued that her errors—which ranged from how she had separated out the complex mixture to how she had reported the odds of a random match—were a product of systemic failures that included inadequate supervision. (Kim could not be reached for comment.) Sutton’s case became one of the central pillars of a public inquiry into practices at the lab. “The system failed at multiple points,” the head of the inquiry, Michael Bromwich, concluded.
Modern forensic science is in the midst of a great reckoning. Since a series of high-profile legal challenges in the 1990s increased scrutiny of forensic evidence, a range of long-standing crime-lab methods have been deflated or outright debunked. Bite-mark analysis—a kind of dental fingerprinting that dates back to the Salem witch trials—is now widely considered unreliable; the “uniqueness and reproducibility” of ballistics testing has been called into question by the National Research Council. In 2004, the FBI was forced to issue an apology after it incorrectly connected an Oregon attorney named Brandon Mayfield to that spring’s train bombings in Madrid, on the basis of a “100 percent” match to partial fingerprints found on plastic bags containing detonator devices. Last year, the bureau admitted that it had reviewed testimony by its microscopic-hair-comparison analysts and found errors in at least 90 percent of the cases. A thorough investigation is now under way.
DNA typing has long been held up as the exception to the rule—an infallible technique rooted in unassailable science. Unlike most other forensic techniques, developed or commissioned by police departments, this one arose from an academic discipline, and has been studied and validated by researchers around the world. The method was pioneered by a British geneticist named Alec Jeffreys, who stumbled onto it in the autumn of 1984, in the course of his research on genetic sequencing, and soon put it to use in the field, helping police crack a pair of previously unsolved murders in the British Midlands. That case, and Jeffreys’s invention, made front-page news around the globe. “It was said that Dr. Alec Jeffreys had done a disservice to crime writers the world over, whose stories often center around doubtful identity and uncertain parentage,” the former detective Joseph Wambaugh wrote in The Blooding, his book on the Midlands murders.
As Jay Aronson, a professor at Carnegie Mellon University, notes in Genetic Witness, his history of what came to be known as the “DNA wars,” the technology’s introduction to the American legal system was by no means smooth. Defense attorneys protested that DNA typing did not pass the Frye Test, a legal standard that requires scientific evidence to have earned widespread acceptance in its field; many prominent academics complained that testing firms were not being adequately transparent about their techniques. And in 1995, during the murder trial of O. J. Simpson, members of his so-called Dream Team famously used the specter of DNA-sample contamination—at the point of collection, and in the crime lab—to invalidate evidence linking Simpson to the crimes.
But gradually, testing standards improved. Crime labs pledged a new degree of thoroughness and discipline, with added training for their employees. Analysts got better at guarding against contamination. Extraction techniques were refined. The FBI created its CODIS database for storing DNA profiles of convicted criminals and arrestees, along with an accreditation process for contributing laboratories, in an attempt to standardize how samples were collected and stored. “There was a sense,” Aronson told me recently, “that the issues raised in the DNA wars had been satisfactorily addressed. And a lot of people were ready to move on.”
While helping to overturn wrongful convictions, DNA was also becoming more integral to establishing guilt. The number of state and local crime labs started to multiply, as did the number of cases involving DNA evidence. In 2000, the year after Sutton was convicted, the FBI’s database contained fewer than 500,000 DNA profiles, and had aided in some 1,600 criminal investigations in its first two years of existence. The database has since grown to include more than 15 million profiles, which contributed to tens of thousands of investigations last year alone.
As recognition of DNA’s revelatory power seeped into popular culture, courtroom experts started talking about a “CSI effect,” whereby juries, schooled by television police procedurals, needed only to hear those three magic letters—DNA—to arrive at a guilty verdict. In 2008, Donald E. Shelton, a felony trial judge in Michigan, published a study in which 1,027 randomly summoned jurors in the city of Ann Arbor were polled on what they expected prosecutors to present during a criminal trial. Three-quarters of the jurors said they expected DNA evidence in rape cases, and nearly half said they expected it in murder or attempted-murder cases; 22 percent said they expected DNA evidence in every criminal case. Shelton quotes one district attorney as saying, “They expect us to have the most advanced technology possible, and they expect it to look like it does on television.”
“You reached a point where the questions about collection and analysis and storage had largely stopped,” says Bicka Barlow, an attorney in San Francisco who has been handling cases involving DNA evidence for two decades. “DNA evidence was entrenched. And in a lot of situations, for a lot of lawyers, it was now too costly and time-intensive to fight.”
DNA analysis has risen above all other forensic techniques for good reason: “No [other] forensic method has been rigorously shown able to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source,” the National Research Council wrote in an influential 2009 report calling out inadequate methods and stating the need for stricter standards throughout the forensic sciences.
The problem, as a growing number of academics see it, is that science is only as reliable as the manner in which we use it—and in the case of DNA, the manner in which we use it is evolving rapidly. Consider the following hypothetical scenario: Detectives find a pool of blood on the floor of an apartment where a man has just been murdered. A technician, following proper anticontamination protocol, takes the blood to the local crime lab for processing. Blood-typing shows that the sample did not come from the victim; most likely, it belongs to the perpetrator. A day later, the detectives arrest a suspect. The suspect agrees to provide blood for testing. A pair of well-trained crime-lab analysts, double-checking each other’s work, establish a match between the two samples. The detectives can now place the suspect at the scene of the crime.
But today, most large labs have access to cutting-edge extraction kits capable of obtaining usable DNA from the smallest of samples, like so-called touch DNA (a smeared thumbprint on a window or a speck of spit invisible to the eye), and of identifying individual DNA profiles in complex mixtures, which include genetic material from multiple contributors, as was the case with the vaginal swab in the Sutton case.
These advances have greatly expanded the universe of forensic evidence. But they’ve also made the forensic analyst’s job more difficult. To understand how complex mixtures are analyzed—and how easily those analyses can go wrong—it may be helpful to recall a little bit of high-school biology: We share 99.9 percent of our genes with every other human on the planet. However, in specific locations along each strand of our DNA, the genetic code repeats itself in ways that vary from one individual to the next. Each of those variations, or alleles, is shared with a relatively small portion of the global population. The best way to determine whether a drop of blood belongs to a serial killer or to the president of the United States is to compare alleles at as many locations as possible.
Think of it this way: There are many thousands of paintings with blue backgrounds, but fewer with blue backgrounds and yellow flowers, and fewer still with blue backgrounds, yellow flowers, and a mounted knight in the foreground. When a forensic analyst compares alleles at 13 locations—the standard for most labs—the odds of two unrelated people matching at all of them are less than one in 1 billion.
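To put numbers on that intuition, here is a minimal sketch, in Python, of how a combined random-match probability is calculated: multiply, location by location, the chance that a random unrelated person would share the suspect’s genetic markers there. The per-locus figures below are invented for illustration, and the calculation assumes the locations are statistically independent; real casework draws frequencies from population databases and applies corrections for relatedness and population substructure.

```python
# Illustrative only: the per-locus frequencies below are invented, and the
# independence assumption is a simplification of real casework statistics.

# Hypothetical probability, at each of 13 locations, that a random
# unrelated person shares the suspect's genotype at that location.
genotype_frequencies = [0.08, 0.11, 0.05, 0.09, 0.12, 0.07, 0.10,
                        0.06, 0.08, 0.11, 0.09, 0.07, 0.10]

# Assuming the locations are independent, the combined random-match
# probability is the product of the per-locus frequencies.
random_match_probability = 1.0
for frequency in genotype_frequencies:
    random_match_probability *= frequency

print(f"Roughly 1 in {1 / random_match_probability:,.0f} unrelated people "
      "would match at all 13 locations")
```

With plausible made-up figures like these, the combined probability lands far below the one-in-a-billion threshold cited above, which is why a full single-source profile carries so much weight.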
A groundbreaking study by Itiel Dror, a cognitive neuroscientist at University College London, and Greg Hampikian, a biology and criminal-justice professor at Boise State University, illustrates exactly how subjective the reading of complex mixtures can be. In 2010, Dror and Hampikian obtained paperwork from a 2002 Georgia rape trial that hinged on DNA typing: The main evidence implicating the defendant was the accusation of a co-defendant who was testifying in exchange for a reduced sentence. Two forensic scientists had concluded that the defendant could not be excluded as a contributor to the mixture of sperm from inside the victim, meaning his DNA was a possible match; the defendant was found guilty.
Dror and Hampikian gave the DNA evidence to 17 lab technicians for examination, withholding context about the case to ensure unbiased results. All of the techs were experienced, with an average of nine years in the field. Dror and Hampikian asked them to determine whether the mixture included DNA from the defendant. Only one of the 17 concurred with the original finding that the defendant could not be excluded; the rest concluded either that the evidence excluded him or that it was inconclusive.
Last year, Erin Murphy, a law professor at New York University, published a book called Inside the Cell: The Dark Side of Forensic DNA, which recounts dozens of cases of DNA typing gone terribly wrong. Some veer close to farce, such as the 15-year hunt for the Phantom of Heilbronn, whose DNA had been found at more than 40 crime scenes in Europe in the 1990s and early 2000s. The DNA in question turned out to belong not to a serial killer, but to an Austrian factory worker who made testing swabs used by police throughout the region. And some are tragic, like the tale of Dwayne Jackson, an African American teenager who pleaded guilty to robbery in 2003 after being presented with damning DNA evidence, and was exonerated years later, in 2011, after a police department in Nevada admitted that its lab had accidentally swapped Jackson’s DNA with the real culprit’s.
Most troubling, Murphy details how quickly even a trace of DNA can now become the foundation of a case. In 2012, police in California arrested Lukis Anderson, a homeless man with a rap sheet of nonviolent crimes, on charges of murdering the millionaire Raveesh Kumra at his mansion in the foothills outside San Jose. The case against Anderson started when police matched biological matter found under Kumra’s fingernails to Anderson’s DNA in a database. Anderson was held in jail for five months before his lawyer was able to produce records showing that Anderson had been in detox at a local hospital at the time of the killing; it turned out that the same paramedics who responded to the distress call from Kumra’s mansion had treated Anderson earlier that night, and inadvertently transferred his DNA to the crime scene via an oxygen-monitoring device placed on Kumra’s hand.
Given how readily DNA transfers from one person or surface to another, the mere presence of DNA at a crime scene shouldn’t be enough for a prosecutor to obtain a conviction. Context is needed. What worries experts like Murphy is that advancements in DNA testing are enabling ever more emphasis on ever less substantial evidence. A new technique known as low-copy-number analysis can derive a full DNA profile from as little as 10 trillionths of a gram of genetic material, by copying DNA fragments into a sample large enough for testing. The technique not only carries a higher risk of sample contamination and of allele dropout, in which some of a contributor’s genetic markers fail to show up in the profile, but could also implicate someone who never came close to the crime scene. Given the growing reliance on the CODIS database—which allows police to use DNA samples to search for possible suspects, rather than just to verify the involvement of existing suspects—the need to consider exculpatory evidence is greater than ever.
Indeed, some analysts are incentivized to produce inculpatory forensic evidence: A recent study in the journal Criminal Justice Ethics notes that in North Carolina, state and local law-enforcement agencies operating crime labs are compensated $600 for DNA analysis that results in a conviction.
“I don’t think it’s unreasonable to point out that DNA evidence is being used in a system that’s had horrible problems with evidentiary reliability,” Murphy, who worked for several years as a public defender, told me. No dependable estimates exist for how many people have been falsely accused or imprisoned on the basis of faulty DNA evidence. But in Inside the Cell, she hints at the stakes: “The same broken criminal-justice system that created mass incarceration,” she writes, “and that has processed millions through its machinery without catching even egregious instances of wrongful conviction, now has a new and powerful weapon in its arsenal.”
The growing potential for mistakes in DNA testing has inspired a solution fitting for the digital age: automation, or the “complete removal of the human being from doing any subjective decision making,” as Mark Perlin, the CEO of the DNA-testing firm Cybergenetics, put it to me recently.
Perlin grew interested in DNA-typing techniques in the 1990s, while working as a researcher on genome technology at Carnegie Mellon, and spent some time reviewing recent papers on forensic usage. He was “really disappointed” by what he found, he told me: Faced with complex DNA mixtures, analysts too frequently arrived at flawed conclusions. An experienced coder, he set about designing software that could take some of the guesswork out of DNA profiling. It could also process results much faster. In 1996, Perlin waved goodbye to his post at Carnegie Mellon, and together with his wife, Ria David, and a small cadre of employees, focused on developing a program they dubbed TrueAllele.
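What a program like TrueAllele is doing, at its core, is weighing two competing explanations of the evidence and reporting the ratio between them. The sketch below, in Python, works through that idea for a single location in a hypothetical two-person mixture. The allele frequencies and observed alleles are invented, the model ignores peak heights, dropout, drop-in, and uncertainty about the number of contributors, and it is not Perlin’s algorithm; it is only the bare likelihood-ratio logic that probabilistic genotyping builds on.

```python
from itertools import combinations_with_replacement

# Toy likelihood-ratio calculation for one location of a two-person DNA
# mixture. Frequencies and observations are hypothetical; real
# probabilistic-genotyping software models peak heights, degradation,
# dropout, and many locations at once, none of which appear here.

allele_frequencies = {"A": 0.10, "B": 0.15, "C": 0.20, "D": 0.55}
observed_alleles = {"A", "B", "C"}   # alleles detected in the mixture
suspect_genotype = ("A", "B")        # suspect's two alleles at this location

all_genotypes = list(combinations_with_replacement(allele_frequencies, 2))

def genotype_probability(genotype):
    """Hardy-Weinberg probability of an unordered pair of alleles."""
    a, b = genotype
    if a == b:
        return allele_frequencies[a] ** 2
    return 2 * allele_frequencies[a] * allele_frequencies[b]

def probability_of_mixture(known_contributor=None):
    """Probability that two contributors produce exactly the observed
    alleles, with one contributor optionally fixed to a known genotype
    and the rest drawn at random from the population."""
    total = 0.0
    first_choices = [known_contributor] if known_contributor else all_genotypes
    for first in first_choices:
        for second in all_genotypes:
            if set(first) | set(second) == observed_alleles:
                probability = genotype_probability(second)
                if known_contributor is None:
                    probability *= genotype_probability(first)
                total += probability
    return total

# Hypothesis 1: the suspect plus one unknown person explain the mixture.
# Hypothesis 2: two unknown people explain the mixture.
likelihood_ratio = (probability_of_mixture(suspect_genotype)
                    / probability_of_mixture())
print(f"The evidence is about {likelihood_ratio:.1f} times more likely "
      "if the suspect is a contributor than if he is not")
```

On these made-up numbers the ratio comes out to roughly nine to one, a reminder that a match to a complex mixture can carry far less weight than a clean single-source match.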
When I visited Cybergenetics headquarters, half a dozen analysts and coders sat hunched over computer screens. The office was windowless and devoid of any kind of decoration, save for a whiteboard laced with equations—the vibe was more bootstrapped start-up than CSI. “I think visitors are surprised not to see bubbling vials and lab equipment,” Perlin acknowledged. “But that’s not us.”
He led me down the hallway and into a storage room. Row upon row of Cybergenetics-branded Apple desktop computers lined the shelves: ready-made TrueAllele kits. Perlin could not tell me exactly how many software units he sells each year, but he allowed that TrueAllele had been purchased by crime labs in Oman, Australia, and 11 U.S. states; last year, Cybergenetics hired its first full-time salesman.
Four years ago, in one of its more high-profile tests to date, the software was used to connect an extremely small trace of DNA at a murder scene in Schenectady, New York, to the killer, an acquaintance of the victim. A similarly reliable match, Perlin told me, would have been very difficult to obtain by more analog means.
His critics have a darker view. William Thompson points out that Perlin has declined to make public the algorithm that drives the program. “You do have a black-box situation happening here,” Thompson told me. “The data go in, and out comes the solution, and we’re not fully informed of what happened in between.”
Last year, at a murder trial in Pennsylvania where TrueAllele evidence had been introduced, defense attorneys demanded that Perlin turn over the source code for his software, noting that “without it, [the defendant] will be unable to determine if TrueAllele does what Dr. Perlin claims it does.” The judge denied the request.
But TrueAllele is just one of a number of “probabilistic genotyping” programs developed in recent years—and as the technology has become more prominent, so too have concerns that it could be replicating the problems it aims to solve. The Legal Aid Society of New York recently challenged a comparable software program, the Forensic Statistical Tool, which was developed in-house by the city’s Office of the Chief Medical Examiner. The FST had been used to test evidence in hundreds of cases in the state, including an attempted-murder charge against a client of Jessica Goldthwaite, a Legal Aid attorney.
Goldthwaite knew little about DNA typing, but one of her colleagues at the time, Susan Friedman, had earned a master’s degree in biomedical science; another, Clinton Hughes, had been involved in several DNA cases. The three attorneys decided to educate themselves about the technology, and questioned half a dozen scientists. The responses were emphatic: “One population geneticist we consulted said what the [medical examiner] had made public about the FST read more like an ad than a scientific paper,” Hughes told me. Another called it a “random number generator.”
At the hearing, bolstered by a range of expert testimony, Goldthwaite and her colleagues argued that the FST, far from being established science, was an unknown quantity. (The medical examiner’s office refused to provide Legal Aid with the details of its code; in the end, the team was compelled to reverse-engineer the algorithm to show its flaws.)
Judge Mark Dwyer agreed. “Judges are, far and away, not the people best qualified to explain science,” he began his decision. Still, he added, efforts to legitimize the methods “must continue, if they are to persuade.” The FST evidence was ruled inadmissible.
Dwyer’s ruling did not have the weight of precedent: Other courts are free to accept evidence analyzed by probabilistic software—more and more of which is likely to enter the courtroom in the coming years—as they see fit. Still, Goldthwaite told me, the fact that one judge had been willing to question the new science suggested that others might too, and she and her team continue to file legal challenges.
When I interviewed Perlin at Cybergenetics headquarters, I raised the matter of transparency. He was visibly annoyed. He noted that he’d published detailed papers on the theory behind TrueAllele, and filed patent applications, too: “We have disclosed not the trade secrets of the source code or the engineering details, but the basic math.”
To Perlin, much of the criticism is a case of sour grapes. “In any new development in forensic science, there’s been incredible resistance to the idea that you’re going to rely on a validated machine to give you an accurate answer instead of relying on yourself and your expertise,” he told me.
In 2012, shortly after Legal Aid filed its challenge to the FST, two developers in the Netherlands, Hinda Haned and Jeroen de Jong, released LRmix Studio, free and open-source DNA-profiling software—the code is publicly available for other users to explore and improve.
Murphy has argued that if probabilistic DNA typing is to be widely accepted by the legal community—and she believes that one day it should be—it will need to move in this direction: toward transparency.
“The problem with all DNA profiling is that there isn’t skepticism,” she told me. “There isn’t the necessary pressure. Is there increasing recognition of the shortcomings of old-school technology? Absolutely. Is there trepidation about the newer technology? Yes. But just because we’re moving forward doesn’t mean mistakes aren’t still being made.”
On April 3, 2014, the City of Houston shut down its old crime lab and transferred all DNA-testing operations to a new entity known as the Houston Forensic Science Center. Unlike its predecessor, which was overseen by the police department, the Forensic Science Center is intended to be an autonomous organization, with a firewall between it and other branches of law enforcement. “I think it’s important for the forensic side to have that independence, so we can narrow it down without worrying about which side is going to benefit or profit from it, just narrowing it down to what we think is the accurate information,” Daniel Garner, the center’s head, told a local reporter.
And yet Houston has been hard-pressed to leave its troubled history with forensic DNA behind. In June 2014, the Houston Chronicle reported that a former analyst at the old crime lab, Peter Lentz, had resigned after a Houston Police Department internal investigation found evidence of misconduct, including improper procedure, lying, and tampering with an official record. A representative from the county district attorney’s office told the Chronicle that her office was looking into all of the nearly 200 cases—including 51 murder cases—that Lentz had worked on during his time at the lab. (A grand jury declined to indict Lentz for any wrongdoing; he could not be reached for comment.)
“It’s almost 20 years later, and we’re still dealing with the repercussions,” Josiah Sutton’s mother, Carol Batie, told me earlier this year. “They say things are getting better, and maybe they are, but I always respond that it wasn’t fast enough to save Josiah.”
Before entering prison, Batie said, Sutton had been a promising football player, with a college career ahead of him. After his exoneration, he seemed stuck in a state of suspended animation. He was angry and resentful of authority. He drifted from job to job. He received an initial lump-sum payment from the city, as compensation for his wrongful conviction and the time he spent in prison, along with a much smaller monthly payout. But the lump-sum payment quickly vanished. He fathered five kids with five different women.
Batie called the city to ask about counseling for her son, but was told no such service was available. “I did my best to put myself in his shoes,” Batie said. “I was annoyed, but I knew he felt like the world was against him. Everyone had always given up on him. I couldn’t give up on him too.”
Last summer, Sutton was arrested for allegedly assaulting an acquaintance of his then-girlfriend. He spent the better part of a year in lockup before posting bail and is now awaiting trial. (Sutton denies the charges.) Batie believes that her son’s problems are a direct result of his incarceration in 1999. “He had his childhood stolen from him,” she told me. “No prom, no dating, no high-school graduation. Nothing. And he never recovered.”
I wondered whether Batie blamed DNA. She laughed. “Oh, no, honey,” she said. “DNA is science. You can’t blame DNA. You can only blame the people who used it wrong.”