Over the summer, the Texas Forensic Science Commission, which sets standards for physical evidence in state courts, came to an unsettling conclusion: There was something wrong with how state labs were analyzing DNA evidence.
It seemed the labs were using an outdated protocol for calculating the probability of DNA matches in “mixtures”; that is, crime scene samples that contain genetic material from several people. The error may have affected thousands of cases going back to 1999.
At first, commission officials assumed the update wouldn’t make a big difference, just a refinement of the numbers.
But when a state lab reran the analysis of a DNA match from a murder case about to go to trial in Galveston, Texas, it discovered the numbers changed quite a bit.
Under the old protocol, says defense lawyer Roberto Torres, DNA from the crime scene was matched to his client with a certainty of more than a million to one. That is, you’d have to go through more than a million people to find somebody else who’d match the sample. But when the lab did the analysis again with the new protocol, things looked very different.
“When they retested it, the likelihood that it could be someone else was, I think, one in 30-something, one in 40. So it was a significant probability that it could be someone else,” Torres says.
The change didn’t affect the outcome of that case because there was other evidence against his client, but officials in Texas have just begun the process of correcting the mistake.
“We have to go back and identify which of those cases involved DNA mixtures where the lab may have given incorrect results,” says Jack Roady, the district attorney in Galveston. “It’s going to be a herculean task, but we’re gonna do it.”
Roady has been cooperating with the Texas Forensic Science Commission on fixing the problem, and he recalls the scene in September when he described the situation to a meeting of his fellow prosecutors.
“There was sometimes moments of collective gasps,” he says. “The fact that this science may not have been done correctly in the past gives us great pause.”
It’s unsettling to find out DNA analysis can vary like this because it threatens to undermine the deep faith people have placed in the technology.
“And it’s not faith they should not have had to begin with,” says Keith Inman, who teaches forensic science at California State University, East Bay.
Inman has worked with DNA evidence since the 1980s. He says forensic DNA-matching is based on sound science, but sometimes labs can get ahead of themselves. What happened in Texas, he says, is that labs have been using cutting-edge “testing kits” that can extract tiny traces of DNA from crime scenes, but those samples were then analyzed with math that’s not suited to “weak” samples that combine DNA from many people.
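Inman’s point about math that’s “not suited” to weak mixtures can be sketched numerically. The older approach he alludes to, often called the combined probability of inclusion (CPI), counts everyone whose alleles all appear somewhere in the mixture; the more stray, low-level alleles a weak sample shows, the more of the population is “included” and the weaker the statistic gets. The Python sketch below uses entirely made-up allele frequencies and a simplified three-locus profile, purely to show the direction of the effect:

```python
# Toy illustration of the combined probability of inclusion (CPI),
# the older mixture statistic the article alludes to.
# All allele frequencies below are hypothetical, not real population data.

def locus_cpi(allele_freqs):
    """Probability a random person carries only alleles seen at this locus.

    Under standard Hardy-Weinberg assumptions, that is the square of the
    summed frequencies of the observed alleles.
    """
    return sum(allele_freqs) ** 2

def combined_cpi(loci):
    """Multiply per-locus inclusion probabilities across independent loci."""
    cpi = 1.0
    for freqs in loci:
        cpi *= locus_cpi(freqs)
    return cpi

# A clean single-source profile: two alleles observed per locus.
clean = [[0.10, 0.12], [0.08, 0.15], [0.11, 0.09]]

# A weak mixture: extra low-level alleles show up at each locus,
# so far more of the population is "included."
mixture = [[0.10, 0.12, 0.20, 0.18],
           [0.08, 0.15, 0.22],
           [0.11, 0.09, 0.25, 0.14]]

print(f"clean sample: 1 in {1 / combined_cpi(clean):,.0f}")
print(f"weak mixture: 1 in {1 / combined_cpi(mixture):,.0f}")
```

With these invented numbers, a few extra alleles per locus collapse the statistic by a couple of orders of magnitude, which is the same direction of change Torres describes in the Galveston case. Real casework statistics depend on validated frequency databases and on how analysts handle possible allele dropout, which is exactly where the older protocol ran into trouble.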
He says the problem isn’t limited to Texas. He says the newest, best analysis method — called “probabilistic genotyping” — takes time to roll out, and that’s put labs in a quandary.
“There’s this interim time that cases are coming up and the analyst has to do something with it, and they know by definition that there is a better approach,” Inman says.
Meanwhile, the justice system’s hunger for DNA evidence just keeps growing. There are now police departments that have made swabbing for DNA part of their routine.
“We collect DNA evidence daily,” says Jim Ferraris, deputy chief in Salem, Ore. His department has taken advantage of quicker testing provided by the state lab, and he says every officer in town is now trained to collect DNA. They even swab stolen cars and burgled homes.
“The doorjamb area, the point of entry — we’d swab that area,” says Ferraris. “Let’s say the dresser got rifled, we’d look on the handles. DNA has been folded into the fabric of what we do every day here.”
All that swabbing has paid off. They’ve found DNA links between crimes, and they’ve found suspects. Ferraris believes the DNA-swabbing has led to a decrease in property crimes.
At the same time, these “touch samples” can be very challenging for the labs. When you take a sample from a doorjamb, the sample may include DNA from several people, in roughly equal proportions, all mixed together.
When labs face ambiguous samples like that, results can vary. A lab using one method may find a match, while another lab, using a more conservative analysis, may judge the same sample to be inconclusive.
In the world of scientific research, this would be seen as normal; scientific doubt is considered part of the process. But NYU law professor Erin Murphy says in the world of courts and lawyers, those doubts aren’t always understood.
“We have this tendency in criminal justice to look for solutions to the most vexing problem, which is, ‘How do we find dangerous people and stop them?’ And when we look for those solutions, we like for them to be perfect,” Murphy says.
Murphy has written a new book about this called Inside the Cell: The Dark Side of Forensic DNA. She says people should understand that DNA analysis can involve “subjectivity,” and she worries about the tendency of juries to look past doubt.
“[Juries] are even willing to go a step further and say, ‘We’ll convict on the basis of DNA, even in the face of evidence — non-DNA evidence — that this isn’t the perpetrator,’ ” Murphy says.
She has a pet theory about this. She says the public has seen other kinds of evidence discredited over the years, such as bite mark analysis and hair analysis. Even eyewitness testimony is now seen as unreliable. So we look to DNA.
“I feel like we’re clinging to a life raft of DNA and saying, ‘Well this one will save us!’ ” Murphy says. “But I think that both wrongly inflates DNA’s capabilities, and it also is just overlooking the part of the story of these old techniques, where they were used in our system for decades without challenge.”
If we now push this technology too hard — say, by prosecuting someone on nothing more than a genetic match — then DNA may also be headed for the kind of reassessment that has battered the reputation of older types of physical evidence.