Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy

In today’s world, computerized algorithms are everywhere: They can decide whether you get a job interview, how much credit you access, and what news you see. And, increasingly, it’s not just private companies that use algorithms. The government, too, is turning to proprietary algorithms to make profound decisions about your life, from what level of health benefits you receive to whether or not you get charged with a crime.

This isn’t necessarily good or bad. At their core, “algorithms” are just instructions, like a recipe or user manual, that use raw inputs to determine outcomes in all kinds of decision making. But it becomes a serious problem when the government keeps those algorithms — including the source code that executes the programs and the raw data that constitutes their inputs — secret from the public.

And that’s exactly what is happening in criminal trials around the country.

Take, for example, the case of Billy Ray Johnson, who was sentenced to life in prison without parole for a series of burglaries and sexual assaults he says he did not commit, largely based on the results of a proprietary algorithm called TrueAllele. TrueAllele claims to identify the perpetrator of a crime from a tiny, degraded DNA sample swimming in a larger soup of multiple individuals’ DNA. It is an experimental technology, quite different from the DNA-matching tests that courts have relied on for the past two decades (and even those established tests have serious flaws). At Mr. Johnson’s trial, the court denied the defense team access to TrueAllele’s source code — information crucial to the defense case — all because the company that owns it cried, “Trade secret!”

As we explained in an amicus brief we filed in the case on Wednesday, this is unconstitutional in a number of ways. Our Constitution gives a defendant the right to confront the witnesses against him, and it provides him with the right to a fundamentally fair trial that includes a meaningful opportunity to present a complete defense. It also gives the public a right of access to criminal proceedings, including evidence, so that we can serve as a check upon the judicial process.

Access to the source code of algorithms used in the criminal justice system is critical to ensure fairness and justice. Algorithms are human constructs that are subject to human bias and mistake, which can plague them throughout their design and use. For example, at the building stage, something as simple as a misplaced ampersand can have profound implications. A coding error in another DNA algorithm was recently found to have produced incorrect results in 60 criminal cases in Australia, altering its reported statistics by a factor of 10 and forcing prosecutors to replace 24 expert statements.
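To make the ampersand point concrete, here is a minimal, purely hypothetical Python sketch (it is not TrueAllele's code, which remains secret) showing how a single misplaced character can silently flip a program's conclusion:

```python
# Purely hypothetical illustration; not code from TrueAllele or any real DNA program.
# The programmer intends: report a match only if the sample passed quality control
# AND the number of matching loci exceeds the threshold.

def intended_call(passed_qc: bool, matching_loci: int, threshold: int) -> bool:
    # Correct logic: logical "and" combined with a comparison.
    return passed_qc and matching_loci > threshold

def buggy_call(passed_qc: bool, matching_loci: int, threshold: int) -> bool:
    # One-character slip: "&" is bitwise AND and binds more tightly than ">",
    # so this evaluates (passed_qc & matching_loci) > threshold instead.
    return passed_qc & matching_loci > threshold

print(intended_call(True, 12, 8))  # True: QC passed and 12 loci exceed the threshold
print(buggy_call(True, 12, 8))     # False: (1 & 12) is 0, and 0 is not greater than 8
```

An expert with access to the source code could spot a slip like that; a defendant who sees only the program's output never will.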

Beyond random mistakes, people hold cognitive biases that can materially affect the variables they include in an algorithm, as well as how they interpret the results. Racial bias also often creeps into algorithms, both because the underlying data reflects existing racial disparities and because inaccurate results for smaller minority groups may be hidden in overall results.
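That last point is easy to see with a small back-of-the-envelope calculation. The numbers below are invented purely for illustration and do not come from any real system; they simply show how a reassuring overall accuracy figure can coexist with much worse performance for a smaller group:

```python
# Invented numbers, for illustration only: how an aggregate accuracy figure
# can hide much worse performance for a smaller group.

groups = {
    # group name: (test cases, cases the hypothetical algorithm got right)
    "larger group": (9_000, 8_820),    # 98% accurate
    "smaller group": (1_000, 700),     # 70% accurate
}

total_cases = sum(cases for cases, _ in groups.values())
total_correct = sum(correct for _, correct in groups.values())

print(f"Overall accuracy: {total_correct / total_cases:.1%}")   # 95.2% -- looks fine
for name, (cases, correct) in groups.items():
    print(f"{name}: {correct / cases:.1%}")                      # 98.0% vs. 70.0%
```

Unless the group-level numbers are examined by someone other than the vendor, the much higher error rate for the smaller group never surfaces.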

And, of course, there’s the possibility that financial incentives will pervert the goals of companies that build these algorithms. In the context of DNA typing, the prosecution, backed by the substantial resources of the state, is a company’s most likely customer — and that customer is likely to be most satisfied with an algorithm that delivers a match. So companies may build programs to skew toward matches over the truth.

In Mr. Johnson’s case, the trial court decided to ignore these potential pitfalls — and, more significantly, the defendant’s constitutional rights — ruling in favor of TrueAllele’s argument for secrecy. This is legally wrong and has troubling practical implications. Research shows that juries put too much trust in uncontested algorithms. Prosecutors and their expert witnesses present these results as infallible truth, making claims that go “far beyond what the relevant science can justify.” And juries, when given no other option, generally do not question them.

But the results need to be questioned, and this case demonstrates why.

TrueAllele’s parent company, Cybergenetics, and a government lab that bought the algorithm to run in-house got wildly different results — both from themselves on different test runs and from each other overall. Indeed, TrueAllele’s creator testified that he expected the government’s results, generated by running the same data through the same program, to be “within two zeros,” or a factor of 100, of his results. Yet even though he expected a discrepancy of that size, he was able to offer his results as unquestioned evidence, all while the defense was given no meaningful opportunity to challenge his testimony.

Access to similar DNA algorithms has revealed serious errors in them. Much like the example from Australia, a recent case in New York revealed that another DNA algorithm “dropped valuable data from its calculations, in ways . . . that could unpredictably affect the likelihood assigned to the defendant’s DNA being in the mixture.” This was only discovered after the trial court correctly ordered that the algorithmic source code be disclosed to the defense, prompting the prosecution to withdraw the evidence. Yet courts continue to admit the results of other DNA algorithms, like TrueAllele, without disclosure to the defense or the public.  

This isn’t the first time we’ve been down this road with technology in criminal courts. There is a long history of junk science being used under the guise of technological advance. Public access to such evidence was a prerequisite to establishing its invalidity.

In the 1990s, “a series of high-profile legal challenges” and “increased scrutiny of forensic evidence” caused various long-standing methods — from bite-mark analysis to ballistics testing, and from fingerprinting to microscopic hair comparison — to be “deflated or outright debunked.” Similarly, after a New Yorker article exposed a flawed case based on arson science, the state responsible not only “reconsider[ed] old cases that had been improperly handled by the original investigators,” but also “reinvented itself as a leader in arson science and investigation.”

Scientific errors in the criminal justice system are a serious problem. But the examples above also reveal the power of adversarial testing and public scrutiny to correct those errors and create better science.

We hope the California appellate court agrees with us and orders disclosure of the algorithmic source code. An adversarial testing process is crucial to ensure that Mr. Johnson’s constitutional rights are enforced.

Comments

DNA

People should know they leave DNA everywhere they go. Every time you leave your house you shed skin cells and drop hair fragments. Each strand of hair with its root attached carries your complete DNA.

So if you go to a place where a crime is later committed and your hair is found, you will be questioned. At that point you better have a good alibi.

I agree DNA puts a person in a place, but it absolutely does not reveal when the person was there. Cops and prosecutors who don't understand this are being DNA vigilantes.

I guess people need to put a 24-hour "private" camera on themselves with date and time stamps and a newspaper or something whenever they are with fewer than two other people. Otherwise the cops will arrest you for whatever they want and use your DNA to "prove" you were at the crime scene, which is true but not at the time of the crime. But a dumbass, uneducated jury selected by the prosecutor will sentence you to jail or death.

The whole system is flawed and fuked!


Anonymous

Look at the pre-trial detention algorithm (some call it the pre-trial release algorithm). The ACLU is okay with it. Face it: well-intentioned but unconstitutional tech is in courtrooms because incompetent lawyers are everywhere and look the other way if the tech meets their political-correctness standard. I predict - oops - read this - https://www.psychologytoday.com/blog/mental-illness-metaphor/201709/pred... Lawyers are the biggest problem and the sole reason the criminal justice system is irretrievably broken.

Anonymous Bioin...

Having worked in the field for years, I can say that most companies keep their software hidden to disguise how poor their work actually is (or how simple what they're doing is). Cherry-picking is the norm, false and exaggerated claims are rampant, and those writing much of the software are almost never experts in both biology and computer science and are very often experts in neither.

Additionally, any software or algorithm that is used in court must be available for peer review. Otherwise, there can be no assessment of the validity of its results. That is the standard in academia.

Even in academia, wrong results and flawed execution cause widespread error, including, for example, reference genomes for bacterial species that consist mostly of cow DNA because the scientists failed to separate the bacterium's DNA from that of its host.

The truth of the matter for most companies with proprietary software is precisely what one would expect: there is no special magic, there is no man behind the curtain, and the "special algorithm" (algorithm honestly being a misnomer in this case) consists whole cloth of the Emperor's New Clothes: nothing.

Anonymous

Speaking as someone who knows quite a bit about these specific systems (I have had many interactions with two of them) and about the field of DNA analysis (I've been doing it for going on 20 years): these things don't do anything an expert can't. They aren't "new technologies" or even "different DNA technologies."

They're just stat calculators being sold as something new, necessary, and better. The truth is that they aren't. In some rare instances they might help tease a little more information out of a specific profile, but in the grand scheme of things they don't do anything scientists can't do for the most part. Complex, inconclusive, or insufficient profiles and mixtures are just that: too complex or insufficient to make a conclusion about (s#!+ in, s#!+ out), and definitive profiles are easy calls. These programs use a probabilistic method to assign the probability that a donor to the profile has a specific genotype deconvoluted from it. That doesn't make crappy evidence any better, and it doesn't make low-level, inconclusive profiles conclusive.

But you're right, Anonymous Bioin...: it's definitely a matter of either the solutions being a lot simpler to get at than they're making it sound (to land big government contracts) or them just covering up shoddy coding (I've seen this before too). There is no reason to claim "trade secret" when you own patents, unless you're trying to pull the wool over more than just the lawyers' eyes. They're trying to pull it over the eyes of us scientists too. The problem is that as long as there is someone who is really smart, and sounds really smart, calling the rest of us stupid and trying to sell something as better than a government employee, the government is going to bite. It's just that, sometimes, that will come back to bite them.
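To give a sense of what that "probabilistic method" boils down to at its simplest, here is a toy likelihood-ratio calculation in Python. The allele frequencies and genotypes are made up, and this leaves out everything that makes real mixture interpretation hard (peak heights, dropout, drop-in, multiple contributors); it is only meant to show that the output is a statistic computed from evidence an analyst already has, not a new kind of evidence:

```python
# Toy example only: a likelihood ratio for a clean, single-source profile,
# multiplied across a few loci. The allele frequencies are invented, and real
# probabilistic genotyping software models mixtures, peak heights, dropout,
# and so on. The point is just that the output is a statistic, nothing more.

loci = [
    # (locus name, allele A frequency, allele B frequency, heterozygous?)
    ("locus_1", 0.12, 0.08, True),
    ("locus_2", 0.20, 0.20, False),   # homozygous: the same allele twice
    ("locus_3", 0.05, 0.31, True),
]

likelihood_ratio = 1.0
for name, p, q, heterozygous in loci:
    # Probability that a random, unrelated person has this genotype
    # (Hardy-Weinberg: 2pq if heterozygous, p squared if homozygous).
    genotype_prob = 2 * p * q if heterozygous else p * p
    likelihood_ratio *= 1.0 / genotype_prob
    print(f"{name}: random-match probability {genotype_prob:.4f}")

print(f"Combined likelihood ratio: about {likelihood_ratio:,.0f} to 1")
```

The entire fight over source code is about whether anyone outside the company gets to check how numbers like that are actually being produced.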

Amit Lakhani

I think there should be a call for exclusively open-source algorithms in criminal proceedings. This would allow both sides to have the code in its entirety, leaving no question about the underlying methodology. Private, for-profit algorithms might be useful, but they suffer from the myriad problems already described.

