Commonwealth v. Arrington
What's at Stake
In this amicus brief, the ACLU and its coalition partners urged robust application of the legal standard governing the admissibility of expert testimony and technical evidence, especially in cases involving opaque or proprietary algorithms.
This case involves investigators’ attempt to introduce into evidence algorithmically generated frequent location history from an accused person’s iPhone through the proffered expert testimony of a crime analyst who explicitly disavowed knowledge of how the algorithm worked. At the time of the investigation, iPhones calculated and saved information about the locations a device visited more than once. The phone compiled several distinct types of location data (GPS readings, information about Wi-Fi and Bluetooth connections, and data gleaned from the iPhone’s contact with nearby cell towers), ran that data through a proprietary algorithm, and then displayed so-called “frequent location history” points on a map as circles of varying sizes, sometimes with diameters spanning hundreds of meters. The analyst attempted to establish the reliability of the frequent location history by testifying about his own tests on a different model of iPhone running a different version of Apple’s operating system: after he visited a location several times with that device, the phone appeared to generate frequent location history reflecting that it had been in that general vicinity.
The trial court excluded the analyst’s testimony on the grounds that he lacked the specialized knowledge required to qualify as an expert witness and had failed to adequately substantiate the reliability of the frequent location history evidence. The government appealed to Massachusetts’ high court, the Supreme Judicial Court.
In an amicus brief in the case, the ACLU, along with the ACLU of Massachusetts, the National Association of Criminal Defense Lawyers, the Massachusetts Association of Criminal Defense Lawyers, and the Electronic Frontier Foundation, urged affirmance of the trial court’s order. The brief explains the importance of strictly enforcing this legal standard by highlighting examples of other proprietary or black-box algorithms used in the criminal legal system, such as probabilistic genotyping, face recognition technology, risk assessment tools, and dubious gunshot-detection systems, and by surfacing critiques of those algorithms’ reliability. The brief explains that when the government attempts to introduce evidence from such flawed and inscrutable algorithms, it may do so only through the testimony of a proper expert who has sufficient access to, and knowledge about, the algorithm, and who can credibly speak to its reliability. As the brief explains, any weakening of the standard for expert testimony threatens to allow unreliable evidence from black-box or proprietary algorithms to flood criminal trials.