Now, a Wisconsin judge has ordered the influential Static-99R instrument excluded from a sexually violent predator (SVP) trial, on the grounds that failure to release the data violates a respondent's legal right to due process.
The ruling may be the first time that the Static-99R has been excluded altogether from court. At least one prior court, in New Hampshire, barred an experimental method that is currently popular among government evaluators, in which Static-99R risk estimates are artificially inflated by comparing sex offenders to a specially selected "high-risk" sub-group, a procedure that has not been empirically validated in any published research.
In the Wisconsin case, the state was seeking to civilly commit Homer Perren Jr. as a sexually dangerous predator after he completed a 10-year prison term for an attempted sexual assault on a child age 16 or under. The exclusion of the Static-99R ultimately did not help Perren. This week, after a 1.5-day trial, a jury deliberated for only one hour before deciding that he met the criteria for indefinite civil commitment at the Sand Ridge Secure Treatment Center.*
Dec. 18 note: After publishing this post, I learned that the judge admitted other "actuarial" risk assessment instruments, including the original Static-99 and the MnSOST-R, which is far less accurate than the Static-99R and vastly overpredicts risk. He excluded the RRASOR, a four-item ancestor of the Static-99. In hindsight, the defense's success in getting the Static-99R excluded was a bit like cutting off one's nose to spite one's face.
The ruling by La Crosse County Judge Elliott Levine came after David Thornton, one of the developers of the Static-99R and a government witness in the case, failed to turn over data requested as part of a Daubert challenge by the defense. Under the U.S. Supreme Court's 1993 ruling in Daubert v. Merrell Dow Pharmaceuticals, judges are charged with the gatekeeper function of filtering evidence for scientific reliability and validity prior to its admission in court.
Defense attorney Anthony Rios began seeking the data a year ago so that his own expert, psychologist Richard Wollert, could directly compare the predictive accuracy of the Static-99R with that of a competing instrument, the "Multisample Age-Stratified Table of Sexual Recidivism Rates," or MATS-1. Wollert developed the MATS-1 in an effort to improve the accuracy of risk estimation by more precisely considering the effects of advancing age. It incorporates recidivism data on 3,425 offenders published by Static-99R developer Karl Hanson in 2006, and uses the statistical method of Bayes's Theorem to calculate likelihood ratios for recidivism at different levels of risk.
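For readers unfamiliar with the statistics, the general technique described here (updating a base rate with a likelihood ratio via Bayes's Theorem in odds form) can be sketched in a few lines. This is a minimal illustration of the method only; the numbers are invented for the example and are not the MATS-1's published base rates or likelihood ratios.

```python
# Sketch of likelihood-ratio risk updating using Bayes's Theorem in
# odds form. All figures below are hypothetical, chosen only to show
# the arithmetic -- they are NOT values from the MATS-1 or Static-99R.

def update_risk(base_rate: float, likelihood_ratio: float) -> float:
    """Combine a group base rate with a likelihood ratio:
    posterior odds = prior odds * likelihood ratio,
    then convert the posterior odds back to a probability."""
    prior_odds = base_rate / (1.0 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical example: a 10% age-stratified base rate of recidivism
# combined with a likelihood ratio of 2.0 for a given risk level
# yields a posterior probability of about 18%.
print(round(update_risk(0.10, 2.0), 3))
```

The appeal of the odds form is that age-specific base rates and risk-level likelihood ratios can be estimated separately and then combined in a single multiplication, which is consistent with the age-stratified approach described above.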
The state's attorney objected to the disclosure request, calling the data "a trade secret."
Hanson, the Canadian psychologist who heads the Static-99 enterprise, has steadfastly rebuffed repeated requests to release data on which the family of instruments is based. Public Safety Canada, his agency, takes the position that it will not release data on which research is still being conducted, and that "external experts can review the data set only to verify substantive claims (i.e., verify fraud), not to conduct new analyses," according to a document filed in the case.
Thornton estimated that the raw data would remain proprietary for another five years, until the research group finishes its current projects and releases the data to the public domain.
While declining to release the data to the defense, Hanson agreed to release it to Thornton, the government's expert and a co-developer of the original Static-99, so that Thornton could analyze the relative accuracy of the two instruments.
The American Psychological Association's Ethics Code requires psychologists to furnish data, after their research results are published, to "other competent professionals who seek to verify the substantive claims through reanalysis" (Section 8.14).
Since the Static-99 family of instruments (which includes the Static-99, Static-99R, and Static-2002) began to be developed more than a decade ago, they have been in a near-constant state of flux, with risk estimates and instructions for interpretation subject to frequent and dizzying changes.
The timing of this latest brouhaha is apropos, as reports of bias, inaccuracy and outright fraud have shaken the psychological sciences this year and led to more urgent calls for transparency and sharing of data by researchers. Earlier this year, a large-scale project was launched to systematically try to replicate studies published in three prominent psychological journals.
A special issue of Perspectives on Psychological Science dedicated to the problem of research bias in psychology is available online for free (HERE).
*Hat tip to blog reader David Thompson for alerting me that the trial had concluded.