Here’s a sure-fire method for producing a “successful” program: measure your successes, and ignore your failures. Works every time. What’s astonishing is how easy it is to get some academic to write it up, how willing the newspapers are to report the resulting “study” as if it contained actual information, and how many politicians will then cite your “success” as scientifically documented fact.
The latest incarnation of this particular confidence trick is from Chuck Colson’s Prison Fellowship, which, thanks to the generosity of then-Governor George W. Bush, runs its own prison (for born-again Christians only) in Texas, and now has similar programs in Iowa, Minnesota, and Kansas. A report from the University of Pennsylvania’s Center for Research on Religion and Urban Civil Society found that graduates of the program (called InnerChange) were only half as likely as matched controls to return to prison. Or so we are told in a press release from the Prison Fellowship. The White House gave Colson a nice photo-op with Bush, and Ari Fleischer said “This is an initiative that the President believes very deeply in to help reduce recidivism in our federal prisons and prisons everywhere.” Religion News Service picked up the report, and of course the editorial page of the Wall Street Journal is also enthusiastic, taking its obligatory swipe at “liberals” who want to keep God from rescuing sinners.
Here’s the way the study worked. The researchers took a group of 177 prisoners who entered the InnerChange program, then selected the records of a group of other inmates who met the selection criteria but didn’t enter. The comparison group was selected to match the program entrants on race, age, offense type, and something called the “salient factor score” (SFS), a standard measure of recidivism risk. Then the post-release criminal behavior of the graduates of the InnerChange program was compared to that of the matched controls.
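The matching step described above can be sketched in a few lines. This is an illustrative sketch only: the field names (`race`, `age_band`, `offense_type`, `sfs`) are assumptions based on the criteria listed, not the study’s actual data layout.

```python
def match_control(entrant, pool):
    """Pick the first unused inmate record from `pool` that matches the
    entrant on race, age band, offense type, and salient factor score (SFS).
    Returns None if no match remains."""
    keys = ("race", "age_band", "offense_type", "sfs")
    for candidate in pool:
        if all(candidate[k] == entrant[k] for k in keys):
            pool.remove(candidate)  # each control can be used at most once
            return candidate
    return None

# Hypothetical records, purely for illustration:
pool = [
    {"race": "B", "age_band": "25-34", "offense_type": "drug", "sfs": 4},
    {"race": "W", "age_band": "18-24", "offense_type": "property", "sfs": 6},
]
entrant = {"race": "W", "age_band": "18-24", "offense_type": "property", "sfs": 6}
control = match_control(entrant, pool)  # the second record; pool shrinks by one
```

Note that matching like this can only balance the groups on *observable* traits; as the next paragraphs explain, that is exactly where the trouble starts.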
Veeeeeeerrrrrrrrryyyyyyy zzzzzzzientifick, nicht wahr?
But completely bogus. Not only were the entrants to the program a self-selected group, which means that in some important ways (such as a desire to change their lives) they weren’t actually matched to the comparison group, but it was only the graduates — 75 of the 177 entrants — who showed better behavior than the pseudo-control group. Comparing all of the entrants (including those who dropped out, were kicked out, or got early parole) to all of the comparison group, the difference in recidivism reverses: the InnerChange group was slightly more likely to be rearrested (36.2% versus 35%) and noticeably more likely to actually go back to prison (24.3% versus 20.3%).
In other words, those who succeed, succeed, while those who fail are likely to fail. Whodathunkit?
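The arithmetic of that selection effect is easy to demonstrate. Here is a minimal simulation (all numbers invented, nothing to do with the actual study) of a program with *zero* effect: each inmate has some underlying motivation that raises both the chance of finishing the program and the chance of staying out of prison. Compare graduates to a matched group and the program looks great; compare all entrants and the effect vanishes.

```python
import random

random.seed(0)  # deterministic run for reproducibility

def simulate(n=10_000):
    """Null-effect program: 'motivation' drives both completion and
    staying out of prison; the program itself changes nothing."""
    entrants, controls = [], []
    for _ in range(n):
        motivation = random.random()
        completes = random.random() < motivation   # motivated inmates finish
        recidivates = random.random() > motivation # motivated inmates stay out
        entrants.append((completes, recidivates))
    for _ in range(n):
        motivation = random.random()
        controls.append(random.random() > motivation)

    rate = lambda xs: sum(xs) / len(xs)
    graduates = [r for done, r in entrants if done]
    all_entrants = [r for _, r in entrants]
    return rate(graduates), rate(all_entrants), rate(controls)

grad_rate, entrant_rate, control_rate = simulate()
# grad_rate comes out well below control_rate, even though the
# program does nothing; entrant_rate and control_rate are nearly equal.
```

Graduates-only comparisons manufacture exactly this kind of phantom success.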
Don’t get the impression that the Prison Fellowship is unusual in hyping its numbers this way. Most of the drug treatment literature (the stuff the people wearing the “Treatment Works” buttons keep shoving at you) works the same way, as a National Academy study of a couple of years ago rather rudely pointed out.
The self-selection problem is a really hard one for social scientists to get around: as an ethical matter, you can’t randomly assign people to receive different treatments without getting their informed consent in advance. Anyway, it’s quite plausible that even a good program will only work for the people who want it, so there’s no point in randomly assigning people who just aren’t interested.
If there are more volunteers for a given treatment than there are program slots, then you can invite people to volunteer and tell them up front that there will be a lottery to get in. But sometimes that won’t work, and you just have to match on the observables, hope the resulting distortion isn’t too great, and tell your readers to be cautious in interpreting your results.
But there’s no excuse for cherry-picking by comparing those who make it through a program with a group matched to all of those entering the program. That’s just cheating. The only legitimate way to analyze the data is to keep everyone selected for the program in the study, regardless of how long they stay in the program. (This approach is called “intention-to-treat” analysis, a carry-over from its roots in medical-outcomes research.)
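The intention-to-treat rule can be stated in a few lines of code: everyone assigned to the program stays in the denominator, whatever happened afterward. A minimal sketch, with hypothetical field names and toy data:

```python
def recidivism_rates(records):
    """records: dicts with 'assigned', 'completed', 'reincarcerated' flags.
    Returns (intention_to_treat_rate, completers_only_rate)."""
    assigned = [r for r in records if r["assigned"]]
    completers = [r for r in assigned if r["completed"]]
    rate = lambda grp: sum(r["reincarcerated"] for r in grp) / len(grp)
    return rate(assigned), rate(completers)

# Toy data: four assigned inmates; both dropouts went back to prison.
toy = [
    {"assigned": True, "completed": True,  "reincarcerated": False},
    {"assigned": True, "completed": True,  "reincarcerated": False},
    {"assigned": True, "completed": False, "reincarcerated": True},
    {"assigned": True, "completed": False, "reincarcerated": True},
]
itt, completers_only = recidivism_rates(toy)  # 0.5 vs 0.0
```

Same inmates, same outcomes: count everyone assigned and the rate is 50%; count only the completers and the program looks flawless.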
Nor is there any excuse for reporters regurgitating this pap without checking with the people who know better. (Finding someone who hates the program on ideological grounds to describe the findings as “junk science,” as the Religion News Service did, doesn’t count.)
“So how,” I hear you ask, “does anyone get away with this shell game?” The answer is the same as for any sort of bamboozlement: it only works on people who, at some level, want what you’re trying to convince them of to be true. As Machiavelli didn’t quite say (but one of his translators said in his name), “men are so simple, and so driven by their needs, that whoever wishes to deceive will find another who wishes to be deceived” (The Prince, Chapter XVIII).
Manuel, the Cabellian anti-hero, put it more succinctly, in the Latin proverb he borrowed as the motto of his house: Mundus vult decipi.
Update: Just for a change, I have a constructive suggestion for something you can do about it.