Re: the Hopkins study of Iraqi deaths published in the Lancet:
1. No, I don’t know how much the overall Iraqi mortality rate has gone up since the invasion. Neither does anyone else. That suggests the need for less shouting and more measuring and calculating. (For a pretty good example of serious discussion, see the comment thread on this Crooked Timber post, and then compare it with the stuff coming out of Red Blogistan.)
2. Jane Galt is entirely right to say that when a measurement or calculation generates an unbelievable result, it’s often wise not to believe it. That’s just Bayes’s Rule in action. What time is it when the clock strikes thirteen? Time to get the clock fixed.
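That Bayesian logic can be made concrete. Here is a minimal Python sketch; the prior and likelihood numbers are invented for illustration, not taken from any study:

```python
def bayes_update(prior, likelihood):
    """Posterior probabilities via Bayes's Rule: P(H|E) is proportional
    to P(E|H) * P(H), renormalized over the hypotheses."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Invented numbers: faulty clocks are fairly rare, but a clock striking
# thirteen is far likelier to come from a broken clock than a working one.
prior = {"clock_ok": 0.95, "clock_broken": 0.05}
likelihood = {"clock_ok": 0.02, "clock_broken": 0.90}  # P(strikes 13 | H)
posterior = bayes_update(prior, likelihood)  # clock_broken: ~0.70
```

A mildly surprising observation, against a strong prior, shifts belief toward "the measurement is off" rather than "the world is strange."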
3. But one also shouldn’t cling too strongly to prior beliefs when those beliefs aren’t strongly founded. That’s also Bayes’s Rule in action. And making the most careful estimate possible under the circumstances isn’t at all the same thing as “a wild-assed guess.” If lower estimates made with other methods cast doubt on the Lancet figure, then by the same token the Lancet figure casts doubt on them. “This result has a large error band around it, and may be biased upward” is not the same as “This figure is worthless and should be ignored.”
4. Refusing to believe something because you’d feel terrible if it were true is not a good statistical method. The rage of the hawks against the authors of the study certainly stems in part from an unwillingness to contemplate the possibility that their pet adventure has cost something more than half a million lives. And their universal lack of expressed interest in finding out the true number tells very heavily against their sincerity.
5. The argument that 600,000 people couldn’t have died without there being more news stories depends on a claim about the accuracy of the newsgathering process in Iraq that doesn’t seem to be supported by evidence. Most of the killing seems to be Iraqi-on-Iraqi. More than half are by gunfire. How do we know that the incidence of individual homicide and small-scale massacre isn’t that high? If ten people were killed in each of a hundred villages one day, what reason is there to think that the newspapers would report a thousand-casualty day?
6. Fewer than a third of the excess deaths were from Coalition action. The rest are Iraqi-on-Iraqi. So comparisons with German civilian casualties in WWII are pointless.
7. Yes, the survey projected 600,000 excess deaths based on 547 actually reported deaths. That’s what “sampling” means, doofus. Every four years, pollsters in the U.S. project the results of voting by 100,000,000 people based on samples of 1000 or so, and get within a few percentage points.
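The pollsters' arithmetic is easy to check with the standard formula for a sample proportion. A sketch in Python, assuming a simple random sample (the Lancet team used cluster sampling, which widens the interval, but the principle is the same):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion:
    z * sqrt(p(1-p)/n). Note it depends on the sample size n,
    not on the size of the population being sampled."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 voters out of 100,000,000 is accurate to ~3 points.
moe = margin_of_error(p=0.5, n=1000)  # ~0.031
```

Quadrupling the sample only halves the margin of error, which is why 1,000 respondents is usually enough.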
8. The claim that 2.5% of Iraqis couldn’t have died without leaving visible depopulation is very weak. Visible to whom? Certainly three or four times that number of people have left the country.
9. Incident counts under-report fatalities. So the fact that a population-based estimate comes up with a higher figure than adding up the incident counts does is no surprise. Whether the discrepancy in this case is so large as to cast doubt on the population-based estimate is a question for someone expert in both sets of methods and on what’s actually happening in Iraq. The intersection of that set with the set of bloggers may be empty.
10. The paper claims that one team of four surveyors could survey a cluster of forty households in a day. That seems odd, and calls for some explanation.
11. The interviewers asked for death certificates, and mostly saw them. But the estimated number of fatalities is much larger than the total mortality figures compiled by Moqtada al-Sadr’s Ministry of Health. Either the sampling is off, or the interviewers were lying, or the families were showing phony death certificates, or the local officials who produce death certificates aren’t reporting them to the Ministry of Health, or the Ministry is failing to add them up right, either deliberately or not. Perhaps someone could go to the local authorities and ask them for their totals. But it wasn’t incumbent on the Hopkins folks to do so.
12. If the incident-based counts have been rising, that tells us something about the trend, even if the level of the incident-based counts is below the level of the survey-based estimate. So to say that when John Murtha cites those numbers he’s casting doubt on the Lancet report is intellectually dishonest even beyond the warblogger norm.
13. Estimating a confidence interval (“error band”) around a point estimate is a way of being honest about how much you know and don’t know. The argument “this study has a big error band, therefore it’s not reliable” (more or less what Medpundit says) betrays a quite astonishing level of either deception or ignorance.
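A confidence interval is nothing more than the point estimate plus and minus a multiple of the standard error. A sketch in Python; the numbers are illustrative, chosen only to be of the same order of magnitude as the study's, not its actual standard error:

```python
import math

def normal_ci(point, se, z=1.96):
    """Two-sided ~95% confidence interval under a normal approximation."""
    return point - z * se, point + z * se

# Illustrative values only:
lo, hi = normal_ci(point=650_000, se=130_000)
# A wide interval says "we measured honestly and the data are noisy,"
# not "ignore this number."
```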
As so often Nietzsche was there first:
“I have done that,” says memory. “I could not have done that,” says pride. Eventually — memory yields.
Daniel Davies puts the controversy in what seems to me the right context:
First, don’t concentrate on the number 600,000 (or 655,000, depending on where you read). This is a point estimate of the number of excess Iraqi deaths – it’s basically equal to the change in the death rate since the invasion, multiplied by the population of Iraq, multiplied by three-and-a-quarter years. Point estimates are almost never the important results of statistical studies and I wish the statistics profession would stop printing them as headlines.
The question that this study was set up to answer was: as a result of the invasion, have things got better or worse in Iraq? And if they have got worse, have they got a little bit worse or a lot worse? Point estimates are only interesting in so far as they demonstrate or dramatise the answer to this question…
And the results were shocking. In the 18 months before the invasion, the sample reported 82 deaths, two of them from violence. In the 39 months since the invasion, the sample households had seen 547 deaths, 300 of them from violence. The death rate expressed as deaths per 1,000 per year had gone up from 5.5 to 13.3.
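Davies's multiplication can be reproduced directly. A sketch in Python; the death rates are taken from the excerpt above, while the population figure (roughly 26 million) is my assumption, not stated in the quoted text:

```python
# Excess deaths = (post-invasion rate - pre-invasion rate)
#                 * population * years elapsed.
rate_before = 5.5 / 1000        # deaths per person per year, pre-invasion
rate_after = 13.3 / 1000        # deaths per person per year, post-invasion
population = 26_000_000         # assumed rough population of Iraq
years = 3.25

excess = (rate_after - rate_before) * population * years  # ~659,000
```

That lands near the headline figure, which is the point: the headline is nothing more than this multiplication, and the real finding is the change in the death rate.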
Davies claims that the ratio between estimates of the number of dead from incident reports and the actual number is usually five or more. If that’s right, the discrepancy between the Iraq Body Count estimate and the Lancet estimate doesn’t look improbably large.