Sample bias in the Lancet study?

Did the Hopkins team oversample high-mortality areas?

According to David Berreby at the Huffington Post, there seems to be a real challenge to the Hopkins study of deaths in Iraq published in the Lancet: maybe the sampling method, which started from major intersections, oversampled areas with higher-than-average mortality.

I don’t know nearly enough to judge whether this is right, but it seems plausible. It will be interesting to compare how opponents of the war react to this challenge with how supporters of the war reacted to the original study.

I predict a substantially less accusatory and hysterical tone; the authors of the objection are very unlikely to be subjected to the level of vituperation directed at the authors of the study.

Update: Tim Lambert says the new critique doesn’t amount to much.

Author: Mark Kleiman

Professor of Public Policy at the NYU Marron Institute for Urban Management and editor of the Journal of Drug Policy Analysis. Teaches about the methods of policy analysis for drug abuse control and crime control policy, working out the implications of two principles: that swift and certain sanctions don't have to be severe to be effective, and that well-designed threats usually don't have to be carried out.

Books:
Drugs and Drug Policy: What Everyone Needs to Know (with Jonathan Caulkins and Angela Hawken)
When Brute Force Fails: How to Have Less Crime and Less Punishment (Princeton, 2009; named one of the "books of the year" by The Economist)
Against Excess: Drug Policy for Results (Basic, 1993)
Marijuana: Costs of Abuse, Costs of Control (Greenwood, 1989)

UCLA Homepage | Curriculum Vitae
Contact: Markarkleiman-at-gmail.com

12 thoughts on “Sample bias in the Lancet study?”

  1. I may not be a statistician, but I did ace statistics at (Michigan) Tech, and one thing I remember is something nobody defending the Lancet study seems to want to address:
    Sampling errors aren't the only source of error in polling. Frequently they're not even the largest error in a poll. They're just the easiest to calculate.
    People frequently lie to pollsters here in the US, to avoid nothing more than embarrassment. In a war zone, where the wrong answer might get their families murdered, why would we assume that pollsters are getting accurate answers from their samples?

  2. This is just a guess from an average guy, but judging from the fact that the interviewers apparently all survived, it seems likely they were working in areas of lower than average danger.
    Either that, or the press corps in Baghdad are all lounging around the hotel pool when they should be out doing interviews.

  3. Catowner,
    I have several friends working as journalists in Baghdad and Iraq. While it's true they don't do as many interviews as they should, your assumption about the reason is incorrect; it has nothing to do with pools. The reason is that it's simply too dangerous, for both the journalist and the interviewee, for it to happen as though this were occurring in Mayberry.
    Perhaps it's time to change channels over from Fox… But then, I can't think of a televised alternative that would help…

  4. Brett:
    Of course misreporting is possible. But what reason is there to think it happened in this case? Note that each household was asked about births and deaths before and after the invasion … sorry, I meant "liberation." The study looked at the difference. The pre-liberation births and deaths, projected onto the national population, closely matched the official counts.
    So you need to come up with a story about why families decided to exaggerate the numbers of their dead after the liberation but not before.
    Possible? Sure. But not very plausible.

  5. DeLong had this, too. I think the criticism ranges from false to misspecified. My comment is here: http://delong.typepad.com/sdj/2006/10/back_alleys
    ——————————————————
    I could see a slight bias. Did Johnson and Holloway examine the actual data set, or were they working off the description? I do not have the data and a map, but "all the households surveyed by the Lancet authors were on main roads or at intersections of smaller streets with major arteries" seems false.
    The study selected a "main street" (which is not a "major artery" in my book, but again I don't have the data and a map), then selected a residential street which intersected it (i.e., not all but none of the households sampled were "on" main roads). Somewhere on that street (not at the corner except by chance), a start household was selected, then 40 adjacent households were sampled.
    Hence, as I said, I could see a slight sampling bias for corners along main roads versus the previous GPS method. However, I also see a slight sampling bias against main roads off the corners. Based on the pullquote cited, I'd say Johnson and Holloway make a misleading and factually incorrect assertion.
    If they actually ran the numbers and found bias, I retract this analysis. However, I bet they didn't.

  6. This is from the Science article:
    "…Burnham counters that such streets were included and that the methods section of the published paper is oversimplified. He also told Science that he does not know exactly how the Iraqi team conducted its survey; the details about neighborhoods surveyed were destroyed 'in case they fell into the wrong hands and could increase the risks to residents.'"
    Well I guess we'll never know.

  7. Lambert's post:
    scienceblogs.com/deltoid/2006/10/science_on_lancet_study.php
    Seems to me he's getting a bit out ahead of the facts of the matter, but we'll see.

  8. The indefatigable Brett rears his ugly head again. Brett, it's also likely that the respondents *minimized* the number of deaths – that there was substantial underreporting. Like the 'didn't see nothing' attitude that can confound police efforts when investigating shootings in dangerous neighborhoods, where nobody wants a reputation as a 'snitch'.

  9. Brett Bellmore at October 21, 2006 07:00 AM
    a war zone, where the wrong answer might get their families murdered, why would we assume that pollsters are getting accurate answers from their samples?
    Oh, Brett, don't be silly. Maybe because, in most cases in which the interviewees claimed to have had a death in the family, they asked and usually got death certificates. And in most of those cases, cause of death was indicated as being something like gunshot wounds. You really should read the report of the study before commenting on it.
    It strikes me that, if there were false answers for the reason you mention, the death count would be under-reported, not the other way around.
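On comment #4's validation point, "projected onto the national population" is just scaling the sampled rate. A minimal sketch of that arithmetic, with every number invented for the example (these are not the study's actual figures):

```python
# Scale a sampled crude death rate up to a national total.
# All numbers below are invented for illustration only.
sample_person_years = 50_000    # person-years of exposure observed in the survey
sample_deaths = 275             # deaths reported by the sampled households
population = 26_000_000         # rough national population

crude_rate = sample_deaths / sample_person_years   # deaths per person-year
national_deaths = crude_rate * population          # projected deaths per year

print(f"crude rate: {crude_rate:.4f} per person-year")   # 0.0055
print(f"projected:  {national_deaths:,.0f} deaths per year")
```

The study's actual check was whether this kind of projection, done with the pre-invasion responses, matched the official pre-invasion counts — which is why "families exaggerated after but not before" is the story a skeptic has to tell.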
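Comment #5's point — that the size and direction of any bias depend on exactly where the 40-household chain starts — can be made concrete with a toy simulation. Everything here is invented: the street counts, the risk numbers, and the "risk is higher near a main street" premise of the critique; this is a sketch of the argument, not a model of Iraq.

```python
import random

random.seed(0)

# Toy city: residential cross-streets, households indexed by distance
# from the corner where the street meets a main road. Assume, per the
# critique's premise, that risk is elevated near the main road.
N_STREETS = 200          # residential cross-streets (invented)
HOUSES_PER_STREET = 100  # households per street, indexed from the corner
BASE_RISK = 0.01         # baseline chance a household reports a death
NEAR_MAIN_BONUS = 0.03   # extra risk for the 20 households nearest the corner

def household_death(position):
    """Return 1 if this household reports a death, else 0."""
    risk = BASE_RISK + (NEAR_MAIN_BONUS if position < 20 else 0.0)
    return 1 if random.random() < risk else 0

city = [[household_death(h) for h in range(HOUSES_PER_STREET)]
        for _ in range(N_STREETS)]

def cluster_estimate(n_clusters=500, cluster=40, corner_start=False):
    """Estimate the death rate by sampling `cluster` adjacent households.

    corner_start=True pins the chain to the intersection (the critique's
    reading of the method); corner_start=False starts it anywhere on the
    street (the reading in comment #5).
    """
    deaths = sampled = 0
    for _ in range(n_clusters):
        street = random.choice(city)
        start = 0 if corner_start else random.randrange(len(street) - cluster)
        deaths += sum(street[start:start + cluster])
        sampled += cluster
    return deaths / sampled

print("corner-anchored estimate:", cluster_estimate(corner_start=True))
print("anywhere-start estimate: ", cluster_estimate(corner_start=False))
```

With these invented numbers the true city-wide rate is 0.016; the corner-anchored scheme should overshoot it, while the anywhere-start scheme should come in slightly under — roughly comment #5's point that a corner bias toward main roads coexists with a slight bias against main-road households away from the corners.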

Comments are closed.