Although I don’t expect Al Gore to run for president (and it probably would not be healthy for our political system if he did), he is still the only contender (besides Kucinich and probably Kerry) who unequivocally opposed the invasion of Iraq from Day One. (Ironically, potential Republican candidate Ron Paul did too.) Gore opposed it loudly and forcefully, with passionate and well-reasoned rhetoric, and he repeated this message over and over again to anyone willing to hear.
Here’s a Dale Keiger article about the biostatisticians who issued the report on the increased mortality rate in Iraq caused by the invasion. (You know, the one estimating that the invasion caused roughly 650,000 excess deaths.) One of the interviewed biostatisticians comments:
As a biostatistician, the Bloomberg School’s Zeger has thought a lot about the study. “I am so impressed by Gil because he was able to conduct a scientific survey on a shoestring budget under very difficult circumstances,” he says. He does not dismiss all concerns about the methodology. “It was the best science that could be done under the circumstances. We’re always making decisions absent scientific-quality data — that’s public health practice.” But he draws an important distinction between practice and science. “We tend to have a different standard for scientific research. This study was on the research end. It was published in a scientific journal. There are a lot of aspects that are below the reporting standards you would have if you were doing a U.S. clinical trial, for example: the documentation for each case, the ability to reproduce the results, detailed information about how everything was done. I think it would be useful for the school and the public health community to think through these kinds of issues.
“[But] it’s absolutely appropriate, on very limited resources, to go into a place like Iraq and make an estimate of excess mortality to use in planning and making decisions. My own sense is I would rather err on the side of generating potentially useful data, with all of the caveats. I think noisy data is better than no data.” Zeger notes that the tests of the data’s validity, built into the second survey at his recommendation, all checked out. He admits the numbers are hard to grasp, especially the study’s estimate that from June 2005 to June 2006, Iraqis were dying at a rate of 1,000 per day. “That’s a lot of bodies,” he says. “I have a hard time getting my mind around that. But as a scientist, what do you do? That’s the number.”
The paper’s authors talk about the study’s implications:
In the debate over the Lancet paper, Burnham and Roberts sometimes found themselves making an argument that was awkward for scientists: The accuracy of our figures is not the most important aspect of this research. Yes, we are sure of the data. Yes, we are confident of our estimate. Of course the figures are important. But even if the number 654,965 is wrong, that’s not the point. People whom the U.S. has a duty to protect are dying. That’s the point.
Roberts told the Australian radio program The National Interest, “The huge cost is how many people have died. For [official estimates] to be consistently downplaying that by a factor of 10 prevents us, I think, from justly evaluating this venture. For political leaders to be talking in swaggering tones, saying things like ‘better we’re over there fighting them than they’re over here fighting us,’ I think is exceedingly inappropriate at a time when virtually every extended family in Iraq has lost someone. We think 2.5 percent of the population has died, and at this moment in history it is really, really important for the political leaders who began this venture to be expressing contrition.”