Re-reading five-plus years of the RBC (back then, the “community” was just me) on Iraqi policy isn’t a cheerful activity, though I was pleasantly surprised to find just how much skepticism I expressed at the time. Obviously, I should have expressed more.
Still, this email from a reader (received just after the previous two posts) struck me as making the error Aristotle warned of: expecting more certainty from an inquiry than its methods will support.
I’ve only started following your ‘Reality Based Community’ blog recently. Great name and great blog. But it was shocking to read that you fell for the war advertisements 5 years ago. I’m just a physicist, but I’ve attached the leaflet we were handing out the day the war started. The information was all readily available. What goes on in those Public Policy Schools?
To which I answered:
What goes on in these public policy schools is that we think as clearly as we can, with what information we have. (As it happens, I’m not an expert on foreign policy, so I was thinking on the basis of facts provided by others.) But we don’t own crystal balls. And you might recall that, back then, Bill Clinton, Tony Blair (not yet known as Mr. Bush’s poodle) and Vaclav Havel were all on the pro-war side of the argument.
In the sciences, it’s not considered discreditable to have proposed a theory that doesn’t stand up to experiment. That would be a useful lesson for politics. If you’re shocked that smart, thoughtful people make big mistakes, you might want to recalibrate your shock-meter.
On any yes-or-no question, the prior probability of being right by making a random guess is 0.5. So merely having reached the right conclusion once is no great sign of wisdom. The more you know and the smarter and more thoughtful you are, the more you can bias the odds in your favor. So having reached the wrong conclusion once is some evidence against one’s smarts, knowledge, thoughtfulness, or all three. But it’s not perfectly conclusive evidence. If you want to know whether Person X is likely to make correct guesses in the future based on X’s guessing record in the past, you need to review X’s approach to those previous questions, not just tot up right and wrong guesses.
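The point that one wrong (or right) guess is evidence, but weak evidence, is just Bayes' rule. Here is a minimal sketch under invented assumptions: suppose, hypothetically, a genuinely well-calibrated judge gets a hard yes-or-no question right 80% of the time, a coin-flipper gets it right 50% of the time, and we start agnostic between the two.

```python
def posterior_well_calibrated(prior, p_right_if_calibrated,
                              p_right_if_guessing, guessed_right):
    """Bayes' rule: updated probability that someone is well calibrated,
    after observing a single right or wrong guess.

    The 0.8 / 0.5 accuracy figures used below are hypothetical, chosen
    only to illustrate the argument, not drawn from any data.
    """
    # Probability of the observation under each hypothesis
    p_obs_cal = p_right_if_calibrated if guessed_right else 1 - p_right_if_calibrated
    p_obs_guess = p_right_if_guessing if guessed_right else 1 - p_right_if_guessing
    numerator = prior * p_obs_cal
    return numerator / (numerator + (1 - prior) * p_obs_guess)

# One wrong guess lowers the posterior, but far from conclusively:
after_wrong = posterior_well_calibrated(0.5, 0.8, 0.5, guessed_right=False)
# 0.5*0.2 / (0.5*0.2 + 0.5*0.5) = 0.1 / 0.35 ≈ 0.29 — still well above zero

# And one right guess raises it only modestly:
after_right = posterior_well_calibrated(0.5, 0.8, 0.5, guessed_right=True)
# 0.5*0.8 / (0.5*0.8 + 0.5*0.5) = 0.4 / 0.65 ≈ 0.62 — far from certainty
```

A single data point moves the posterior from 0.5 to roughly 0.29 or 0.62: evidence, but nothing like proof, which is why the paragraph above says to examine the approach behind the guesses rather than just tallying them.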
Of course, that implies that the future is not fully predictable, so one feature of X’s approach you want to look at very carefully is X’s awareness of just how unknown the unknowns are. Anyone who says, “I was right before, so I’m right now” is either a fool or a scoundrel.
Now of course there are some decisions that don’t, fundamentally, depend on predictions. If you think that going to war with Iraq was a crime, not just a blunder, then your position with regard to those who supported going to war is justifiably different. But in that case the finding that invading Iraq was a catastrophically disastrous blunder — easy to see now, unless you’re John McCain or George W. Bush or one of their worshippers, but harder to see in prospect — doesn’t really add much to your initial conviction that it was a crime.
It’s fair to expect those of us who got this one wrong to put different weights next time on the costs and risks of war compared to the costs and risks of the alternatives. But the next hard decision is going to be hard in some different way, and a simple reflex based on the “lessons of Iraq” won’t provide a definitive answer. Opponents of resistance to the Axis pointed to the “lessons of World War I,” and proponents of the War in Vietnam pointed to the “lessons of Munich.” Like Beethoven, history repeats, but always with variations.