I’m observing a remarkable constellation of very high information-density items popping up in the news. My favorite, just on aggregate looniness grounds, is that (i) Grover Norquist has arranged for the (ii) CPAC to be infiltrated and subverted by the (iii) Muslim Brotherhood. Then there’s the astonishing 1998 finding that vaccinations cause autism. A Cornell psychologist has demonstrated precognition. In the “news you can really use” department, why we don’t need to worry about air fares this summer. And on the other side, we learn that the strongest correlate of rejection of the avalanche of consistent, varied, tested evidence of anthropogenic global warming is personal political preference.
The information in a signal, I hasten to point out, is the negative logarithm of its prior probability of reception, and that probability is a property of whoever is receiving it. This can get complicated; it’s not surprising to me that Frank Gaffney at any moment will say something totally batsh.t, but “the CPAC is lousy with Muslim terrorists” was probably quite surprising to the ACU’s people. The information in a signal is not at all the same as the truth of the signal’s content, though; that a signal “A is B” has very low probability for receiver X is equivalent to “X believes A is not B”, so there’s the issue of base rates. Most people are Bayesians: they integrate new information with what they already know, and when they hear hoofbeats, think horses, not zebras. And not, in Arthurian England, coconuts; the discussion of these in the movie raises the further issue of how fairly patent evidence should be integrated with the complete absence of any plausible theory to explain it, relevant to precognition and all the other psychofakery out from under which James Randi keeps pulling the rug. Global warming, early on, was a coconut among the gold standard, cold fusion, and lots of zebras and unicorns, but now it’s not; it doesn’t do to ignore unlikely signals, any more than it’s wise to embrace something just because it has a high wow index.
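The two quantitative ideas here, surprisal as the negative log of prior probability and base rates swamping weak evidence, can be sketched in a few lines of Python (the probabilities below are made up purely for illustration):

```python
import math

def surprisal_bits(p):
    """Information in a received signal, in bits: -log2 of its prior
    probability of reception (a property of the receiver)."""
    return -math.log2(p)

# A coin-flip-likely message carries 1 bit; a one-in-1024 shocker carries 10.
print(surprisal_bits(0.5))      # 1.0
print(surprisal_bits(1 / 1024)) # 10.0

def posterior(prior, p_ev_given_h, p_ev_given_not_h):
    """Bayes' rule: update a prior on hypothesis h given evidence."""
    num = p_ev_given_h * prior
    return num / (num + p_ev_given_not_h * (1 - prior))

# Hoofbeats sound the same for horses and zebras (equal likelihoods),
# so hearing them leaves a rare-zebra prior of 0.001 essentially unmoved.
print(posterior(0.001, 0.9, 0.9))  # ~0.001
```

Equal likelihoods cancel in the update, which is the point: evidence only moves belief to the extent it discriminates between hypotheses, and a tiny base rate takes strongly discriminating evidence to overcome.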
Integrated into what? It would appear, belief. But it’s not so clear what this is. Statements of belief are an indicator, but no more, because talk is cheap and anyway the best predictor of what we say is what we want to be heard saying, especially what we want to hear ourselves saying. Irretrievable commitment of resources is a better one; being late for a meeting because you waited for a traffic light to cross a busy street is pretty good evidence of what you believe about the physics of vehicle-pedestrian impact. But not perfect: global warming deniers almost certainly include some folks who would rather make more money selling fossil fuels in the near term than leave future generations a habitable planet; Andrew Wakefield seems to have simply sold the children of strangers for a big fat bribe.
Belief is very social. It’s not just that the people you hang out with greatly bias your sampling of evidence in their conversation, but that they reinforce this or that view of your own worth. I wouldn’t be surprised if it’s now so comfortable for Wakefield to think of himself as a fighter for truth and children’s health, basking in the adulation of Jenny McCarthy and her friends, that he really believes the nonsense he unleashed. (He should still be in jail.) And of course reputation and self-regard are resources, so the more of them you’ve bet on filling a 10-Q-K-A straight with three jacks showing in other players’ face cards, the harder it is to fold.
There’s no escape from engaging with surprising propositions. Any one of them is probably wrong, but you have a lot to gain by acting on the ones that aren’t. Base rates and priors (not hopes and ego!) are relevant and useful; it should have taken a lot more than twelve kids and one paper to derail all the other evidence of the net salutary effects of vaccination. “Follow the money” never hurts: cui bono? Finally, there’s a utility test, as in, “so what?” As an engineer I was trained to respect empirical equations, like the ones we still use for turbulent flow in pipes even while we hope for something better, because they are useful and the more elegant theoretical model will trash your plumbing. Conversely, an amazing proposition with no utility is probably just wrong: they took the thimerosal out of the vaccines and autism rates didn’t change. A good diagnostic of pseudoscience like pyramids and spoon-bending is the enduring triviality of the purposes to which it can be put; anyone who can really move objects at a distance by will would be disarming bombs, not bending spoons; who wants his spoons bent? If you can see the future, you will be cleaning up on Wall Street, not predicting that a dirty picture will pop up on a computer screen. Even if a pyramid over my sandwich makes it spoil more slowly, why is that better than the refrigerator (not to mention, how does the pyramid know not to do its other miracle, accelerating decay, which is promised if I put it over my compost pile)?
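Those empirical pipe-flow equations make the point concretely. One standard example (my choice, not named above) is the Blasius correlation for the Darcy friction factor in smooth pipes, a pure curve fit with no first-principles derivation, set against the elegant laminar-flow theory that is exactly right in its own regime and badly wrong outside it:

```python
def friction_laminar(re):
    """Hagen-Poiseuille theory: exact, but only for laminar flow (Re < ~2300)."""
    return 64.0 / re

def friction_blasius(re):
    """Blasius correlation for turbulent flow in smooth pipes,
    roughly valid for 4e3 < Re < 1e5. Fitted to data, not derived."""
    return 0.3164 * re ** -0.25

re = 50_000  # well into the turbulent regime
print(friction_blasius(re))  # ~0.021 (the useful empirical answer)
print(friction_laminar(re))  # ~0.0013: the elegant theory, misapplied,
                             # underpredicts friction by roughly 16x
```

That factor-of-sixteen gap is what “trash your plumbing” looks like in numbers: pressure-drop calculations built on the prettier formula, used where it doesn’t hold, will undersize every pump in the building.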