Against the mushroom strategy

Philippe de Croy links to a newspaper account of a RAND researcher saying that if the sort of asteroid that ended the Cretaceous Period and wiped out the dinosaurs were heading for the Earth, the scientists who found out about it, and the government officials they told, shouldn’t tell the rest of us.

Philippe uses this as the jumping-off point for a quite sensible discussion of the “terror alert” system, reaching the conclusion, with which I heartily agree, that when it comes to warnings, less is more. (He also points out that the incentive system facing officials pushes them in the direction of over-warning.) Philippe makes the important broader point that it would help if the population cultivated an attitude of calm rather than one of panic. Since the terrorist threat doesn’t add significantly to the risks of daily life, the most sensible approach in personal decision-making is to ignore it. Staying off airplanes because you don’t want to wait in security lines is sensible; staying off airplanes for fear of hijackers is simply loony, and socially irresponsible to boot.

The same is true of crime; the vast bulk of the social cost of criminal activity is due to precautions against victimization rather than actual victimization. Would you rather hear that your city’s crime rate had been cut in half, or that its auto-accident rate had been cut in half? Right. Me, too. Yet auto accidents account for roughly twice as many deaths as criminal activity, and for far more serious injury and property loss. The bulk of the very large social costs of crime stem from crime-avoidance behavior, ranging from staying out of the park after dark to buying an alarm system to moving, or relocating your business, to a safer neighborhood. (Some of that crime-avoidance behavior is fully rational at the individual level. But it still has costs, some of which land on other people: a good reason to have a publicly-supported crime-victim-compensation program.)

The cognitive psychologists have shown that the mind tends to process both tiny risks and moderate-sized risks as small risks. And for something really scary, even a vanishingly small risk is seen as intolerable, as long as the risk has a name to put on it. In the face of imperfect rationality, perfect information is not necessarily a social benefit.

So you might expect me to agree with RAND’s Mr. Sommer that scientists and officials ought to adopt a policy of not telling us what we would be better off not knowing. But I don’t.


I have two objections to Mr. Sommer’s analysis, one substantive and fairly small, the other moral and political and much more serious.

The substantive point is that the inevitability of a huge asteroid impact doesn’t mean that there isn’t anything to do about it. Given a week, or a month, or three months, before the species was extinguished, I’d want to say farewell to my loved ones, make my peace with my gods if any, and eat more cheese and dessert. I’d also reschedule any dental work and spend more time hiking and less time cleaning out the garage and reviewing articles for scholarly journals. [It’s even possible that the survival probability from a big asteroid hit is non-zero, and that I might be able to do some things to make my survival more likely or the future of the human remnant less terrible.]

But perhaps Mr. Sommer has factored all this in, and decided that total human welfare would be less under conditions of end-time panic than it would be under conditions of blissful ignorance. That leads into my other objection. He has no right to make that decision for the rest of humankind, and neither does anyone else.

Even if you dissent from that moral stance, there remains a practical problem. A known or suspected governmental policy of denying that the Big Asteroid is coming if ever it does come will tend to increase the credibility of rumors that “The Asteroid is coming and the government is covering it up.” Since rumors are more common than Big Asteroids, it’s not clear that the dishonest policy will reduce total social cost, even on the paternalistic calculation.

Moreover, there’s the political problem. Governmental lying is fundamentally incompatible with the maintenance of a democratic republic. Some amount of lying and secrecy are inseparable from the conduct of foreign policy, intelligence activity, law enforcement, and military affairs, but that is a bug, not a feature, and we ought to try hard to keep the necessary deception of enemies and criminals from lapping over into the deception of the voters, who after all hold the supreme office.

And as Lincoln pointed out, credibility, unlike classified information, cannot be compartmentalized. A government known to be, or suspected of being, willing to lie to us about the Big Asteroid will also be suspected — not unjustly — of being willing to lie to us about, for example, Iraq’s acquisition of weapons of mass destruction or Iraqi connections to al-Qaeda.

In governing a democratic society, honesty really is the best policy.

Author: Mark Kleiman

Professor of Public Policy at the NYU Marron Institute for Urban Management and editor of the Journal of Drug Policy Analysis. Teaches the methods of policy analysis as applied to drug abuse control and crime control policy, working out the implications of two principles: that swift and certain sanctions don’t have to be severe to be effective, and that well-designed threats usually don’t have to be carried out. Books: Drugs and Drug Policy: What Everyone Needs to Know (with Jonathan Caulkins and Angela Hawken); When Brute Force Fails: How to Have Less Crime and Less Punishment (Princeton, 2009; named one of the “books of the year” by The Economist); Against Excess: Drug Policy for Results (Basic, 1993); Marijuana: Costs of Abuse, Costs of Control (Greenwood, 1989).