The Christian closet

Why are evangelical Christians who teach at top-rank universities “in the closet” about their beliefs?

I’m glad to see that Pete Stark has decided to step out of the “Godless closet.” He’s surely not the only unbeliever in the Congress, but he’s the only one with the nerve to say so, and on behalf of the great congregation of the faithless I thank him. (Wait! He’s not actually faithless, he’s a Unitarian. That’s someone who believes in, at most, one God.)

There’s no doubt that the bigotry of the Godly confronts atheists in public life with a choice between … imprecision … about their beliefs and ostracism. No doubt the same is true in other settings: the Air Force, for example, or the FBI, or Amway, or simply high-religiosity neighborhoods or towns, especially in the South and Midwest.

But of course we atheists are different, right? We don’t think people are going to Hell for not disbelieving the same things we disbelieve, so we have no particular reason to persecute. As a result, Evangelical Christians in, let’s say, academia, don’t find themselves forced into a closet.

Oh, wait. (See “Sermon on Doubt,” below. The punchline comes in the last few paragraphs.)

Just imagine that you’re a recent Ph.D. looking for a first academic job. You happen to be an observant Christian or Jew, and consequently you customarily say a prayer of thanks before eating. You’ve just given your job talk, and you’re going out to dinner with the folks who get to decide whether you’re going to get a tenure-track job or instead be consigned to the life of an academic gypsy. Are you going to pray out loud, or even ask for a moment of silence before the meal?

Seriously, now.

If you think this is a secret from the Godly, or if you think they don’t resent it, or if you think their resentment doesn’t contribute to their bigotry against freethinkers, all I can say is, I think different. If I may borrow a phrase from a religious text, we’d do better in taking the motes out of Jerry Falwell’s eyes if we’d take the beam out of our own eyes first.

Update: Several emails from readers on this, none sympathetic.

One denies that the phenomenon exists: “the plural of anecdote is not data.” Surveys show that religious people don’t like non-religious people; somehow that’s supposed to prove that non-religious people don’t discriminate against religious people when they have the chance.

Two others laugh it off, concentrating on the small issue of saying grace before meals rather than the serious issue of people who have to be “in the closet” about their most deeply-held beliefs in the presence of their professional colleagues.

If your faith requires you to roll your eyes back in your head and speak in tongues before eating, you ought to be comfortable doing so. If you want to ask the others to join you in doing so, why not? If the people in the dept. you’re applying to can’t deal with it, you don’t want to be there.

Get it? Those “born-agains” are weird, so whatever bad happens to them is funny.

The fortitude with which people endure the sufferings of others, and especially of others they don’t care about, is a testimony to the greatness of the human spirit.


A Sermon on Doubt

Mark A.R. Kleiman

Dearly beloved, my sermon for today is on the text, “Doubt is the chastity of the mind.” (Can anyone give me the chapter and verse?)

Faith (as a euphemism for organized religion) was much in the news, until the Iraqi crisis started to wag the dog, thanks to the President’s “faith-based initiative.” Even before I learned that one of Pat Robertson’s organizations was an early grant recipient, I had dark suspicions that most of the content of the plan is taking public money away from liberal secular non-profits in order to give it to right-wing churches (and black churches that are, if not politically conservative, at least willing to do business with Republicans). Not only will the new “faith-based” grantees be permitted to discriminate in hiring in favor of their own congregants, they will be able to avoid all sorts of regulatory requirements as well.

It would be unfair to characterize the initiative as covering only areas where the secular contractors to be displaced weren’t contributors to the Bush campaign, or to areas where success isn’t mission-critical. Still, as far as I can tell churches won’t be allowed to compete to produce electric power or to build weapons systems for the Defense Department, although from an objective viewpoint prayer seems at least as likely to provide an effective missile defense as the secular alternatives now under consideration.

Moreover, in practice some faiths will surely turn out to be more equal than others when the goodies get doled out: Jerry Falwell has already started yelling about how bad it would be if any of his tax dollars went to support a mosque.

[Falwell pointed out, accurately if rudely, how intolerant al-Islam often proves when its adherents constitute a majority. He might have pointed out that the death penalty provided by Shari’a for apostasy makes Islamization a kind of trap door, and added something about the destruction of the Standing Buddhas by the Taliban.

[However, someone might have pointed out to him that the death penalty for apostasy and the destruction of idols are both commanded in general and praised in particular cases in the Bible, which Falwell proclaims as the inerrant Word of God.]

As one who sides with Jefferson and Voltaire against Franklin about the net social effect of hierarchical and congregational religion (which gives me pause, because I rarely disagree with Franklin or agree with the other two), I’m tempted to hope that Bush will undo what seems to have been Jefferson’s strategic mistake. By insulating religion from government in an attempt to starve it financially and minimize its political clout, Jefferson’s “wall of separation” has kept American religion independent, competitive, and (lamentably, in my view) vigorous. Contrast England or Sweden, where established churches have resulted in widespread indifference if not active unbelief. Perhaps if we have the Christian Right churches sucking at the public teat for a few years, we can put the Fourth Great Awakening behind us.

But wait a minute. The evidence that organized religion makes things worse on balance simply isn’t up to the standards I like to think I normally ask for before holding a strong belief. Yes, lots of churches, now and in the past, manage to spread hatred and intolerance, and resist both the growth of knowledge and measures to reduce needless suffering. But lots of other churches (and even the same churches) also constitute social capital, catalyzing public goods contributions of various kinds, and convincing some people who would otherwise doubt it that there is a difference between right and wrong. How the two sets of effects balance out has to be anyone’s guess. And at a personal level it is manifest that religion is a great source of solace to many believers.

So what, I ask myself, is really bugging me here?

It can be very helpful sometimes to see one’s own thoughts expressed in their stupidest and most vulgar form. An engineering professor from USC performed this service for me about a year ago, in an op-ed in the LA Times. His point was that faith means believing things clearly false (though he failed to quote Will Rogers: “Faith means believing what you know ain’t so”), or that one has no reason to think are true, and that counting such belief as a virtue is ridiculous.

Now of course at some level I know better than that. “Faith” can mean “keeping faith,” as opposed to apostasizing when the pressure gets too great or the price is right. That’s obviously a virtue. Faith can also, as C.S. Lewis points out, mean the capacity of holding on to well-grounded beliefs when under psychological pressure to abandon them (due, for example, to cognitive dissonance effects or pseudo-evidence against them created by illusion or sample bias); faith in this sense is the intellectual analogue of fortitude. There are good secular examples of such faith, such as the faith required to get through withdrawal from drug dependency, or to remember that making six good stock market guesses in a row doesn’t mean that you’re a financial genius.

Faith can also mean maintaining a link to a community and a tradition. As Karl Popper points out in “Toward a Rational Theory of Tradition,” the notion that “science” or “reason” allows us to “break free of tradition” and “think for ourselves” is nonsense. No individual can make any substantial progress toward truth alone, or starting from scratch; if each of us starts out where Adam and Eve started out intellectually, none of us should expect to get much further than they did. (In this sense, “No salvation outside the Church” applies as well to physics as to theology; science is necessarily a social enterprise, and theories not offered for review by the scientific ecclesia aren’t science.) The question, Popper points out, is whether traditions are accepted dogmatically or instead approached critically; science he takes to be fundamentally the practice of subjecting received beliefs about the world to critical tests, and changing them when the evidence warrants change.

When it comes to moral knowledge, the need for cooperation over time is even more obvious. Pure reasoning can’t answer the question “What should I do?” The right answer depends, in part, on a pre-existing body of practices and understandings shared by a social group.

Of course those customs change over time (a matter fundamentalists like to forget), and of course they can always be twisted to some extent by self-interest, self-deception, and social power and prejudice. Still, a body of customs, especially if written down, isn’t infinitely malleable at any one moment. Such traditions thus tend to put limits on selfishness and the power of the powerful, while also providing focal points for cooperative behavior. That’s why a written law code, even one embodying fairly appalling levels of injustice, still represents progress compared to rule by the whim of the rulers.

Religious traditions have something of the same character. Text and tradition can challenge prejudice and self-interest. For example, after the Ponzi scheme called “The Foundation for New Era Philanthropy” broke down, it emerged that some of the early participants, who actually came out ahead on the deal, were fundamentalist churches and related schools. There was never a hint that these groups had been complicit in the scheme; they were merely lucky to be net beneficiaries rather than victims. Legally, it wasn’t clear that they had any obligation to help make the losers whole, but, after a reportedly prayerful day-long meeting, they decided to give everything back. One of them told a reporter, “Nobody could cite any Scripture that said we could keep the money.” (Of course, from the outside, it’s easy to be disappointed in how rarely this happens; white Southern churches didn’t seem to find that the parable of the Good Samaritan implied anything in particular about race relations, and the Christian Coalition didn’t seem to think that the commandment “The stranger among you thou shalt not oppress” had any relevance to the question of denying medical care to the children of illegal immigrants. I claim only that sometimes traditions make people do good things they otherwise wouldn’t prefer to do.) There is also a small amount of “anecdata” to suggest that neoconservative intellectual converts to born-again Christianity sometimes find the texts of their new-found faith grinding against their old political beliefs: see Glenn Loury’s forthcoming book, Changing My Mind. Religious traditions about moral behavior can both supplement public custom, as embodied in law and informal practice, and contribute to its development.

Custom, and perhaps especially custom backed by religion, can also help people avoid the individual and social behavioral traps that arise when behavior maximizes locally rather than globally. Both required days of rest and prescribed feasts can serve in this way.

So if “faith” means giving deference to traditional bodies of moral doctrine as embodied in religious denominations, it can easily be a good thing individually and socially.

But “faith” also means “dogmatic belief”: for example, the belief that every line in the Bible is the inspired and literally inerrant word of God (presumably as dictated directly to King James). This, it seems to me, as it seemed to that engineer, is not a virtue. But it is exactly the kind of thing that gets described, and praised, as “faith” in current discourse.

As in this instance, the label “faith” is often placed on beliefs that are not merely unjustified but demonstrably untrue; the Bible can’t possibly be inerrant, since it’s not even internally consistent. (What was created on the Third Day? How many pairs of animals went on the Ark? Who was St. Joseph’s father? That these are the stock questions of village skeptics, and at least as old as Tom Paine, doesn’t make them any less telling.)

Again, I’m not really so obtuse as to think that religious beliefs are simply false, in the sense that thinking that Tallahassee is the capital of Idaho is false, or even in the sense that Aristotle’s theory of motion is false. Absolute truth about, e.g., the relationship between consciousness and the cosmos is not directly available to us, and partial truth and metaphor may be as close as the limitations of our minds and our languages will let us get. Unlike true propositions, appropriate metaphors need not be mutually consistent. I can approve of what Socrates says about his service of Apollo without believing that Apollo is real, either as Socrates is real or as the law of gravitation is real.

Heraclitus said, “That which alone is wise and good does, and does not, permit itself to be called ‘Zeus.’ ” The Jewish scruple against pronouncing the Name of God is usually understood to be a guard against impiety or against magic-working; but it might also be understood as a tacit admission that, as Lao-tse says, “The name that can be named is not the eternal Name.”

Taking religious language as poetic rather than prosaic means that simple categories of truth and falsehood don’t directly apply. That’s what allows the mystics to claim that they all see the same transcendent reality, while each affirms as well his or her own adherence to a particular faith tradition. And once we understand traditional moral teachings as deserving of respect merely as the traditions of the group – more or less like Constitutional law – their authority is no longer entirely bound up with the passages in various sacred texts that seem to embody factual assertions about cosmology, biology, or history, or even to the obsolete moral content of those same texts.

But of course that sort of liberal, relativizing pap is exactly what the churches of the Christian Right are in revolt against, and the Catholic Church has always rejected. No one who says that the Creation, or especially the Resurrection, is a metaphor need apply for a job in any publicly-supported “faith-based” social service program they run. It’s a pretty good generalization that the churches that more or less insist that they are right and everyone else is wrong, because they have a direct line to the Divine Will (whether from the Apostolic Succession or the Word Alone) – the churches that are most literal-minded about their beliefs, and insist as much on the traditional stories as the traditional morals, attributing both to God rather than human effort and pretending that they don’t change over time – have been growing, and the theologically “liberal” churches have been shrinking.

That is no accident. Starting from a strong set of beliefs backed by divine sanction, it’s possible to reason oneself into a less dogmatic frame of mind, while maintaining the intensity that came from the original dogmatism. But can such a sophisticated belief set be passed down from generation to generation? Children are highly imaginative, but also strongly literal-minded. If you don’t teach a child that, say, Methodism is true (in a sense that means that the alternatives are false) then how serious a Methodist is he likely to be? It has been asserted – I don’t know with how much accuracy – that third-generation Reform Jews are very rare. The Unitarians who revolted against New England Congregationalism around 1800 took their religion with enormous seriousness. How are the mighty fallen! Even those of us who find Unitarian doctrine and practice relatively attractive find it hard not to laugh at Unitarian jokes.

(Did you hear about the Jehovah’s Unitarians? They go door to door – for no particular reason.

How can you tell the Unitarian church? It’s the one with the question mark on the steeple.

What does a Unitarian believe in? One God – at most.)

It’s just not clear that one can sustain a church over time without claiming more special access to truth than an appropriate epistemological and theological humility would allow. Consider how Buddhism and Taoism, each started in a purifying reaction (from Hinduism and Chinese ritual practice, respectively), have become encrusted with elaborate superstition.

To my mind, then, the doctrines (positive and normative) of any given “faith,” as actually preached and believed by most of those who preach and believe it, are mostly false, inconsistent with reason and evidence, inconsistent internally, and of course inconsistent with the tenets of other “faiths.”

I too have a creed. (Can I get a grant?) As far as I know, it was first preached by Socrates, though I learned it from my parents and first read it in one of Bertrand Russell’s books. Here it is:

Believing false things is wrong.

Cognitive limitations and the complexity of the universe make false belief inevitable, but it should be avoided as much as possible. Any given false belief, once identified, should be promptly abandoned, and its sources examined in an attempt to root out other false beliefs deriving from the same error and to prevent the introduction of new false beliefs.

Holding on to a belief known to be false out of an unwillingness to change one’s mind, or be seen to have changed one’s mind (Emerson’s “foolish consistency”), or because continuing to hold the belief is comfortable, or socially approved of, is contemptible. Spreading false belief, especially to children and others unable to defend themselves intellectually, is shameful.

So that’s why I’m so willing to believe that “faith” has bad results. I want it to have bad results, because I think it deserves to have bad results.

Or to put the matter more bluntly, I think that false belief is sinful. Human nature makes that sin inevitable, but each instance of it is to be repented with a firm purpose of amendment, and the will should be trained to resist it insofar as possible. Error (material heresy) is venial; stubbornness in error (formal heresy) is mortal. As to those who tempt the innocent into the habit of false belief, better it were that a millstone should be tied around their necks, etc., etc.

The sin of false belief, like other sins, often has bad consequences. Often, it leads to unsuccessful action. Sometimes, it leads to injustice. One good reason to discipline oneself to believe only what evidence and argument support is that the habit of believing what is false makes it easy to rationalize wrongdoing. In political terms, it is clearly useful for rulers to act on true belief; since democracy means that everyone shares in ruling, a democratic polity depends in part on the intellectual integrity of all the citizens. (“Democracy,” said George Bernard Shaw, “will never be a workable form of government until the common man resents a fallacy as much as an insult.”) Believing that the sun stood still for Joshua ought to be good practice for believing Bush’s arguments for his tax cut (or, to be even-handed about it, believing that IQ tests don’t measure anything real).

But the wrongfulness of believing what is false is not, as I feel it, exhausted by its bad consequences. Since the human mind is, to some extent, capable of conforming itself to the universe, it ought to try to do so. Refusing to make that effort is a kind of impiety, and forfeits an important part of the human dignity that comes from belonging to a species that has learned so many hard-to-find truths. It’s plain disgusting, that’s what it is.

If I sound like Jerry Falwell talking about fornication, that, too, is no accident. My attitude about believing and talking bullshit contains pretty much the same mix of anger and disgust, of concerns about personal and social consequences and concerns about uncleanness, as his attitude about non-marital sex.

In principle, I hate the false belief but love the false believer – knowing myself to have many false beliefs, some of which result from laziness or selfishness – but when I see someone just wallowing in false belief that principle tends to get relaxed a little bit.

Of course, false belief sometimes has good consequences. (Since mine is not a fundamentalist creed, I’m allowed to believe and say such things.) The general theory of the second best implies that if someone has more than one false belief, correcting any one of them might take his actions further from optimal rather than closer to it. The same applies to imperfect self-control; the discussion in Book I of the Republic about whether justice requires giving a madman his weapons when he asks for them applies to ideas and facts as well as to swords and spears.

At a technical level, truth can also serve injustice, and false belief may help restrain it; I’m glad no one suggested that Heisenberg recheck his calculations about graphite as a moderator.

Morally, superstition may outperform skepticism, at least in the short run; ancestral belief systems are powerful personal resources and social binding forces, and weakening them can lead to disaster. Both Alcibiades and Critias, who between them wrecked Periclean Athens, were students of Socrates. Atheism can lead to nihilism: “If God is dead, everything is permitted.”

My attitude toward truth might fairly be described as piety, but it’s a polytheistic sort of piety; justice, beauty, and eudaimonia (human flourishing) also compete for my devotional attention. As Socrates pointed out to Euthyphro, any given action may be pleasing to some of the gods and hateful to others.

Note that this whole discussion assumes that truth is objective, standing as a regulative principle to which any given belief conforms more or less accurately. It does not assume that any of us can know the truth, now or in the future; I’m with Popper (and with Plato as I was taught to read him) in thinking that the best we can have is right opinion: belief corresponding to truth, but without certainty.

In one way, a truth-seeking fallibilist, believing that the truth exists but doubting that he knows it, is in the middle between a dogmatist who believes that he knows the truth and a post-modernist multi-culturalist who believes (but how could he be sure?) that there is no standard on which to judge among competing socially constructed narratives.

In another way, though, both multiculturalism and dogmatism wind up together in affirming the value of inherited shared belief, in opposition to the fallibilist’s corrosive cosmopolitan skepticism.

Fallibilism generates humility. (Though some of us are especially skilled at keeping that humility hidden.) I’m sure that some of what I believe is false; or as Socrates put it, ironically, “All I know is that I know nothing” (i.e., that there is no certain knowledge).

That implies for a fallibilist a radically unusual attitude toward those with whom he disagrees: where a dogmatist wants only to convert, and a multiculturalist to appreciate “the other” without being threatened by it, a fallibilist wants (in principle) both to teach and to learn. Since truth is out there as a regulative principle, if you and I disagree one of us, at least, must be wrong. One of us might be right (at least partially or approximately) where the other is in error, or we might both be wrong, or we might turn out to have complementary pieces of the truth, but truth isn’t merely a matter of taste, and false opinions are treyfe (opp. of “kosher”: unclean) so something needs to change. That implies that I ought to seek out and engage those whose views differ from mine, and do so with as much of what the yogis call “non-attachment” to my current views as I can muster.

But though dialogue is useful, there’s no assurance that it will lead to agreement, or even to second-order agreement (agreement on what it is we disagree about). And the process will be, in some ways, less comfortable for dogmatists (almost always the majority) than for fallibilists. In the meantime, we need to live together and transact public and private business. We need, in short, to be tolerant.

Tolerance as a virtue, and toleration as a policy, require a certain “bracketing” of one’s own beliefs in dealing with, and thinking about, those whose beliefs are different. (“Given his beliefs, that’s a reasonable thing for him to want to do, so …”) That seems especially important with respect to religion, where so many mutually inconsistent beliefs are held with so much fervor and so little evidence. (“Faith is the evidence of things not seen.”)

The question is how far to take matters – how far to extend the pale of toleration – both as to practice and as to teaching likely to lead to practice. Ritual murder is clearly too far; I don’t have to be sure, in a metaphysical sense, that it isn’t divinely commanded to be sure at a pragmatic level that I want to stop it: by persuasion if possible, but stop it in any case, even if that means using violence or suppressing the ideas that lead people to want to kill witches, or Jews, or homosexuals.

But how about beliefs that injure only those who hold them? Or only them and their children? Not so obvious.

When tolerance first came to be recognized as a virtue, its boundaries were drawn rather tightly. It was acknowledged that some sectarian differences involved matters of non-essential opinion or practice. About these there might be toleration as a policy and tolerance as a social convention, but differences on crucial points were still taken seriously: Maryland still calls itself “The Free State” to commemorate its seventeenth-century Statute on Religious Liberty, but the statute itself provides the penalty of death for anyone who denies the Trinity. While what was essential (”necessary to salvation”) was itself a controverted matter, the principle of tolerance only with respect to inessentials was a coherent one.

But in the reaction to the Wars of Religion that followed the Reformation, and under the influence of the Enlightenment, polite, respectable people adopted the convention that criticizing another person’s (false and, literally, God-damned) beliefs was “not done”: doing so was as vulgar as criticizing his (ugly and ill-mannered) children. Tolerance and toleration were accepted as a kind of arms-control agreement among warring faiths.

[Most of the faithful, at least the faithful of the Abrahamic religions (but also of Greek paganism during its period of dominance) have been willing to relax this rule of tolerance when it comes to those of no “faith” at all. “Atheist” has been a deadly insult at least since Meletus hurled it at Socrates. Senator Lieberman’s assertion that no atheist could be an ethically good person would have been politically fatal if applied to the holders of any other religious position short of outright demon-worship. The view of the faithful seems to be that atheists are either simply rebellious subjects of the Divine Majesty — whose existence and rightful claim to rule they know in their hearts even as they deny it with their tongues – or, if sincere, likely to be nihilists and therefore socially dangerous. Lieberman’s libel was hardly original; Locke said as much in his Letter on Toleration.]

The growth of what might be called “strong tolerance” – tolerance extending to essentials – probably reflected, and almost certainly fostered, a weakening of dogmatic faith. Cognitive dissonance theory predicts that not acting on a belief will tend to erode that belief.

So how is a fallibilist truth-seeker supposed to deal with dogmatic religious belief, as an individual and a citizen?

One version of liberal thought, associated with Rawls and Dworkin, holds that actions taken by public authority should be impartial as among alternative conceptions of the good life. This seems to imply that politics can be separated from metaphysics.

I doubt it. How can a custody battle be decided according to “the best interests of the child” without a theory about what those interests are? And if the decision is made considering the child’s physical health but without reference to his chances of eternal bliss, how is that “neutral” between those who think they know what leads to such bliss and those who doubt?

In a liberal republic, the citizens rule themselves individually and one another politically. That regime thus benefits if the citizens hold opinions corresponding to the truth, and hold them critically rather than dogmatically. (With the likely exception of the belief that the liberal republic is the best regime, which probably needs to be taught dogmatically.) That’s one reason free governments foster (secular) education and scientific research.

But neither education nor science is metaphysically neutral. Science, in particular, however dogmatic in practice, is fallibilist at its core. And a school that doesn’t teach students to believe any revealed religion in particular effectively teaches them not to believe in any revealed religion at all. (Not much more effectively, one might infer from Americans’ astonishing religiosity, than our public schools teach anything else, but it’s the thought that counts.) There’s more than a trace of truth in the Christian Right’s tired old complaint that public education involves indoctrination in “secular humanism”: silence on religion is a loud silence.

There is also a strong case to be made, even from an unbelieving viewpoint, that silence on religion is appalling pedagogy. Religion, after all, is an important aspect of human life and history. How are students supposed to understand Martin Luther King without knowing about Martin Luther?

But, that said, what exactly should we be teaching about religion in the public schools? The set of reasonably correct statements that won’t seriously offend any substantial group is approximately the empty set.

(Take Luther, as an example. What could be said about him – e.g., about his excommunication, which still stands – that wouldn’t offend either Catholics or Protestants? An accurate translation of either side of his controversy with More would be too scatological to get past the censors. And how about Luther’s anti-Semitism? Or if Luther is too tough, how about Joseph Smith? Should we, or should we not, teach schoolchildren, Mormon and non-Mormon alike, that he was a career con artist?)

The empty set is, in consequence, approximately what we teach. The alternative would be school-board elections fought along sectarian lines, and the systematic inculcation in classrooms of demonstrable falsehood. And what hope do we have of finding schoolteachers competent to teach something as subtle as comparative religion, considering the hash they made of the “new math”?

So the current policy may be the least bad practicable approach, but let’s not pretend it’s somehow “neutral.” It favors my creed over the faiths held by the majority, and it requires schoolteachers with strong beliefs in one or another of those faiths to keep a central aspect of themselves out of their teaching.

Not in college, of course. We assume that college students are intellectually tough enough to confront strong, and conflicting, opinions among their instructors.

Don’t we?

This question is especially lively for me at the moment. Along with some colleagues, I’m trying to put together a conference, or series of conferences, on mystical experience: what it is, what occasions it, what its consequences are, and what to do about it in law and policy given the evidence that some of the controlled substances can be used as triggers for it. (It’s possible to believe that primary religious experience can be a personal and social blessing without believing that the experiences involve actual encounters with anything outside the minds of those who undergo them.)

I sent a sketch of the conference to a friend who is a tenured public policy professor at a first-rank university and, as he has told me, a committed evangelical Christian. He sent it back with a series of very good questions, based on his personal religious experiences. I wanted to share those thoughts with the organizing group, and asked his permission to do so.

His reply rocked me back on my heels. He was willing to have his thoughts spread around, but not with his name attached. Being known as a believer would create too great a risk to his academic career.

Now this is someone I’ve known well for more than a decade. While I’d heard him mention how out-of-place his beliefs made him feel in academia, it just hadn’t gotten through to me that he was actually “in the closet” about his religion. In answer to my shocked inquiry, he has informed me that there is in fact a kind of “evangelical underground” (my phrase, not his) in academia, and perhaps more broadly in business and the professions.

All of which reminds me, yet again, how skin-deep much of liberalism and multiculturalism actually turns out to be. It also leaves me at a loss about what to do.

On the one hand, the topic ought to be brought out into the open for discussion. Religious bigotry is loathsome in any circumstance, but it’s especially intolerable in a university. In the name of all the gods (existent or not), what are we afraid of?

On the other, I’d hate to contribute to public dislike of either academics or atheists. (The latest Pew survey shows that about two-thirds of those who express an opinion “disapprove” of atheists as a group, the only religious description that draws majority hostility.)

One thing I’m sure of: dogmatic religion may be wrong, even foolish; I’m pretty sure most of it is. But dogmatic atheism doesn’t even have the excuse of a supposed divine commandment to excuse it. As for dogmatic fallibilism, its refutation is left as an exercise for the student.

Here endeth the sermon. Say “Amen,” somebody.

Author: Mark Kleiman

Professor of Public Policy at the NYU Marron Institute for Urban Management and editor of the Journal of Drug Policy Analysis. Teaches the methods of policy analysis as applied to drug abuse control and crime control policy, working out the implications of two principles: that swift and certain sanctions don’t have to be severe to be effective, and that well-designed threats usually don’t have to be carried out. Books: Drugs and Drug Policy: What Everyone Needs to Know (with Jonathan Caulkins and Angela Hawken); When Brute Force Fails: How to Have Less Crime and Less Punishment (Princeton, 2009; named one of the “books of the year” by The Economist); Against Excess: Drug Policy for Results (Basic, 1993); Marijuana: Costs of Abuse, Costs of Control (Greenwood, 1989).