The Genetics Profession Confronts its Troublesome Inheritance

On August 8, a remarkable letter appeared in the New York Times Sunday Book Review. Written by a group of five leading evolutionary geneticists and signed by another 135, it repudiated the main conclusions of Nicholas Wade’s book A Troublesome Inheritance. Wade was for many years the main science reporter for the New York Times covering developments in genetics and biology. His book purported to summarize the main findings of the research he had been covering: that the European, African, and Asian races are genetically defined and that they have faced different evolutionary pressures that have given them what he claimed are different intellectual, behavioral, and civilizational capacities.

The book has been widely reviewed and, apart from a glowing endorsement from conservative policy writer Charles Murray, has received largely negative assessments. Wade’s main response has been that the commentators lack the stature and expertise to criticize his ideas. Thus, when the 135 scientists, many of whom Wade cites as his own authorities, blasted his argument as “incomplete and inaccurate,” with “no support from the field of population genetics,” his thesis was dealt a mortal blow.

But to understand what makes the move of these geneticists so remarkable, you need some history and sociology of claims that genetic science explains racial differences in intellect and behavior.

In 1969, educational psychologist Arthur Jensen used ideas from the emerging field of behavior genetics in an article claiming that the IQ and educational achievement gaps between black and white children were due in large part to genetic differences between the races, and that educational efforts to close the gap must therefore fail. This was the era of intense conflicts over civil rights and President Johnson’s Great Society. Jensen’s writings sparked student protests and heated academic debates. Not surprisingly, many education scholars, social scientists, and psychologists denounced Jensen’s work, but so too did many geneticists. In 1975, 1,390 members of the Genetics Society of America co-signed a statement that said “there is no convincing evidence as to whether there is or is not an appreciable genetic difference in intelligence between races,” and over nine hundred had signed a stronger repudiation of Jensen’s work.

The IQ and race controversy was traumatic for researchers interested in genes and behavior. As debates raged about the science, politics, and ethics of the research, the field fragmented into mutually distrusting groups, and many geneticists abandoned behavior as a topic altogether.

A quarter century later, psychologist Richard Herrnstein and Charles Murray published The Bell Curve, an 845-page doorstop that made a very similar argument to Jensen’s: the lack of success of Latinos and African Americans relative to whites and Asians has a strong genetic basis. American inequality, they argued, is mostly genetic. This time, the response was very different. Social scientists and liberal pundits decried the work, criticizing the science and linking it to the history of scientific racism. However, biologists and geneticists largely ignored the debate. Those who tried to intervene, like Stephen Jay Gould, were often perceived as politically, rather than scientifically, motivated. Geneticist David Botstein explained his peers’ silence: The Bell Curve “is so stupid that it is not rebuttable.” Members of the Human Genome Project’s (HGP) Ethical, Legal, and Social Issues (ELSI) division hoped to organize project leadership to publicly distance genetics from the book’s racial ideas. It took two years for an ELSI statement to be allowed to appear in a specialist genetics journal, and HGP leadership remained publicly quiet. Soon thereafter ELSI was reorganized and its public activism discouraged.

We tend to think of a scientist’s public responsibility as a matter of individual commitment. But it has much to do with the structure and culture of scientific communities. The IQ controversy of the 1970s had spurred changes that drove geneticists’ disengaged approach to The Bell Curve in the 1990s. Conflicts fragmented the research community, so geneticists rarely interacted with behavioral scientists and weren’t comfortable engaging their claims critically. Mistrust made it impossible to see public criticism as legitimately scientific rather than purely political. And the outsourcing of ethics to ELSI made it difficult for many geneticists to see the public interpretation of scientific controversies as their business.

The genetic evidence for racial behavioral differences hasn’t changed in the 45 years since Jensen wrote, but geneticists’ public responses have. The recent collective response to Wade’s book is heartening because it indicates that geneticists are coming to see that a new approach to the public interpretation of their science is needed. Because genetics aims to tell us about human similarities and differences, capacities and potential for change, there will always be a public politics to the field. The difficult work of publicly interpreting contentious issues cannot be left to social scientists and ethicists (whose genetics credentials will be questioned) or to individual geneticists (whose motivations will be questioned). This group will take heat for its stand, but its members cannot be doubted as scientists or marginalized as individuals. They will learn, I believe, that being political in this way—soundly criticizing public misappropriations of their research—can only be good for the long-term legitimacy of genetics.

Aaron Panofsky is Associate Professor in Public Policy and the Institute for Society and Genetics at UCLA. His recent book, Misbehaving Science, considers the scientific and political controversies surrounding behavior genetics.

Letter to the Washington Post’s Public Editor Regarding Poor Science Reporting

I complained to the Washington Post about their poor science reporting regarding addiction, but got no answer

I emailed this to the Washington Post’s Public Editor five weeks ago and was disappointed not to receive even a perfunctory answer.

Dear Mr. Feaver,

As a professor of psychiatry and addiction treatment researcher at Stanford University, I was very disappointed that the Washington Post was among the news outlets that reported almost verbatim from a press release the claim that Oreo cookies are as addictive as cocaine.

As I described on our university’s medical school blog, the “proof” for this assertion was an undergraduate research project at Connecticut College which has not been published, peer-reviewed, or indeed even presented in any public forum.

Yet this Post story by Valerie Strauss took a stenographic approach, passing along the claims of the press release almost word for word. Indeed, her story even reproduced the photo and large block quotes from the release. Ms. Strauss barely added any words of her own to her article, and certainly none that conveyed appropriate skepticism.

It is to the Post’s credit that a few days later Stephanie Pappas wrote a critical column about the study, quoting an expert who pointed out the fatal flaws in the research. However, it is worth noting that while her article opened with a dig at the “blared headlines” about the study at Fox News and Time, it did not own up to the fact that The Washington Post itself was among those media outlets which uncritically passed along the sensational and untrue statements in the press release.

I understand the pressure to publish, and to do so quickly. But I would like to see leading newspapers such as yours implement policies to ensure that speed does not trump accuracy. Such policies could include allowing only journalists with a relevant science background to write science stories, or requiring reporters to talk to at least one critical expert before passing along a press release as fact. In an era when every month brings press releases claiming that new studies show that climate change is a hoax, or that the MMR vaccine causes autism, I am not the only person who counts on your great paper to separate the wheat from the chaff.

Keith Humphreys

Driving and cell phones

I was listening to WEEI’s broadcast of the ball game in the car this evening (the game in which Boston clinched a playoff slot, woo hoo). It’s a wonderful world in which I can set my smartphone – better called a pocket computer – to pick up a Boston radio station from the web, a continent away, and plug it into the Aux input of the car radio.
The announcers interviewed a honcha of AT&T’s New England operation during a pitching change, about a PR program AT&T is putting on to discourage texting and driving. Good for AT&T, but she made a serious mistake, plugging a speech-to-text/text-to-speech technology for texting. Do. Not. Do. This. And do not use a hands-free device to talk on the phone in the car; it’s just as dangerous as holding the phone up to your ear and talking. Think about all the great RBC posts you will miss if you are dead. David Strayer at the University of Utah runs the go-to lab on this question, and in their latest very interesting paper we find:

Taken together, the data demonstrate that conversing on a cell phone impaired driving performance and that the distracting effects of cell-phone conversations were equivalent for hand-held and hands-free devices….Finally, the accident data indicated that there were significantly [p<.05] more accidents when participants were conversing on a cell phone than in the single-task baseline or alcohol [BAL = .08] conditions.

Why this is true is not intuitively obvious, and I think even Strayer’s group, who recognize that this is a cognitive problem and not an issue of visual distraction by the keyboard, doesn’t quite get it: the psychology through which the cell phone impairs driving is not only the driver’s but also the conversational partner’s, and what the driver intuits about the latter.
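The “significantly [p<.05]” in the quoted passage refers to a standard null-hypothesis test on accident counts. As a minimal sketch of what such a comparison looks like, here is a two-proportion z-test on entirely made-up crash counts (hypothetical numbers for illustration, not Strayer’s data):

```python
import math

def two_proportion_z(crashes_a: int, n_a: int, crashes_b: int, n_b: int) -> float:
    """Two-proportion z-statistic: do groups A and B crash at different rates?"""
    p_a, p_b = crashes_a / n_a, crashes_b / n_b
    p_pool = (crashes_a + crashes_b) / (n_a + n_b)   # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 10 of 40 phone-condition drivers crash vs. 2 of 40 baseline drivers.
z = two_proportion_z(10, 40, 2, 40)
# A two-sided p < .05 corresponds to |z| > 1.96 under the normal approximation.
print(abs(z) > 1.96)
```

Real studies like Strayer’s use within-subject designs and more careful statistics; the point here is only what the bracketed p-value is asserting.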

True Ghost Stories

Life is full of ghost stories that turn out to be true

As a child, I loved ghost stories. I still do, but I particularly appreciate those that turn out to be explicable. One of my favorites is the story of the Fire Dog of Asu, related in George Plimpton’s outstanding edited collection of narratives by members of the Explorers Club.

The story of the Fire Dog comes from an exploration team that investigated a legend among some Pacific Islanders of a terrifying, bloodthirsty, gigantic dog that roamed the jungle and glowed in the dark. Shortly before the team’s arrival on the island, a man died of a heart attack, apparently from terror caused by the fire dog. (Yes, this all sounds eerily like The Hound of the Baskervilles, but the Fire Dog of Asu is a non-fiction story.)

The team of explorers investigates the part of the jungle where the beast is rumored to live. As night is falling they arrive at a beach. Here they find a cave in which they hear the snuffling and growling of a large animal. They feel paralyzed with fear as a glowing beast emerges from the gloom, charging straight at them.

More Claptrap on Science on the NYT webpage

The NYT has done it again — posted more claptrap on science. But this time it’s by a respected philosopher, Thomas Nagel. Nagel’s post is a CliffsNotes version of his book published last year, Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature is Almost Certainly False.

The book title alone provides a good indication that Nagel should be ignored on these matters, since any scientific theory is likely to be “false” in the old-fashioned philosophical sense of an exact description of nature – as Newtonian mechanics is “false” because it does not comprehend relativistic or quantum interactions. So we need to ask what it is about our academic institutions and intellectual cultures that allows tenured faculty at NYU (at least a second-tier university) and Oxford University Press (a premier publishing house that publishes academic and quasi-academic books) to advance misleading nonsense that proceeds in ignorance of how other professors in nearby offices do their work.

Let’s try to go through Nagel’s argument and see what it relies on and what it misses.

First he builds a straw man: that physics aspires to be a “theory of everything.” Leave aside the silly grammar in which a field of study is anthropomorphically given aspirations. When physicists talked about a “theory of everything” they didn’t mean a theory that comprehends such things as consciousness, morality, aesthetics, free will, or even the stock market — they meant, to use informal terms, a theory that would unify gravitation with the other fundamental interactions: electromagnetism and the weak and strong forces at work within atomic nuclei. This was an ambition to unite the world of physics, not to use physics to subsume all other sciences.

So let’s not beat up on physics.

When we get to neuroscience and psychology, there is a hard question about the relation of the biochemistry and connective structures of the brain to conscious life – part of the conundrum is about subjective experience, and another part is about agency and free will. Neuroscientists, psychologists, philosophers, and humanists struggle with these issues. Granted, there is a lot of nonsense in these fields, but there is a lot of serious investigation too, involving both theoretical constructs and experimentation of various kinds — in other words, science.

Nagel’s response is to wave his wand and act as if none of this science exists. He argues that if physics cannot explain subjective experience, then we need wholly new theories “of a different type from any we have seen so far.” But we have lots of scientific theories that have no direct contact with physics, and many of them relate to understanding complexes of human behavior. Nagel acts as if he has never met an economist or an information theorist or a computer scientist or a social psychologist or an ecologist or even a logician (obviously impossible for a modern philosopher) — but these people routinely deploy theories that are different in character from those of physics, and many of them deal with systems that behave teleologically. Teleology turns out to be the wholly different element that Nagel says needs to be melded into natural science.

Nagel wants to declare “mind” a fundamental part of “nature.” Certainly one would have a hard time explaining the post-1900 evolution of the Earth’s environment without reference to mind, so mind is clearly important now; but that does not mean it is a fundamental part of the natural order everywhere. Nagel seems to believe that mind cannot spring up from nothing, and so it can’t have arisen by evolution. Never mind that this is formally equivalent to saying that we need a fire element because you can’t create fire from nothing. He wants morality and reason to exist outside of history and evolutionary contingency because he can’t otherwise seem to vanquish the bugaboo of relativism. (This is spelled out in detail, if speciously, in the book.) So his response is to insist on somehow mentalizing nature itself, in some way yet to be determined — maybe as the aether was once thought necessary to conduct light.

I suppose we should not foreclose this possibility — but what sort of theory would it be, and how would it be testable? More tellingly, no such theory is necessary to make progress. In fact, Nagel considers and rejects the primary frame within which active scientists are making progress on these issues — the notion of “emergent properties.” So far as I can tell, Nagel’s rejection is purely aesthetic — he doesn’t think you can create something just by increasing the complexity of interactions and changing the level of analysis. Similarly, his rejection of the evolutionary emergence of reason is also primarily aesthetic — he fears that recognizing that reason and morality arose historically and contingently undercuts their legitimacy by making them appear more unreliable. In my view this recognition engenders a more critical stance that should open up the possibility of making them more reliable, but I wouldn’t use this personal judgment as a way to sniff out truth and falsehood.

It’s entirely clear that one can fully resist Nagel’s conclusion on the need to mentalize nature without resorting to any of his supposedly exhaustive fourfold options for resistance. You don’t need to mentalize physical nature to recognize the power of thought once mind comes into being — especially social mind, backed up by culture and language. I don’t mean to minimize questions about, for example, whether you could have a different logic and where logic comes from, and I am also not going to foreclose completely the possibility that one day a scientific theory might somehow look like what Nagel is proposing now. But today that is mere speculation.

It’s completely clear that Nagel has not made anywhere near the case he thinks he has.  There is lots of room for improved understanding of the nature of mind and consciousness in ways that are completely consistent with materialist physics and neo-Darwinism, with the addition of complex systems understanding.

A NYU professor who pronounces science’s conception of reality to be false without engaging with any current science should be ashamed of himself.    Oxford University Press should not have published this book.   The fact that Nagel is respected and picked up in the New York Times is a symptom of our fragmented and fundamentally un-serious intellectual culture.

If the universe had any sensible teleology, or nature were infused with Mind, we would no doubt be served much better than this.

Update:  A comment notes that, according to one apparently reputable ranking, the NYU Philosophy Department is the best in the English speaking world, which just makes me shake my head more.



Belief

Perhaps the most unenlightening collection of pop epistemology, or theolosomethingorother, ever published somehow commanded space in the NYT this week, beginning with totally wooly noodling by a self-proclaimed creationist that got six more people to outgas on it. The operative question is framed as “believing in” science (and its big-bang, old-universe, evolutionary branches) versus a literal interpretation of Genesis.

What, I wonder, is the operational definition of belief in a debate of this kind? As a devout Bayesian, and allowing for all the tricky heuristics and biases of rational process, I give “how you bet” priority over “what you want to be heard saying.” The problem is that we practically never have to bet on science or the Bible. Where in daily life does anyone get to act in a way that will work out much worse, or much better, if science is right and the Bible wrong on this stuff? Even Christian Scientists’ health statistics are about the same as everyone else’s, because we all have the same plumbing keeping the sewage away from the drinking water, the same FDA keeping bad stuff out of the food, the same EPA keeping poison out of the air, etc. (On the other hand, we do not see even a few Christian Scientists leaping off tall buildings on the proposition that the physical world is not real…)
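The “how you bet” standard can be made operational. Here is a minimal sketch of belief as betting behavior in the Bayesian spirit of the paragraph above; the probabilities and payoffs are hypothetical illustrations, not anything from the post:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

def fair_to_bet(credence: float, payout_if_true: float, stake: float) -> bool:
    """Accept a wager iff its expected value is positive at your credence."""
    return credence * payout_if_true - (1.0 - credence) * stake > 0

# Start agnostic (50/50); observe evidence 9x likelier if the hypothesis is true.
p = posterior(0.5, 9.0)   # credence rises to 0.9
# A bet risking 50 to win 10 is acceptable only at high credence:
# EV at p = 0.9 is 0.9*10 - 0.1*50 = +4, so the bet reveals genuine belief.
print(fair_to_bet(p, payout_if_true=10.0, stake=50.0))
```

The point of the paragraph is that daily life rarely offers such wagers on cosmology or Genesis, so professed “beliefs” there are never cashed out.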

Ayn Rand and the Blind Monkey Theorem

Ayn Rand’s attack on C.S. Lewis’s obscurantist distrust of science was vulgar. But it wasn’t wrong.

Even a blind monkey, it is said, finds a banana every once in a while.

I was reminded of that bit of wisdom some time ago when Left Blogistan was enjoying itself celebrating the nasty marginalia Ayn Rand wrote in her copy of C.S. Lewis’s The Abolition of Man. The joke, of course, is that both have become idols of the not-too-bright elements of the American Right, despite the fact that they agreed on roughly nothing.

In general, Rand deserves her followers, while Lewis emphatically does not deserve his. (I can just imagine Lewis’s reaction had he lived to see Ollie North (!) living in a mansion called “Narnia.”) Lewis was a superb writer of persuasive prose (I’d put him in the Orwell class) and, on average, a far clearer and more original thinker than Rand, whose “philosophy” is mostly Nietzsche-and-water. You don’t have to be a Christian to admire the brilliance of Screwtape, or its insight into some aspects of moral psychology and of bureaucratic life.

I live in the Managerial Age, in a world of “Admin.” The greatest evil is not now done in those sordid “dens of crime” that Dickens loved to paint. It is not done even in concentration camps and labour camps. In those we see its final result. But it is conceived and ordered (moved, seconded, carried, and minuted) in clean, carpeted, warmed and well-lighted offices, by quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voices.

Still, Lewis’s version of Christianity – and perhaps even more, his Aristotelianism – involved him in deep, deep hostility to science. You can see that in the Out of the Silent Planet/Perelandra/That Hideous Strength trilogy, where a character into whose mouth Lewis puts the words of J.B.S. Haldane is the leader of a (literally) diabolical conspiracy, with another character clearly based on H.G. Wells as its pompous, clueless front man.

Partly this is just an echo of Berkeley making fun of Newton as a way of getting back at science for proving that the actual world isn’t consistent with what had long been Christian doctrine; partly it’s an expression of the resentment of literary intellectuals toward the prestige enjoyed by scientists, as described by C.P. Snow in The Two Cultures.

But at a deeper level, it has to do with two different approaches to dealing with suffering: the religious view that accepts it as the Divine will and invites sufferers to turn their misery to spiritual benefit and the scientific/technological view that asks how knowledge can be harnessed to the task of reducing the volume of suffering in the world. It would be too harsh to say that Lewis would prefer prayer to medicine as a way of addressing the problem of disease, but “too harsh” is not the same as “inaccurate.” After all, a life saved by medicine – unlike a soul saved by prayer – is not saved for eternity.

More fundamentally still, there is an almost ineradicable tension between the stance that seeks truth in the traditions of the past and the stance that seeks it in new inquiry, which Lewis exemplifies by “digging up and mutilating the dead.” (See Popper’s “Towards a Rational Theory of Tradition” for an attempt to reconcile traditionalism with critical thought.) Lewis’s preference for manuscripts over laboratories came from the same roots as his commitment to revealed religion.

Below are some of the passages on which Rand commented rudely. Her comments aren’t worth paying attention to, but the passages themselves say much ruder things about Lewis than Rand could ever have managed to say.

I am considering what the thing called ‘Man’s power over Nature’ must always and essentially be. No doubt, the picture could be modified by public ownership of raw materials and factories and public control of scien­tific research. But unless we have a world state this will still mean the power of one nation over others. And even within the world state or the nation it will mean (in principle) the power of majorities over minorities, and (in the concrete) of a government over the people


There neither is nor can be any simple increase of power on Man’s side. Each new power won by man is a power over man as well. Each advance leaves him weaker as well as stronger. In every victory, besides being the general who triumphs, he is also the prisoner who fol­lows the triumphal car.



There is something which unites magic and applied science while separating both from the wisdom of earlier ages. For the wise men of old the cardinal problem had been how to conform the soul to reality, and the solution had been knowledge, self-discipline, and virtue. For magic and applied science alike the problem is how to subdue reality to the wishes of men: the solution is a technique; and both, in the prac­tice of this technique, are ready to do things hitherto regarded as disgusting and impious – such as digging up and mutilating the dead.


If we compare the chief trumpeter of the new era (Bacon) with Marlowe’s Faustus, the similarity is striking. You will read in some critics that Faustus has a thirst for knowledge. In reality, he hardly mentions it. It is not truth he wants from the devils, but gold and guns and girls. ‘All things that move between the quiet poles shall be at his command’ and ‘a sound magician is a mighty god’ In the same spirit Bacon condemns those who value knowledge as an end in itself: this, for him, is to use as a mistress for pleasure what ought to be a spouse for fruit. The true object is to extend Man’s power to the performance of all things pos­sible. He rejects magic because it does not work; but his goal is that of the magician.


The serious magical endeavour and the serious scien­tific endeavour are twins: one was sickly and died, the other strong and throve. But they were twins. They were born of the same impulse.

Note the elision from Faust’s desire for personal power to Bacon’s desire to create knowledge that would be useful to humankind. As to goals, what Lewis says is half-true: Jenner didn’t especially want to understand smallpox; he just wanted to prevent it. (Galileo and Newton, whom Lewis doesn’t mention, were in a different business.) But if Lewis believed that preventing smallpox was a good thing, he somehow neglected to say so.

My purpose here is not to condemn Lewis; I have learned much and had great pleasure from reading his books. A non-Christian who wants to grok what Christianity is about could do much worse than Mere Christianity plus The Lion, the Witch, and the Wardrobe. But Lewis’s anti-scientific and anti-technological bias comes as part of the package, and Rand wasn’t wrong to call him out on it.

Simple problem, simple solution

What do you do when a researcher reports that someone is cooking the data? Fire the complainer and move on.

When a researcher at a heavily-funded biomedical research lab reports that his team-mates are cooking the data, the solution is straightforward: fire the whistleblower, and keep moving.

Of course I have no competence to judge whether Daniel Yuan is right, and I have only limited confidence in mass-media reporters to grasp what’s going on. (Why not ask some people in the field to review the paper and Yuan’s criticism and say whether he’s on to something?) But the absence of a quote from someone senior at Hopkins saying “We’ve checked this over, and Yuan had it wrong” seems telling.

All money corrupts, and big money corrupts big-time. In the Middle Ages we had corrupt Church officials; today we have dodgy scientists. And the business model of the grant-funded parts of universities – and most of all of the medical schools – means that losing your funding means becoming a former scientist. There couldn’t possibly be more pressure to come up with something publishable, whether it’s accurate or not.

It seems to me that every university needs a research ombudsman, to whom a researcher with concerns about integrity can go and get an arm’s-length adjudication of his claims, with protection from retaliatory job action. If I were running NIH, I might want to make that mandatory for the top 100 grant recipients.

Guns, Lysenko, and Ezra Klein

What does Stalinist biology have in common with gun-nuttery?

Ezra Klein has been filling in on Larry O’Donnell’s MS-NBC show The Last Word, and last night we talked about guns.

Ezra’s lead-in was in many ways more interesting than my segment. I didn’t say anything RBC readers don’t already know; after all, I’m not a real gun expert (like Phil Cook or Jens Ludwig or John Donohue or Rick Rosenfeld or David Kennedy or Susan Ginsburg); I just play one on TV. But Ezra proved again the silliness of the false equivalence between MS-NBC and Fox; he devoted most of the segment to demolishing the case for bringing back the Assault Weapons Ban (as opposed to the limitation on high-capacity magazines).

Putting aside the merits of the question, just imagine a Fox News host carefully explaining why a key Republican talking point is fundamentally bogus.

At the very end, we talked about the President’s actions to limit the scope of the Congressional ban on gun-violence research, and I called that limitation “an anti-Lysenkoist measure,” thus dating myself.

Everything old is new again; if you think that having politicians decide scientific questions, and persecute scientists who follow the data rather than the Party line, is obsolete, you’ve clearly never met a contemporary Republican politician (VA Attorney General and Gubernatorial candidate Ken Cuccinelli, for example).

So while I’m glad to have added a word to Ezra’s already-prodigious vocabulary, I’m sorry that “Lysenkoism” is of contemporary, rather than merely antiquarian, interest. But since it is, we should all be aware of it, and its insidious effects.

Footnote Kevin Drum’s piece on lead and crime reminds me that the first researcher to show a link at the individual level – Herb Needleman of the University of Pittsburgh – almost lost his job to a furious assault, featuring trumped-up accusations of scientific fraud brought by people paid as expert witnesses by the smelter industry.