Eating the Seed Corn in the Health Research World

We will be sorry for decades to come if we don’t immediately start expanding the NIH budget to support young scientists

Michael White’s essay on the dismal state of the National Institutes of Health is a must-read for anyone who cares about progress in medical and public health research. The positive impact of the 1998 bipartisan initiative in Congress to double the NIH budget has been completely wiped out.


Of all the bad news in White’s article, this is particularly discouraging:

The tighter competition for funding has put the squeeze on younger scientists with fledgling labs; the proportion of young scientists with NIH grants is half of what it was in 1998, while the proportion of funded scientists over 65 has doubled. Because scientific training typically takes over 10 years, students who decided to enter graduate school in the boom days of the mid-Aughts are now entering a job market that looks nothing like what they expected.

On the ground in my daily work in both a university medical school and a public hospital, it’s a rare month that some bright young person doesn’t tell me they are quitting science because it’s too hard to get funded. These are usually not reversible decisions. Even a well-trained young physician who leaves research for 5 years to treat patients full-time is very hard to tempt back into science if the funding picture improves (and is even harder to bring back up to speed on the cutting-edge scientific questions and methods of the day).

There is no question that we have some enormously talented and productive senior citizen scientists. But a decade or two from now, when an antibiotic-resistant bacterium or a new strain of bird flu is ravaging the planet, that generation will no longer be around to lead the scientific charge on humanity’s behalf. That’s why we constantly need a new stream of young people committing to health science careers. That seed corn is currently being consumed at an alarming rate, and if we don’t act immediately to rectify the situation we will suffer for many years to come from the loss of a generation of health researchers.

Author: Keith Humphreys

Keith Humphreys is the Esther Ting Memorial Professor of Psychiatry at Stanford University and an Honorary Professor of Psychiatry at King’s College London. His research, teaching and writing have focused on addictive disorders, self-help organizations (e.g., breast cancer support groups, Alcoholics Anonymous), evaluation research methods, and public policy related to health care, mental illness, veterans, drugs, crime and correctional systems. Professor Humphreys’ more than 300 scholarly articles, monographs and books have been cited over thirteen thousand times by scientific colleagues. He is a regular contributor to the Washington Post and has also written for the New York Times, Wall Street Journal, Washington Monthly, San Francisco Chronicle, The Guardian (UK), The Telegraph (UK), Times Higher Education (UK), Crossbow (UK) and other media outlets.

49 thoughts on “Eating the Seed Corn in the Health Research World”

  1. You don’t see a need to review the model, and not just the funding? KH: “That’s why we constantly need a new stream of young people committing to health science careers.” Contrast Paula Stephan, cited in White’s piece: “the research enterprise itself resembles a pyramid scheme” [at the expense of graduate students and postdocs].

    Another problem with the analysis is that you should not look at biomedical research in isolation. The entire budget of the NSF, covering dozens of other scientific fields, is only $7bn. Historically, health science research has been more generously funded in the US than most other non-nuclear fields. That does not excuse boom-and-bust funding, or underfunding of research generally, leading to zero-sum conflicts.

    1. You don’t see a need to review the model, and not just the funding?

      Not sure where you came up with that. What in the post led you to conclude I am against changing the model?

        1. I am sorry I don’t follow you. Clearly new people have to enter fields for them to survive, but that isn’t anyone’s subjective impression/debating point.

          1. If Stephan and Barry are right, the current model relies on inducing more young researchers to enter than there are long-term career slots. The triage is cruel, and additionally immoral to the extent it’s based on deception. At least aspiring young actors know their chances of making it are low. A sustainable research enterprise has to be demographically sustainable as well. If it’s true that the ideas come disproportionately from the young (but is there any evidence for this outside mathematics?), then a sustainable research system will be less productive than a pyramid scam. So for equal output, it needs to be larger.
            One of the costs of the pyramid model is that it turns older researchers into managers and fundraisers, always hustling for grant money for the postdocs to keep the lab afloat. Darwin spent approximately 0 time on such matters.

          2. It is of course wrong to be cruel and deceptive — we all know that, but perhaps it is sometimes worth stating the very obvious. I remain puzzled, though, that you posed it as if it were an objection to my post, as I did not endorse what you deride.

            That said, your analysis assumes that we are powerless to influence the number of long-term career slots and therefore we must have a cruel system. We are not and we do not. When there is stable funding, there are stable careers — I work in a center full of scientists who have been doing research for decades and who have launched many young people onto successful and socially valuable careers. It isn’t a law of nature that we can’t have more opportunities for talented young people to become scientists.

  2. It’s the NIH’s own fault they’re so far behind scientifically and with the research. They could have listened to us scientists and not gone ahead with, for instance, the twice-failed OspA Lyme vaccines as next-to-be-failed HIV vaccines. How dumb was that? It was known for years from experiments with fungal-antigen Tuberculosis vaccines that they didn’t work.

    1. If there is an argument to be made on behalf of your (inadequately explained) position, the deranged and incoherent contents of your link certainly don’t demonstrate it.

      1. Hey, Warren, it could have been worse! It could have said NAZI-Bonehead AmeriKKKa instead of NAZI-Bonehead America.

  3. “Because scientific training typically takes over 10 years, students who decided to enter graduate school in the boom days of the mid-Aughts are now entering a job market that looks nothing like what they expected.”

    Should the scientific training to do health research really require 10 years? Can it be reduced to 8, 7, or 6?

    Gradually, over the course of the 20th and 21st centuries, we’ve accreted more time to earn the credentials to do scientific research. James Watson was only 25 when DNA was discovered.

    1. Leaving aside the misphrased “DNA was discovered”: sure, but Rosalind Franklin, who generated the data, was over 30, and Francis Crick, who interpreted the data, was almost 40. So what was your point? Have you ever talked to a biologist about Watson? Even if he were a respected and successful researcher, what would one outlier demonstrate?

      Also: there is a distinction between “doing scientific research”, and being a trained and independent scientist. I’ve gotten great research out of undergrads, and seen other people get much, much better research out of undergrads and even high school students. This is part of doing science.

  4. I think the question is – why should we expect health spending (or any subset of the budget) to grow indefinitely at a rate of 3.3% ABOVE inflation?

    1. The trend for health spending to rise faster than GDP, and hence increase its share, is well established across the OECD. It holds however healthcare is financed, so it’s not just a matter of the government budget.

      Since research is one of the few avenues available to contain this rise in costs, it looks reasonable that it should grow at least in line with all health spending. It’s not currently very well targeted at cost reduction, though. I proposed here that it should be.

    2. Laboratory research faces expenses for materials and equipment that rise at a rate more comparable to medical inflation than economy-wide inflation. 3.3% is not an extravagant rate (it’s below medical inflation over the same period). Furthermore, the graph shown is misleading in its starting point: the slope is higher over the couple of decades before 1990 than from 1990-2000, and the sharp increase in the late 1990s and early 2000s mostly serves to catch up to where the line would have been were it not for that decade of nearly flat growth.

    3. “Indefinitely” is a troublesome word here. “Any subset of the budget” could be “spending on the Air Force”, for instance. Once upon a time, the fraction of the budget going to the Air Force was negligible. Now it’s substantial. That “subset of the budget” must have grown faster than inflation for some decades. Projecting its growth rate “indefinitely” during those decades would have been just as logical as projecting, today, that health care or medical research spending will grow “indefinitely”.

      Not every curve is an unbounded exponential increase. There is such a thing as an S-curve. An S-curve is what takes you from one level to another. The first half of an S-curve can easily look like an exponential, to people who are over-eager to make projections.

      The notion that the level of medical care or medical research must not be allowed to rise from their present share of either the federal budget or the GDP seems crazy to me. Why should we (individually or as a nation) refuse to shift our spending from “financial services”, or “entertainment”, or “defense”, or any other “subset of [our] budget”, to “health care” as we grow older?

      You can argue, if you like, that our current allocation of resources between “medicine” and “everything else” is optimal, and therefore must be kept constant “indefinitely”. Is that what you’re actually arguing for?
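      The S-curve point can be made concrete with a little arithmetic: over its early phase, a logistic (S-shaped) curve is nearly indistinguishable from a pure exponential, and only later do the two diverge. A minimal sketch (the parameter values here are illustrative, not fitted to any actual budget series):

```python
import math

def exponential(t, a, r):
    """Pure exponential growth: a * e^(r*t)."""
    return a * math.exp(r * t)

def logistic(t, K, r, t0):
    """Logistic (S-curve) with ceiling K: K / (1 + e^(-r*(t - t0)))."""
    return K / (1 + math.exp(-r * (t - t0)))

# Illustrative parameters: the logistic starts far below its ceiling K.
K, r, t0 = 100.0, 0.5, 20.0
a = logistic(0, K, r, t0)  # match the two curves at t = 0

# Early on the curves track each other closely; past the inflection
# point (t0) the logistic flattens while the exponential keeps climbing.
for t in [0, 5, 10, 25, 40]:
    print(t, round(exponential(t, a, r), 3), round(logistic(t, K, r, t0), 3))
```

      With these numbers, by t = 10 the two curves still differ by under one percent, while by t = 25 the exponential projection overshoots the logistic by more than a factor of ten — which is exactly the trap of extrapolating an early growth phase “indefinitely.”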


  5. “Because scientific training typically takes over 10 years, students who decided to enter graduate school in the boom days of the mid-Aughts are now entering a job market that looks nothing like what they expected.”

    This has been the story for about 40 years now.

    Keith, the system of science relies on people going through years of very hard work to get a Ph.D., more years of hard, migratory work (on what others want one to work on) as a post-doc, and then, for the overwhelming majority, burning out and leaving the system in their mid- to late 30s.

    I’m getting sick and tired of ‘fresh meat’ systems.

  6. I would like to point out that this general trend hit the NSF, EPA, DOE, and USDA before the NIH. It has nothing to do with lack of success; if it did, the DOD would have been following that trend well before sequestration. It has everything to do with the notion that government science (funded and/or performed) is by default inferior to private commercial development. If our elected officials felt that public support of research and development was important, I highly doubt we would be having this conversation. I have collaborated with researchers and participated in research from all these agencies, and it has all seemed to follow the general decline we are lamenting for the NIH, in terms of the available funding, the level of that funding, and the general areas where funding is flowing.

    As a country and a species we will regret this misguided choice.

    1. Sam,
      You are so right. Another big headline this week that feeds into the issue of starving basic research in the name of ideological economism was the news that American high school students are slipping further behind those they’ll be competing with for jobs in the future. We’ve been assured the private sector will meet all these needs, but for things like basic research and public education, there really are no alternatives unless we willingly choose to exclude whole swaths of basic research and large segments of the population from the benefits of these admittedly governmental programs.

      Politicians are quick to take credit for “cutting fat” but after 30 years of that, we’re well into the bone, except the captains of the ship are too drunk on the power of their mythology to notice the sawing at their legs.

      When the dust settles and the US is a backwater nation in a sophisticated future world, those politicians will be safely drawing their retirement. They better hope they don’t need any of the medicine and technology that might have been available if they’d just not cut relative pittances out of the national budget in the name of fiscal conservatism. Will it take that long before we discover that the bizarre economics of tax cuts for those who don’t need them and austerity for everyone else emanating from the Republican Party — and far too many Democrats these days — is every bit as ideologically driven as the Communist Party of the Soviet Union was in the 1980s? Pride goeth before the fall.

      I do appreciate the concerns raised about whether we have the best system in place at NIH to take full advantage of the talents of researchers in a sustainable way, and other such issues. Those are important issues, but unless basic funding is in place, in many cases there won’t even be a venue in which to have those arguments.

  7. Let’s remember the Agency for Healthcare Research and Quality (AHRQ) as well. It has also been threatened with defunding. It does not discover new basic biomedical facts, such as those needed to target mechanisms of microbial metabolism and thereby treat infectious disease, but it does fund research to figure out whether existing interventions perform as advertised. When it does its job well, it sometimes figures out things which are unwelcome news to some sectors of the medical-industrial complex. Corporations are not fond of it when it shows that their big money-maker does not work any better than a more affordable alternative.

    The NIH and the AHRQ have different functions, and both are investments in the future health of the population. I am not aware that the NIH steps on many corporate toes, but the AHRQ can and often does.

    I hope that the NIH and AHRQ do not end up competing for funding (or avoiding defunding). But who knows?

  8. Returning to the then all-time peak funding levels of 2002 doesn’t strike me as doomsworthy.

    Your graph appears to say that inflation adjusted US spending this year is eighth highest out of about 240 years. Or if you want to limit it to post WWII, eighth highest in almost seventy years. I tend to believe that research is one of the better areas we can spend money on. Medical research even more so. I’m not averse to government research, though the idea that it would be wise to hamstring private research (the Angell thesis) seems wrongheaded and dangerous.

    But expecting 3% growth after inflation is more than a bit much. Labeling a return to very high levels of funding as eating your seed corn is just silly.

    1. But expecting 3% growth after inflation is more than a bit much. Labeling a return to very high levels of funding as eating your seed corn is just silly.

      The reference to “eating seed corn” refers specifically to young researchers rather than to overall scientific progress in the short term. People who have existing grants are likely to still get grants. People who don’t have grants yet are unlikely to ever get any. People who lead research continue to get older and older. The only exceptions are fields that simply did not exist a couple of decades ago.

      Basically the same scenario as in many other industries. No more growth. No new jobs unless someone retires. And retirement continues to be disincentivized by the disappearance of pensions.

      1. I will note that the NSF, to its credit, seems to recognize this problem, and actually increased the number of Graduate Research Fellowships it is offering this year, even as its budget is down.

  9. The graph does not appear to include effects of the sequester. True, the main point of the article relates to years of infrastructure neglect, but the agency budgets are further eroded by fiscal intransigence on the part of the Republican House.

    1. Increases in the NIH and NSF budgets were always necessary for providing room for new scientists, whether they were above the inflation rate or not. Since early in the second Bush Administration, success rates have decreased steadily, to the point that only 25% of awards (e.g., NIH R01 applications, which are the main mechanism for support of basic biomedical sciences) are renewed, and success rates for new applications are in the single digits (<10% in many disciplines). Regarding the Sequester, my recent application was in a group of 77 in the 2013 NSF Review Panel. Two (2) applications were judged/placed into the "High Priority" category, with about 30% of the total in the next category. It is not believable that only 2 of 77 American scientists are capable of writing a good, fundable application. All of the ~23 high-ranking applications in the pile, ranging from Research in Undergraduate Institutions proposals to large multi-center collaborative proposals, were worthy of support. NSF is protecting currently funded awards, which is a good thing. New awards, including applications for continued funding of productive, long-term research programs, are not getting funded, except for those two "high priority" applications and a handful of other lucky duckies. There is no future in American basic science, and prospects for improvement are nil. The current administration seems not to care, but I'm sure I must be mistaken about that.

      It has never been easy to get support, and competition has always been keen. No reasonable scientist ever took it for granted. But even in the most parlous times in the past, it wasn't impossible to write a successful application. Now it's just hopeless. The thing is, though, despite their perspicacity, Program Directors and Review Panels cannot choose the successful proposals ex ante. They can identify those that are logical, well composed, and potentially fruitful by whatever legitimate criteria are used in the evaluation. Emphasis on "potentially." You still have to do the experiments, some of which will not work, but many of which will work in ways never contemplated. I think it was Max Delbrück who wrote that it is essential to leave room for the unexpected in your experiments; otherwise you will never make a discovery. The same is true for funding.

      In a previous life I worked on bioluminescent cnidarians (jellyfish and soft corals). This work was supported well enough, but not particularly lavishly, in several laboratories, by both NSF and NIH. It had no "practical" use other than to understand the molecular basis of an interesting natural phenomenon. But out of that work came reagents without which modern cell biology, including studies of gene regulation, would be completely unthinkable. The original reasons for pursuing the research, and for funding it, were that it was "good science," and only that. Who knows what we are missing now? Novel antibiotics and anti-cancer drugs, answers to neurodegenerative diseases? An answer to HIV/AIDS that could come from left field. Yeah, probably.

  10. Here are some inconvenient little facts about where some of those NIH dollars are going:
    Faculty salaries: universities “hire” faculty members at a given salary. The university pays them a portion of that salary, which could be anywhere from 0% to 75%, and the faculty member pays the rest out of their NIH grant. Also fringes, and also indirects. Not blaming the faculty member here; they work hard for the money. But universities are NIH parasites.
    Grad student tuition, fees, stipends and health insurance: hiring a grad student can cost about $50K or more a year. And since so many universities (including places without great research infrastructure, or without great visibility and therefore little ability to attract a talented applicant pool) now demand that their faculty get NIH grants, which they could get during the doubling, well, there are a whole lotta grad students not worth $50K per year. But they graduate anyway because “they’ve been here for seven years and we can’t fail them now,” so then they become postdocs, of which there are also too many, or they quit science.
    We are creating too many bad PhDs, and we are doing it on the public’s dime.

    And NIH refuses to do anything about this. Shirley Tilghman started banging the drum about this in the late ’90s and was ignored. She led the Working Group on the Biomedical Work Force, and she suggested that NIH take measures to limit the growth of PhD programs and not allow the creation of new ones. This was ignored. Instead we got new grants for graduate student career development. Which is all fine and good, but it is simply NIH’s way of throwing more money at the problem instead of facing the monster it has created, admitting guilt, and solving that problem.

    Right now, so many universities are in full panic because a decade ago they decided to build collaborative research buildings, or some such nonsense, that they would fill with researchers who would contribute zero, zip, nada to the educational mission of the university, but would bring in all this NIH money from the budget doubling. And every half-witted provost or starry eyed president who decided to build such a building seemed to think that he or she (mostly he, though) was the only one who had thought of this brilliant idea to bring in more NIH funds.

    And now you have a crap load of shiny new buildings, with marble staircases and glass elevators and curtain wall open labs that are friggin’ half empty because, oh, guess what, the doubling did not continue.

    We can blame Republicans all we want, but at the end of the day, the NIH and universities are to blame for the problems that plague them now. And we will all pay the price later.

    1. Yeah, this too. Our administration has stars in its eyes over a new PhD Program, as if we need more of those. I have often wondered whether Congress “knows” that those R01s often pay 50% or more of the Principal Investigator’s salary, usually out of direct costs and not the 45-55-65% in indirect costs (overhead) that come on top of the primary award. NSF severely limits such salary reimbursement, which is a very good thing.

      1. Also, a significant portion of the R funds pays for trainees, including tuition, fees and benefits. And yet NIH imposes NO oversight on the quality of training that an institution can give. Which is amazing to anyone who has ever applied for a T or F grant. You can’t get a T32 without reporting your admissions stats and your trainee outcomes, and they have to be good. But not so for R funds used for trainee support. A PI does not even have to name the PhD students who are supported on the grant, let alone prove their worthiness. And the vast majority of trainees are supported by R funds, with an ever-shrinking proportion being supported by T32s. So the NIH has essentially decided not to oversee how effectively institutions use funds for training when the source of those funds is R grants.

        A poor graduate program can eat NIH funds and sap investigator productivity like nothing else can. But NIH has not seen fit to address this.

      2. NSF severely limits such salary reimbursement

        NIH also has a salary cap, below what any practicing MD would make, further driving physicians out of science.

        1. Physicians who choose to do science have made the choice NOT to be compensated like their interventional cardiologist and thoracic surgeon colleagues, the former largely a ward of Medicare, it should be noted. Still, one of my “favorite” cases is a physician-scientist I know at a Top-20 medical school. A good scientist and very nice person, according to NIH RePORTER a PI on 4 R-type awards. Direct costs are over $800,000 per year, with another nearly $500,000 in overhead tacked on. All of this as deserved as can be, although a case can be made for limiting the number of awards to one Principal Investigator by requiring at least 25% effort on each (that would get you to a maximum of 3 pretty quickly if the person had any other duties). There can be no objection to NIH paying for some small percentage of the salary, in addition to a bit of not insignificant clinical income. But this particular scientist is expected, if not absolutely required, to pay up to 90% of salary out of direct costs from the R-awards. This is a scam, pure and simple. The PI is a full professor of the institution, not NIH, and the salary is overhead by any reasonable definition, while those of the students, technicians, and postdocs in the lab are legitimate direct costs. And as maryQ points out, the jig is about up. As an alumnus (for fundraising purposes) of perhaps the best medical school in the US, I get letters asking me to contribute to their “Bridge Fund” for temporarily (they hope) destitute faculty. Ha! When this particular institution gets the sniffles, everyone else is in the ICU with double pneumonia and two broken legs.

          1. like their interventional cardiologist and thoracic surgeon colleagues

            Very misleading comparison points: the NIH salary cap is below what a jobbing primary care doc makes, and for people with a quarter million dollars or more in med school loans, accepting it often isn’t economically possible.

  11. Good point. And one that also applies to those whose desire is to become a primary care physician in the first place, if by “primary care” you mean Family or Community Medicine, maybe Pediatrics and OB/GYN. Primary care also includes General Surgery, Psychiatry, and Internal Medicine, and perhaps Emergency Medicine, practitioners of which are generally able to handle the debt. Whether we should require such debt as the price of admission to the profession is another matter entirely. Anyway, really, how many physicians are there who want to become biomedical researchers but can’t because of their debt? People who went into medical school with their eyes wide open and the intention of going into research for that kind of pay, even with the premium that MDs get over PhDs in most medical schools? Those who didn’t match in the first place? Generally, academic physician-scientists, with emphasis on the latter, come out of MSTPs. Granted, such student positions are very difficult to get, but almost every medical school has an MD-PhD Program that accounts for 5-10% of the total number of medical students; these students are paid stipends for a minimum of seven years, do not pay tuition in a decent program, and (should) graduate without debt from their MD-PhD Programs. Previous loans are another matter. They would have prepared for their MD-PhD Program from freshman year or earlier. So, who are those physicians driven out of science by debt? Or prevented from entering in the first place? I have spent my entire career in medical schools since I was a first-year postdoc. I haven’t seen too many of them. But I am not a social scientist and I haven’t been looking very hard. Academic physicians who juggle clinical duties with a research program are not uncommon, though they are becoming rarer because of the current funding climate, not their medical school debt.

  12. Primary care also includes General Surgery, Psychiatry

    Only a side point, but I am curious whence that definition derives – I’ve been in psychiatry departments for a quarter century and never heard anyone call it primary care, it is always called a specialty.

  13. Good question. Probably something from the AMA or AAMC? I certainly should know and will ask. Our mission is to prepare “primary care physicians who will practice in rural and underserved areas.” Our dermatologist graduates who treat farmers and other outdoor workers for various forms of skin cancer are absolutely essential but are not primary care physicians. Go figure. As for psychiatrists in the rural South, they would have way too many patients if any could afford their services, other than prescriptions for Xanax.
