DEB Numbers: Where do the various official funding rate numbers come from and why are they different from what your PO tells you?

We have recently received several questions by email from PIs wondering how the funding rates shown in official reports can be so different from what they experience in DEB. We thought that it would be good to post a response here on the DEBrief.

What are the official NSF funding rates and where does one find them?

Official funding rates published by NSF vary depending on the program(s), division(s), or directorate(s) being examined. These numbers are calculated in the same way across all of NSF.

You can find official funding rate data here: http://dellweb.bfa.nsf.gov/awdfr3/default.asp and can use the tool to drill down and look at different divisions and programs. These numbers should match the ones you see publicized in official statements and on the nsf.gov site.

As you may notice, the posted funding rates are several times higher than what you hear in discussion with your PO, during panels, or in our context statement.

For instance, according to the official numbers, DEB in 2013 had these stats:

[Table: official DEB funding statistics for 2013]

That’s 1,751 proposals, 409 awards, for a 23% funding rate. The mean time to a decision was 4.65 months. An “average award” in 2013 had a duration of just under three years and was funded at ~$80K per year.
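As a quick sanity check on that headline figure (nothing here beyond the arithmetic on the numbers above):

```python
# The official rate is simply awards divided by proposal actions.
proposals, awards = 1751, 409
rate = awards / proposals
print(f"{rate:.0%}")  # ~23%
```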

Why do these numbers differ from the ~3-7% funding rates you’ve heard from various personal sources?

Missing from the Denominator:

Well, for starters, these numbers don’t count preliminary proposals. The official numbers are based only on the proposals that lead directly to funding decisions. You can see this if you look back a couple of years in DEB:

[Table: official DEB proposal and award counts, 2010–2013]

According to these numbers we had ~1,000 fewer proposals in 2013 than in 2010. That’s not an error: we did have fewer full proposals, but the comparison is misleading because the source of the count isn’t detailed here. In DEB, when we report funding rates to you, we report the pre-proposal invitation rate and the full proposal funding rate separately, or we report a single funding rate for the whole process[i].

Notably, this artificial dip in submissions/spike in funding rate can actually be seen on the top-line NSF numbers:

[Chart: NSF-wide proposal, award, and funding rate trends]

The drop in submissions from 2011 to 2012 is largely because DEB and IOS preproposals aren’t counted there, resulting in an apparent increase in the NSF-wide funding rate while the number of awards didn’t change much. However, the effect of sequestration (on top of losses from run-of-the-mill inflation and rising costs of research) can be seen from 2012 to 2013 in the drop in awards.

Proposal duplication:

The official numbers count each proposal (“award jacket”) separately. These counts do not combine multi-institutional collaborative proposals into a single unit. This has a sizeable effect on the numbers for DEB, especially within the core programs. Many of our awards involve two or more collaborative proposals reviewed as a single unit; we expect most of you would consider that such a unit should be counted once rather than having each component counted separately, and that’s what we do on this blog. In the official numbers, however, a three-partner collaboration counts as three separate proposals and three separate awards or declines[ii]. Generally, collaborations counted in the official manner inflate the funding rate by a few points compared to the individual programmatic reality.
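To make the jacket-versus-project distinction concrete, here is a small sketch using the hypothetical panel numbers from footnote [ii] (20 projects spanning 30 jackets; illustrative figures, not real DEB data):

```python
# Hypothetical panel: 20 whole projects were reviewed, but because some
# are multi-institution collaborations submitted as separate "jackets,"
# those same projects appear as 30 proposals in the official count.
projects_reviewed, projects_funded = 20, 4   # how we count on this blog
jackets_reviewed, jackets_funded = 30, 7     # how the official count sees it

project_rate = projects_funded / projects_reviewed
jacket_rate = jackets_funded / jackets_reviewed

print(f"project-level rate: {project_rate:.0%}")  # 20%
print(f"jacket-level rate:  {jacket_rate:.0%}")   # 23%
```

Same panel, same outcomes; only the unit of counting changes, and the rate moves by a few points.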

Lumping proposal categories:

We discussed the organization of core and special programs in DEB in a previous post. These are all lumped together in the official DEB funding rate calculation. For the most part, items like RAPIDs, EAGERs, and conference support are a relatively small piece of the total and account for only about a point of the discrepancy between official and realistic funding rates. You can drill down in the official numbers to get a better look at program-level outcomes. As you can see, there’s much variation here[iii]:

[Table: official funding rates and median annual award sizes by DEB program]

The two most important messages are a bit buried amid all those lines.

First, the “special programs” are not generally inflators of the funding rate. Dimensions of Biodiversity was at 14%, CNH 6%, and EEID 4%: cumulatively they represent about 20% of the official proposal count and only 10% of the official award count. These programs do not provide an advantage to submitters[iv]. They may be desirable to you for other reasons – interdisciplinary content and potentially larger awards than a disciplinary core project – but they are not an easier route to funding.

Second, the biggest inflator of the funding rate is not obvious, but a careful reader might have already picked up on it. Take a close look at the median annual size column; does anything stand out? Do the median award sizes for Ecosystem Studies, Evolutionary Ecology, Evolutionary Genetics, Phylogenetic Systematics, and Population and Community Ecology seem a bit small?

Part of this reflects what we already discussed above: each proposal is counted separately, so a big $650K project may be composed of three jackets each receiving $72K per year for 3 years, which shifts the median size downward in the official count. But that doesn’t bring it down to the $12K or $15K seen in some of those programs. The main reason official rates appear so much higher than reality is that they lump Doctoral Dissertation Improvement Grants (DDIGs) into the same count as regular research proposals.
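The arithmetic behind that median shift, using the hypothetical three-jacket project above:

```python
# Hypothetical collaboration from the text: one ~$650K project split into
# three jackets, each at $72K/year for 3 years.
jackets, annual_per_jacket, years = 3, 72_000, 3

whole_project_total = jackets * annual_per_jacket * years
whole_project_annual = jackets * annual_per_jacket

print(whole_project_total)   # 648000 -- the ~$650K project
print(whole_project_annual)  # 216000 per year for the whole project

# ...but the official count records three separate $72K/year awards,
# pulling the per-award annual median toward $72K.
```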

Doctoral Dissertation Improvement Grants, at ~$15K, cost just a fraction of a full project: it would take 20, 30, or more DDIGs to account for the cost of a single regular research project. Thus, a little bit of money goes a long way in funding DDIG proposals[v]. The result is that DDIGs can be relatively numerous (50%+ of award jackets in some programs) and enjoy a relatively high funding rate that, when lumped into the official count, results in misleadingly high success rates for DEB as a whole.
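A sketch of the lumping effect (the counts here are made up for illustration; only the ~$15K DDIG size comes from the discussion above):

```python
# Made-up single-program example: full research proposals have a low
# success rate, DDIGs a much higher one, and the official count blends them.
research_submitted, research_funded = 400, 30   # 7.5% for research proposals
ddig_submitted, ddig_funded = 200, 60           # 30% for DDIGs

research_rate = research_funded / research_submitted
lumped_rate = (research_funded + ddig_funded) / (research_submitted + ddig_submitted)

print(f"research-only rate: {research_rate:.1%}")   # 7.5%
print(f"lumped official rate: {lumped_rate:.1%}")   # 15.0%
```

The blended rate says little about a PI's odds on a regular research proposal, which is the number most readers actually want.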

 

While the topic of funding rates may be a bit confusing, we hope that this post has shed some light on why you may be seeing different funding rate numbers across NSF and DEB. As you all know, with summary statistics the numbers all depend on how the data are being analyzed.

 

[i] These don’t combine exactly because there are proposals like CAREERs and co-reviews that are part of the funding milieu but skip out on the preproposal stage; we covered that previously, here.

[ii] If in our full proposal panels we report funding 4 of 20 projects when counting collaborations only once each for a 20% success rate, the official count would reflect something like 7 of 30 jackets funded for a 23% rate.

[iii] Please note, these program bins don’t actually identify pots of money; they are organizational labels. Those few with only a handful of proposals and high success rates are mostly flukes of labeling. Some, like AToL, are old codes that are being retired; others, like LTREB and LTER, are special programs that weren’t entertaining entirely new submissions but did make other sorts of awards, like renewals and workshops.

[iv] Dimensions might look relatively good, but you have to correct for collaborative projects in the official count here too. The rate as far as whole projects are concerned is ~10%.

[v] In 2013, DEB made 134 DDIG awards. If we never had this opportunity (and ignoring any resulting increases in student support requests), we could have funded approximately 4 additional regular grants, 1 per cluster, bringing the 2013 core award count from 121 to 125 and increasing the funding rate by less than 1%.

Assessing the Value of the Doctoral Dissertation Improvement Grant

Caveat: This post is based on the research and analysis of Kara Shervanick, a 2013 Summer Student in DEB. She did valuable work, but her time was relatively brief for this complex information-gathering and analysis process. This work provides some context for understanding DDIG program outcomes; however, we note that the small sample size limits the power of these analyses.

See our other recent posts on the DDIG program here and here.

Help Us Recruit for a Leadership Opportunity in DEB (Open Now)!

We are now recruiting for the next Division Director in DEB. Our fearless leader, Dr. Penny Firth, will be retiring after many years of commendable service at NSF.

Many of you may already be familiar with NSF’s program for rotating program officers in which your peers in the research community come to NSF and help administer funding programs for a term of 1 to 2 years. Most scientific leadership opportunities at NSF are also filled through rotating positions. These positions can be particularly well suited for experienced people interested in impacting basic science on a broad scale.

NSF/BIO is looking for a new rotator to join NSF as the next Division Director for DEB. This would be an excellent opportunity for someone interested in sharing their exceptional communication, leadership, and managerial skills, and in working with a fun and dynamic group of program officers and administrative staff.

You can view the full job posting details and apply on USAJOBS. We hope this position would be a good fit for you or for one of your colleagues.

Community participation is critical to the recruitment process, especially spreading the word about this opening and encouraging strong candidates to apply. If you know someone with demonstrated leadership skills and a vision for the future of environmental biology, please get in touch with any DEB Program Officer to let us know about them and tell them about this opening.

We also welcome any thoughts you would like to share with us about this search and look forward to hearing your opinions on future DEB leadership in the comments section. As always, you can also reach us at the Division alias at DEBquestions@nsf.gov if you want to say or ask something outside of the blog.

USAJOBS Link:

Division Director, Division of Environmental Biology

DEB Numbers: DDIGing Down into Dissertation Data

So far here on DEBrief, when we’ve looked at success rates and demographics, we’ve generally restricted our discussion to DEB research grants or research grants in the DEB Core Programs.

As we mentioned in a prior post, this time we’re shifting our focus to Doctoral Dissertation Improvement Grants, DDIGs.

DDIGs, an Opportunity for Graduate Researchers in DEB

As we began writing this post, proposals were moving through the various stages of review and approval that ultimately result in the awarding of 2014 Doctoral Dissertation Improvement Grants (DDIGs) from the NSF Division of Environmental Biology[i]. This is the culmination of a process that started for many applicants last summer or early fall and for us is the continuation of a commitment that DEB has made to supporting student researchers for over 40 years. You can check out the most recent DDIG recipients through the public NSF Award Search. The list only includes the awards that have been finalized from this year, and it will grow over the coming weeks as the last of the DDIG awards are added to the public database.