DEB Summer 2015: Where to find us at Professional Meetings


We’ve got a busy summer meeting schedule and are offering numerous opportunities to hear the latest NSF updates, meet your Program Officers, and learn about funding opportunities in person. Since it is so close this year, there will be a fairly large contingent of us heading up to Baltimore for the ESA meeting. But we’re not forgetting the other side of the house; we’ll have representatives at both of the big international evolution conferences. We’ll also be at the IALE Congress and the joint Botany meeting, and we were already at ASM earlier this season.

And remember, if you can’t make it to our lunchtime brown-bag sessions to hear the latest from DEB, you can always email one of the attending Program Officers to set up another meeting time, catch us in the poster hall, or drop by our information tables (where available).

 

26 – 30 June, 2015: Guarujá, Brazil. Evolution 2015

Featuring: Simon Malcomber, David Mindell, Sam Scheiner, Kelly Zamudio

Presentation Followed by Q & A (NSF Update)
Sunday 28 June, 12:00 – 13:30 (during lunch break), Meeting Room Diamantina

 

6 – 9 July, 2015: Portland, OR. 9th International Association for Landscape Ecology (IALE) World Congress

Featuring: George Malanson

Panel Discussion: Funding Opportunities for Landscape Ecology at the US National Science Foundation
Monday 6 July, 5:30 PM – 6:30 PM, See Final Schedule for Location

Additional Notes: The Tuesday evening poster session is a good time to meet up with George.

 

25 – 29 July, 2015: Edmonton, Alberta. Botany 2015

Featuring: Joe Miller

With Special Appearances: Roland Roberts, Judy Skog

NSF Information Booth (Exhibitor #114)
All days; staffed during poster sessions, by appointment, and whenever we can be there. Hall D.

NSF Outreach Presentation and Discussion
Wednesday 29 July, noon, location TBA (check final program).

 

9 – 14 August, 2015: Baltimore, MD. Ecological Society of America 2015

Featuring: Henry Gholz, Doug Levey, Sam Scheiner, Alan Tessier, Alan Wilson, John Adamec, Shannon Jewell

With Special Appearances by: Matt Kane (TBD), Betsy Von Holle (W, Th, F), Emily Leichtman (Su, M)

NSF Information Booth (Exhibitor #438)
Monday 10 – Thursday 13 August, all day, Baltimore Convention Center Poster/Exhibit Hall.

Special Session (SS 2): Ecology on the Runway: An Eco-Fashion Show and Other Non-Traditional Public Engagement Approaches
Monday 10 August, 11:30 AM-1:15 PM, Baltimore Convention Center 310.

Special Session (SS 10): New Frontiers: Bridging the Gaps Between Continental and Global-Scale Research Networks, A Special AGU-ESA Event and Evening Social
Monday 10 August, 8:00 PM-10:00 PM, Baltimore Convention Center 309.

Workshop (WK 53): Federal Agency Networking Session (Come and meet your Program Officers from NSF and beyond!)
Thursday 13 August, 11:30 AM-1:15 PM, Baltimore Convention Center 316.

 

9 – 15 August, 2015: Lausanne, Switzerland. European Society for Evolutionary Biology

Featuring: George Gilchrist and Leslie Rissler

Presentation followed by Q & A (NSF Update)
Thursday 13 August, noon, location TBA.

Additional Notes: This will be the same program as presented at Evolution 2015 (if you’re like us and had to choose one or the other, we’ve got you covered!)

Draft of revisions to NSF-wide grant and proposal policies up for public comment


Each year or so, NSF releases an updated version of its agency-wide guidance for proposals and grants, called the Proposal and Award Policies and Procedures Guide (PAPPG). This big document consists of two parts: instructions for proposers (the GPG, or Grant Proposal Guide) and instructions for awardees (the AAG, or Award Administration Guide).

The PAPPG sets the ground rules for NSF programs. Solicitations, like the DEB Core Programs solicitation, exist to enumerate specific variances from the basic rules, for example the format and contents of a preliminary proposal. Solicitations, however, also refer back to the PAPPG and follow the ground rules for everything except those specific variances. For example, the requirements for proposal font size are detailed in the PAPPG; we have no reason to repeat or modify them in the DEB Core Programs solicitation, but they apply to both preliminary and full proposals.

Changes to the PAPPG trigger new proposal preparation requirements for all NSF programs and may require you to do something differently in your next submission to the DEB Core Programs (or anywhere else), but changes to the PAPPG do not override anything explicitly described in our solicitation.

 

Right now, a draft version of the changes is available for public comment through 20 July, 2015. The wording of this public version indicates that the revised rules are expected to come into force in January 2016, which is right around our next preliminary proposal deadline. Based on the experience of prior years, the final version will probably be published at some point in October, so that you have fair warning of the rule changes before you are expected to follow them beginning on the TBD January date. Between July and October, NSF will review the comments and prepare the final revised version for public posting.

We’re mentioning this here because there are proposed revisions that are likely relevant to you and we want you to be aware of them as early in the process as possible.

The official notice of the request for comments is available in the Federal Register: http://www.gpo.gov/fdsys/pkg/FR-2015-05-19/pdf/2015-12086.pdf

This includes an explanation of the request, how to submit comments, and the comment deadline.

The actual draft is hosted on the nsf.gov website here: http://www.nsf.gov/bfa/dias/policy/papp/pappg16_1/fedreg/draftpappg_may2016.pdf

 

There are numerous small changes and several more substantial ones noted in the draft. The online document is conveniently marked up with highlights for new/edited text and comments noting where material was removed.

Here are a few revisions that we noted that might be of particular interest to our readers:

On page 20 (of the PDF), the draft notes that your research office/organizational representative needs to complete organizational sign-off before a proposal can be submitted (even for preliminary proposals); this might require modifications to your preparation timeline. (Organizational sign-off is mentioned/added in many other spots throughout the document too.)

On page 25, there’s a re-emphasized note about an issue we’ve mentioned here before: You should have only 1 FastLane ID per individual.

On page 30, the GPG is (finally!) addressing the issue of your collaborator (aka conflict) lists being too long for the 2-page Biosketch by moving them into a separate Single Copy Document.

On page 33, and in a few other places, there are new requirements for reporting “dual use research of concern” (e.g., work with certain pathogens and toxins) via your proposal Cover Page.

Pages 34 – 37 include several changes/clarifications relevant to the written components of your proposals: stronger requirement to enter a Project Summary in the FastLane forms (instead of uploading a PDF), a prohibition against hyperlinks in your Project Description, a template for Letters of Collaboration (if you’ve submitted to the DEB core programs recently, you’ve already been doing this), the revised Biosketch format (sans collaborators and other affiliations), and a requirement that each Biosketch be uploaded as a separate file (no more bundling as a single file).

There are a couple of changes with respect to budget preparation, the most notable (at least to us) being a requirement that sub-awards include overhead at the sub-awardee’s federally negotiated rate (or a de minimis rate of 10%).

On page 44, the instructions for current and pending (C&P) support also are changed to require a separate document (no bundling as a single file) for each of the senior personnel on the proposal and the definition of C&P is expanded to include “internal institutional support”.

 

The important takeaway here is to make yourself aware of the proposed changes and the change timeline, and to make sure that your research administration officials are also aware of them, so that this fall you will be able to follow the correct version of the GPG for our preliminary proposal deadline.

Are small grants doing well in review?


In contrast to the trend of decreasing numbers of preliminary proposals, we have seen a rapid increase in the category of Small Grant preliminary proposals (these are also included in the total counts in our previous post).

DEB Small Grants 2012 2013 2014 2015
Submitted N/A 83 95 126
Invited N/A 20 25 29
Invite Rate N/A 24% 26% 23%

 

We attribute this to a growing awareness of this option to submit preliminary proposals with total budgets under $150K. Small grants came about in the second year of the preliminary proposal system in response to a long-standing desire, expressed by numerous voices in our communities, for some sort of “small” category. DEB realized the idea was particularly appropriate for the preliminary proposal system because it lets reviewers adjust their expectations for the scope of a project relative to its expense without requiring the extensive preparation of a full budget. We added the category to our solicitation for the 2013 preliminary proposal deadline.

We’ve had lots of positive feedback on this option, but also recognize that awareness still needs to be improved among both applicants and reviewers. This year, 8% of all preliminary proposals were identified as small grants.

Small Grants are found in all four clusters and are generally on the increase, but we also think feedback, such as this post, is necessary to successfully integrate this idea into our communities and maintain enthusiasm for this option. We would not be surprised to see these numbers grow to the point where SGs make up as large a part (or larger) of the preliminary proposal pool as Predominantly Undergraduate Institutions or Beginning Investigators.

Since 2013, we’ve funded 22 awards based on invited full small grants (9 of 18 in 2013, 12 of 24 in 2014, and 1 of 1 in 2015 thus far[1]), for a 51% success rate at the full proposal stage. This is roughly twice the success rate of full proposals without the SG designation.
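
For the curious, the 51% figure is just the pooled awards over invitations. Here is a minimal sketch of that arithmetic in Python, using only the counts quoted in this paragraph:

    # Invited full small-grant proposals and resulting awards, by year (counts from the text above)
    invited = {2013: 18, 2014: 24, 2015: 1}
    awarded = {2013: 9, 2014: 12, 2015: 1}

    rate = sum(awarded.values()) / sum(invited.values())  # 22 / 43
    print(f"SG full-proposal success rate: {rate:.0%}")    # ~51%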

 

[1] Not everyone who received an invitation eventually submitted a full proposal (individual reasons vary). Also, we already have an award based on a 2015 preliminary proposal: instead of inviting a full proposal, DEB determined the project was appropriate for the EAGER mechanism and invited the team to submit an EAGER proposal, allowing for a quick turnaround of an award.

DEB Spring 2015 Panel Update


At this point, everyone should have heard back on their DEB preliminary proposals from the spring panels. If you have not:

1) Log in to your FastLane account. The information should be accessible there, but also make sure your contact email is correct because a typo there would prevent you from receiving updates and notifications.

2) If you were a CoPI, check with the lead PI on the preliminary proposal. The lead PI should have the results of review.

3) Did it wind up in your spam folder?

4) If you have exhausted all of the above options and have had no other contact with your DEB Program Officer, then it’s probably a good time to send us an email.

 

Preliminary Proposal Panel Results

DEB panels reviewed 1495 preliminary proposals; in consideration of the reviews and panel discussion, DEB Program Officers extended 383 invitations to submit full proposals for the August 2015 deadline. The Division-wide invitation rate for the preliminary proposals was 26%. Below, we detail the results of preliminary proposal review by programmatic cluster.

Cluster Invited Not Invited Total Invite Rate
Systematics and Biodiversity Science 87 221 308 28%
Evolutionary Processes 105 331 436 24%
Population and Community Ecology 107 320 427 25%
Ecosystem Science 84 240 324 26%
Grand Total 383 1112 1495 26%

 

This is the fourth round of preliminary proposal review for the DEB core programs since the process began in 2012. DEB extended more invitations and achieved a higher invitation rate than in prior years.

2012 2013 2014 2015
Reviewed 1626 1629 1590 1495
Invited 358 365 366 383
Invite Rate 22% 22% 23% 26%

 

As we discussed in our recent post on per-person success rate, the launch of the preliminary proposal system drew in a large number of “new” applicants. We believe we are now seeing this wave of applicants pass; the decrease in the number of preliminary proposals reviewed in DEB reflects our communities’ realization that preliminary proposals do not make grants easier to get.

At the same time, the number of invitations has gone up. The increase is primarily a result of program management decisions as we have been able to refine our expectations for the number of proposals that will come to the full proposal panel through other mechanisms (CAREER, OPUS, RCN, and co-review).

Research.gov Update: Returned Project Reports


A big thank you to the several PIs who let us know they were having trouble finding the PO comments when reports were sent back from review with requests for revision. We passed those concerns along to the Research.gov team, and it looks like we now have a response.

The most recent update to the Research.gov platform includes changes to the project reporting interface that should make it easier to find and view the PO comments. The screenshot below from the Research.gov online Help guide provides the new details.

Rgov_POComment


The automatic email you receive when a report is returned should also (now or soon) have a better explanation of how to find these comments, but we haven’t seen that yet.

A reminder to check your FastLane Profiles


For any demographic analysis or comparison, NSF relies on the self-reported characteristics of participants in all phases of proposals and awards. Completion of the profiles is voluntary but critical for linking demographic data to proposal, funding, and review patterns. And, importantly, your profile provides the contact information that we use to reach out to you. So if your email address and institutional information are not up to date, you may miss out on funding opportunities or critical notifications that affect your eligibility for funding.

So, is your FastLane PI profile complete, up to date, and error-free?

What about your OTHER FastLane profile? When was the last time you completed your Reviewer information?

Yes, that’s right; if you’ve taken part in both sides of the NSF merit review process you have two[i] separate FastLane profiles: one as a PI and another as a reviewer (or panelist).

Across NSF, our community members are pretty good about completing PI profiles (>80% coverage) but are far less likely to complete the profile as a reviewer (<<50% coverage).

As a PI or CoPI, you can update your PI profile in FastLane at any time.

Log in under “Proposals, Awards and Status.”

FastLane_Profile_1


You can go directly to your PI profile from the first landing page or update the information before starting work on a proposal.

The form itself includes your name, organizational affiliation, contact information, degree information, and demographic characteristics. (Screenshot below from the FastLane online Help guide.)

FastLane_Profile_2

Before your next application, perhaps right now, please take the time to log in to FastLane and make sure your PI profile is up to date.

Reviewer profiles can only be updated when you log in to complete a review request.

(That’s true as far as we know; if you take a shot at logging in using a link from an old panel or ad hoc review invitation and find that it does let you access your profile, please tell us so we can update this accordingly.)

Panelists (https://meetings.nsf.gov/jsp/homepage/panelreview.jsp) and individual ad hoc reviewers (https://www.fastlane.nsf.gov/jsp/homepage/prop_review.jsp) have separate log-in pages on FastLane.

However, both take you to similar landing pages, and both provide the same options for updating a profile.

FastLane_Profile_3

(Again, screenshots from the FastLane help guide.)

While you should confirm your contact information and take the time to correct any errors, the pieces most often missing are the demographic data. [They’re even incomplete in the above Help Guide images!]

The reviewer demographic form asks the same questions and provides the same response options as the PI profile form.

FastLane_Profile_4

So please, the next time you review for us, take a moment to complete your profile so we can put some data behind our efforts to make sure our review processes are representative of our communities.

Thanks!

[i] We’ve also noticed that a fair number of you have extra accounts lying around beyond those two; please call the FastLane Help desk to have that fixed.

DEB Numbers: Per-Person Success Rate in DEB


Our friends at Dynamic Ecology posted a little while back about the NSF-wide trends in per-person success rate based on this 2014 report to the National Science Board that provided merit review process statistics across the whole agency[i]. There were several questions in the comments to that post regarding the context for the numbers and how they would look for DEB or IOS, especially since preliminary proposals were explicitly excluded from the calculations in the report to the NSB[ii].

So, we’ve put something together with DEB data to follow up on that discussion. Our analysis sticks to the general approach of the NSF-wide report, with modifications to allow inclusion of preliminary proposal data.

Part 1: Inclusion Criteria

First, let’s be clear about what we’re counting here. The NSB report’s Figure 14 illustrated a per-PI success rate based on counts of Competitive Research Grant Actions leading to Award or Decline decisions. That bit of institutional jargon specifies three different filters to define what was counted:

A context filter: Competitive (a stand-alone grant request) versus Non-competitive (changes to an existing grant such as a supplement or a PI move to a new institution) decision-making;

A content filter: Research (just what it sounds like, both Core and Special programs) versus Non-research (e.g., fellowships, dissertation support, travel support, conferences) activities;

An outcome filter: Awarded or Declined versus Any Other Outcome (e.g., invite, not invite, still pending a decision, returned without review, or withdrawn before a decision)

This is actually a really good set of filters for narrowing down the universe of “stuff NSF does” to questions about “bread and butter” grants. Ignoring the Any Other Outcome proposals is a good thing, since those categories of proposals were never actually part of the competition in most cases across NSF. On the other hand, it complicates measurement of programs where large numbers of preliminary proposals are involved, as is the case here.
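
To make those three filters concrete, here is a minimal, purely illustrative sketch of how they might be applied to a table of proposal actions. The field names and category labels (context, content, outcome) are our own shorthand for this example, not the actual fields in NSF’s internal systems:

    # Hypothetical action records; field names and values are illustrative only.
    actions = [
        {"id": 1, "context": "competitive",     "content": "research",     "outcome": "award"},
        {"id": 2, "context": "competitive",     "content": "research",     "outcome": "decline"},
        {"id": 3, "context": "non-competitive", "content": "research",     "outcome": "award"},      # e.g., a supplement
        {"id": 4, "context": "competitive",     "content": "non-research", "outcome": "award"},      # e.g., a conference
        {"id": 5, "context": "competitive",     "content": "research",     "outcome": "withdrawn"},  # "any other outcome"
    ]

    # Apply the context, content, and outcome filters in turn.
    counted = [a for a in actions
               if a["context"] == "competitive"
               and a["content"] == "research"
               and a["outcome"] in ("award", "decline")]
    # Only records 1 and 2 survive; those are the "Competitive Research Grant Actions" in the sense above.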

Part 2: The Proposal Data

Our first table presents the big picture of proposal submissions for DEB for the period 2006-2014 (chosen mainly because that was the span of complete years beyond which the server was getting angry with us for requesting too many records, #overlyhonestmethods). We’ve divided them up following each of the filters mentioned above and also split out the DEB-relevant sub-units. (Note: for consistency across all of the different proposal types and with the NSF-wide data, this table counts separately all proposals with unique identification numbers in FastLane. This differs from the way DEBrief usually combines separate proposals from a collaborative group into a single “project” unit for counts.)

PerPI_Success_2014_1

We have discussed some of these trends before, but to quickly review the basic points:

1) Total actions spiked with the launch of the preliminary proposal system in 2012 but have since come down a bit. This was preceded by another spike in 2010 that was in part a reaction to stimulus funding in 2009 (evidenced by upward jumps in DDIGs and core programs from 2009 to 2010) and in part a major spike in special programs reflecting the launch of Dimensions of Biodiversity and some other redistribution of special program responsibilities between Divisions in BIO.

2) Economic stimulus (ARRA) money in 2009 and the wiggle-room gained by clearing out some of the backlog of requests and paying down future commitments resulted in significantly elevated award counts in 2009 and 2010 that distort the longer-term pattern.

3) Incoming preliminary proposal numbers (2012-2014) have been nearly flat, as has the number of research grant award actions, especially when considering both core and special program components over the entire period.

We’re not adding per-proposal success rates to this table specifically because the preliminary proposal process crosses fiscal years, and the corrections needed to account for the complexities of the process make that number very different from the straightforward single-year data above (see endnote i). Per-proposal success rates are shown in our FY2014 wrap-up post.

Part 3: Per-Person Calculations

Each action in the table from Part 2 links to a record of between 1 and 5 persons (PIs and CoPIs) on the proposal cover page.

[Contextual tangent: we are not differentiating between core and special programs in DEB for the per-person success rate. Could it be done? Sure, but the special programs and core programs both fund research grants, and we see that applicants to one or the other quite often switch targets depending on the convenience of deadline or opportunity. Ultimately, getting one or the other provides the same result: research funding.]

In total, there were 11,789 unique PIs/CoPIs associated with the 20,724 Competitive Research Grant actions in DEB between 2006 and 2014. During the same time frame, DEB made 2,671 Competitive Research Grant Awards that included a total of 2,970 unique PIs or CoPIs. Most individuals (75% of unique PI/CoPIs) who applied to DEB for funding never received a Competitive Research Award during this entire 9-year period.
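
To show how we tally these per-person numbers, here is a minimal sketch of the counting logic. The records below are invented for illustration; the real calculation runs over the cover-page data for every action in the table from Part 2:

    # Each action lists the PIs/CoPIs from its cover page and whether it ended in an award.
    actions = [
        {"people": {"PI_A", "CoPI_B"}, "awarded": True},
        {"people": {"PI_A", "CoPI_C"}, "awarded": False},
        {"people": {"PI_D"},           "awarded": False},
    ]

    applicants = set().union(*(a["people"] for a in actions))
    awardees = set().union(*(a["people"] for a in actions if a["awarded"]))

    per_person_success = len(awardees) / len(applicants)
    # With the real 2006-2014 DEB data: 2,970 awardees / 11,789 applicants ~ 25%,
    # i.e., about 75% of unique PIs/CoPIs were never on a funded Competitive Research Grant.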

The NSB report calculated PI success rate in a 3-year moving window; we’ll do that in a moment. First, we want to split it a different way to account for the stimulus (ARRA) funding in 2009; when combined with the smoothing of the window, that spike in awards winds up distorting some details we’d like to explore.

Annual Per-Person Success Rate

Fiscal Year 2006 2007 2008 2009 2010 2011 2012 2013 2014
Total Unique PIs and CoPIs Applied 2052 2057 2060 2274 3266 2733 4310 4546 3950
Unique Women Applied 503 476 488 559 791 676 1127 1211 1093
Total Unique PIs and CoPIs Awarded 352 449 424 620 601 392 424 455 414
Unique Women Awarded 93 105 114 166 163 100 110 126 110
Per-Person Success Rate 17.2% 21.8% 20.6% 27.3% 18.4% 14.3% 9.8% 10.0% 10.5%
Per-Woman Success Rate 18.5% 22.1% 23.4% 29.7% 20.6% 14.8% 9.8% 10.4% 10.1%
First Recordings of a PI 2052 1149 909 1006 1553 943 1645 1542 990
Last Recordings of a PI 526 542 471 595 1085 780 1600 2240 3950

The notable patterns here:

1) The preliminary proposal system brought in a huge increase in persons applying each year, to roughly double pre-stimulus levels.

2) 2010 was a big year, matching what we saw in the proposal load table, with a large increase in people submitting in reaction to the economic stimulus (ARRA) and following the movement of special programs into DEB.

3) These additional PIs were actually “new” people who had not submitted to DEB since at least 2006; we saw about 50% more new people in each of 2010, 2012, and 2013 than was typical in previous years. But 2014 looks more like the longer-term norm.

4) The stimulus funding had a big, but temporary, effect by allowing an extra ~200 persons to be funded in both 2009 and 2010. While the effect on per PI success was large in 2009, it was much less in 2010 because of the 1000 additional applicants that year.

5) Excepting the stimulus years, the number of persons funded by research grants doesn’t show a trend or even all that much variation over this span: ~420 unique persons per year.

6) The growth in unique PIs includes both an absolute increase and an increasing proportion of female investigators among applicants, although the temporal range is small and the female proportion of applicants has yet to exceed 28%. At the same time, women have generally experienced a per-person success rate (17.7%) similar to that of the applicant population as a whole (16.7%).

PerPI_Success_2014_2

A Quick Sensitivity Check

There’s a legitimate question as to whether counting PIs and CoPIs provides the best metric of success. Perhaps we should count just PIs? This is what the NSB report does. However, at the preliminary proposal stage, with only a single proposal cover page per project team, there are many instances of collaborative PIs that appear as CoPIs as well as collaborative PIs and CoPIs that don’t appear on a cover page at all. The constraints of the FastLane submission system at the preliminary proposal stage generally lead to undercounting total participants and artificially inflating the balance of CoPIs relative to PIs[iii]. Counting only PIs causes two problems: 1) it ignores a portion of the population at the preliminary proposal stage that would have been counted on full proposals and 2) it would artificially raise the per-PI success rate under the two-stage process relative to the pre-2012 submission process. So, to reflect funding reality as best we can, we cast a wide net and include everyone from the cover pages in the calculations above. However, we can also look to see if the numbers come out any differently if we constrain our calculations to only PIs. Other than the counts being somewhat smaller, the per-person success rates are generally not changed and are tightly correlated with results shown above.

Fiscal Year 2006 2007 2008 2009 2010 2011 2012 2013 2014
Total Unique PIs-Only Applied 1299 1322 1358 1419 1973 1706 2390 2305 2140
Unique Women Applied 314 303 321 366 506 458 684 687 660
Total Unique PIs-Only Awarded 232 292 276 397 361 250 266 261 257
Unique Women Awarded 56 63 69 113 108 66 74 76 63
Per-Person Success Rate 17.9% 22.1% 20.3% 28.0% 18.3% 14.7% 11.1% 11.3% 12.0%
Per-Woman Success Rate 17.8% 20.8% 21.5% 30.9% 21.3% 14.4% 10.8% 11.1% 9.5%

PerPI_Success_2014_3

Based on the tight coupling of these measures, we continue with our analysis of per-person success using both PIs and CoPIs.

3-Year Window Per-Person Success Rate

In comparison to the annual success rate data, many of the details noted above are paved over by the 3-year window method. We’re not disparaging this method; it is quite useful, especially during steadier budget times. Because the typical grant lasts 3 years, the 3-year success rate window roughly measures the percent of the active PI population that could be continuously funded under current expectations for the size and duration of grants. However, when multiple shocks to the system occur within the reporting period, as happened here, it can generate misperceptions.
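
To make the windowing concrete, here is a minimal sketch of the 3-year calculation; it reuses the per-year sets of unique people described in Part 3 (the data below are toy values, invented for illustration):

    # Per-fiscal-year sets of unique PIs/CoPIs who applied and who were awarded (toy data).
    applied_by_year = {2012: {"A", "B", "C"}, 2013: {"B", "D"}, 2014: {"A", "E"}}
    awarded_by_year = {2012: {"A"}, 2013: {"B"}, 2014: {"A"}}

    def window_success_rate(start_year, length=3):
        years = range(start_year, start_year + length)
        applied = set().union(*(applied_by_year[y] for y in years))
        awarded = set().union(*(awarded_by_year[y] for y in years))
        return len(awarded) / len(applied)

    # window_success_rate(2012) counts each person once across 2012-2014, so "A",
    # funded in both 2012 and 2014, contributes only a single awardee to the window.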

Fiscal Window 2006-2008 2007-2009 2008-2010 2009-2011 2010-2012 2011-2013 2012-2014
DEB Unique PIs and CoPIs Applied 4110 4303 5221 5563 6798 7354 7790
Unique PIs and CoPIs Awarded 1130 1361 1511 1475 1301 1162 1197
Per-Person Success Rate 27.5% 31.6% 28.9% 26.5% 19.1% 15.8% 15.4%
Per-Person Success Rate (PIs-Only) 28.2% 32.5% 29.7% 27.8% 20.7% 17.9% 17.5%
NSF-wide (From NSB Report) Per-Person Success Rate (PIs-Only, excludes preliminary proposals) 37% 40% 40% 38% 35% 35% TBD

In this table, the windows affected by the extra stimulus (ARRA) funding are those that include FY2009 or FY2010, and the windows affected by the new applicants to the preliminary proposal system are those that include FY2012 or later. The 2010-2012 window sits at the intersection of the stimulus-elevated award numbers and the preliminary proposal-driven increase in applicants. What we see here is that the pre-stimulus (2006-2008) and post-stimulus (2011-2013 and 2012-2014) awardee numbers are quite similar. However, the applicant numbers have grown substantially, reflecting the influx of new PIs in response both to the stimulus and to the preliminary proposal system. This increase in the number of unique PIs/CoPIs applying in a given 3-year window drives the lower per-PI success rate.

Notably, DEB’s per-person success rate is consistently lower than the NSF-wide number but does follow the same pattern across the ARRA funding windows. The exclusion of preliminary proposal PIs from the NSF-wide counts leads to the increasing disparity between DEB and NSF-wide success rates from the 2010-2012 window onward.

Award Distribution

We can also compare the annual and 3-year window measures to gain insight into another aspect of per-person success rate. A relevant concern we often hear is that “the same well-funded people just get funded over and over again.” If that were true, we would expect persons funded in year 1 of a window to be funded again in year 2 or year 3, so the count of unique awardees in the 3-year window would be smaller than the sum of the annual counts of unique awardees (i.e., Dr. X is counted once in the 3-year window measure but twice, once in year 1 and once in year 3, by the annual measure). But if grants were spread out (and thus fewer PIs had overlapping/continuous funding), there would be many fewer repeat PIs, so the sum of the annual counts would be much closer to the 3-year window count. In our case we have:

Fiscal Window 2006-2008 2007-2009 2008-2010 2009-2011 2010-2012 2011-2013 2012-2014
3-Year Count of Unique PI/CoPI Awardees 1130 1361 1511 1475 1301 1162 1197
Sum of Annual Unique PI/CoPI Awardee Counts 1225 1493 1645 1613 1417 1271 1293

 

What this tells us is that fewer than 10% of awarded PIs in any 3-year window are repeat awardees during that period (~1.5 – 3.1% of all PIs who apply during that period).
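
The quick arithmetic behind that statement: the amount by which the summed annual counts exceed the windowed count is the number of repeat appearances by awardees within the window. A minimal check against the table above:

    # Unique awardee counts from the table above.
    window_unique = {"2006-2008": 1130, "2007-2009": 1361, "2008-2010": 1511, "2009-2011": 1475,
                     "2010-2012": 1301, "2011-2013": 1162, "2012-2014": 1197}
    annual_sum = {"2006-2008": 1225, "2007-2009": 1493, "2008-2010": 1645, "2009-2011": 1613,
                  "2010-2012": 1417, "2011-2013": 1271, "2012-2014": 1293}

    for window, unique in window_unique.items():
        repeats = annual_sum[window] - unique       # appearances beyond a person's first award year
        print(window, f"{repeats / unique:.1%}")    # roughly 8-10% of awardees in every window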

If we step back and consider the whole 9-year period, we still find that the majority of PIs are funded only once.

PerPI_Success_2014_4

Even if they were all 5-year grants, continuous funding of a lab from DEB research grants alone is extremely unlikely for the vast majority of PIs.

 

Concluding Thoughts

1) The number of people being supported on DEB research grants (~420 persons on new grants per year) hasn’t changed much over this period, except for the temporary shock of the economic stimulus.

2) The stimulus and the 3-year method of smoothing really mess with the general perception of funding rates. (We actually hadn’t thought about that much except as the one-year outlier we usually label in our posts. This was eye-opening in that regard.)

3) Funding rates, both per-person and per-proposal, are being driven down by increases in the applicant/application pool: primarily growth in actual participant numbers, though some intensification of per-person activity is also possible.

4) Of 11,789 unique PI/CoPI applicants, only 2,970 (25% of all applicants) received any funding over the 9-year period examined. Of those 2,970 to receive funding, only 772 received multiple awards (26% of awardees, 6% of all applicants) that could potentially maintain continuous “funding” over this period. Any person applying to DEB’s competitive research programs is unlikely to be funded, and much less likely to maintain continuous support for a lab from this single funding source.

5) Coming back to our original motivation for this post, per-person success rates in DEB were consistently ~10 percentage points lower than the NSF-wide figures in the years leading up to the preliminary proposal system. The exclusion of preliminary proposals from NSF-wide statistics has only deepened the apparent magnitude of this disparity in recent years and has even altered the trajectory of PI participant counts for the agency as a whole.

[i] The 2015 version of the report, with NSF-wide numbers through fiscal year 2014, should be arriving soon.

[ii] Why are preliminary proposals excluded?

The short answer is: the records don’t neatly match up.

The longer answer is: beyond the major issue that the entire process, from receipt of preliminary proposals through decisions on the related cohort of full proposals, crosses fiscal years and so defies straightforward binning, the path from individual preliminary proposal to award can be surprisingly non-linear. Our ability to accommodate these complexities comes at the expense of our ability to enforce strong rules ensuring continuity of the data you provide to us. Collaborative proposals are a prime example. In many cases, not all PIs and CoPIs are actually listed on the cover page of the preliminary proposal. When a full collaborative proposal is invited, it results in several different cover pages that each contain a different set of names, so there’s no guaranteed 1-to-1 mapping of PIs across the entire process. Also, the basic ability to associate a full proposal with a preliminary proposal is tied to the “institution” that is the official owner of the proposals (not the PI). So if a PI changes institutions, or a collaboration reorganizes, or any number of other things that happen quite regularly come to pass, the system doesn’t allow the full proposal to be linked to the actual preliminary proposal record. There are also people who receive an invite but then elect not to submit a full proposal for various reasons. On top of that, a number of CAREER-eligible PIs will (with or without an invite) submit a CAREER proposal based on their preliminary proposal. The twists and turns are many, and in the choice between accommodating them flexibly and enforcing rigid data quality, we generally come down on the side of broad acceptance.

[iii] This is why we ask you to submit a personnel list by email and to list all of the people on the first page of the preliminary proposal project description, so that reviewers get the full information. Unfortunately, tying those names to FastLane records is not currently practical.