DEB Numbers: FY 2016 Wrap-Up

Fiscal year 2016 officially closed out on September 30. Now that we are past our panels in October and early November, we have a chance to look back and report on the DEB Core Program merit review and funding outcomes for FY 2016.

This post follows the format we’ve used in previous years. For a refresher, and lengthier discussions of the hows and whys of the metrics, you can visit the 2015, 2014, and 2013 numbers.

Read on to see how 2016 compares.

DEB & IOS Preliminary Proposal System Evaluation Update

Our independent evaluation contractor, Abt, closed the PI and Reviewer surveys in mid-November.

At present, they are working on analyzing the survey results. These results will be brought together with analyses of stakeholder interviews and programmatic (proposal, award, and review process) data that Abt has already completed to produce a full evaluation.

We are expecting the final report to be delivered to NSF by the end of February 2017. We are looking forward to sharing the results with you as we are able.

And, to the many of you who were contacted by Abt to take part in the survey, thank you for your time and participation.

Program Announcement: DEB Core Programs & LTREB Solicitations Updates

Updated guidelines are now available for submissions under the two-stage DEB preliminary/full proposal system. Both DEB Core Programs and Long Term Research in Environmental Biology (LTREB) have been updated.

The new DEB Core Programs publication is NSF 17-512[i].

The new LTREB publication is NSF 17-513[ii].

Please read these guidelines if you plan to submit a preliminary proposal.

In this post, we’re providing a brief summary of the notable points and key changes; this summary alone is not sufficient to prepare a submission.

Both solicitations

  • The definition of “Eligible Institutions” has been updated with limits on the eligible institution types. Institution types that do not meet this definition remain eligible as sub-awardees, but cannot be the primary grant recipient.
  • The deadline for submitting the Personnel List Spreadsheet (from a template, submitted by email) has been reduced to 1 business day (from 3 days) after the proposal deadline for both preliminary and full proposals.
  • The purpose and procedures for requesting a full proposal deferral have been updated and clarified.
  • The requirement for full proposals to provide results of prior NSF support has been clarified and emphasized.
  • The guidelines for Letters of Collaboration (to confirm cooperation or involvement of persons or organizations not receiving funding under the proposal) have been updated to clarify the purpose of, and limits on, such letters.

DEB Core Programs

  • The Core Programs solicitation now includes instructions for submission of international collaborative proposals involving eligible collaborators in the UK (via NERC) or Israel (via BSF). These instructions continue the partnerships originally announced in Dear Colleague Letters.
  • The budget cap for the small grants (SG) option has been increased to $200,000.

LTREB

  • The Project Description page limit for RENEWAL proposals has been increased from 8 to 10 pages.

Changes Beyond the DEB Solicitations

IOS

Many of our PIs have research interests that overlap between DEB and the Division of Integrative Organismal Systems (IOS). New submission guidelines for the preliminary proposal system in IOS have also been published as NSF 17-508. Check with IOS and the IOS Blog for additional information.

NSF-wide

Please take note that the NSF general proposal guidelines have also been revised. This information is provided in the NSF Proposal & Award Policies & Procedures Guide (PAPPG), which previously comprised two publications known as the Grant Proposal Guide (GPG) and Award & Administration Guide (AAG). The new version of the PAPPG is a single consolidated guide: NSF 17-1. The guidelines in PAPPG 17-1 apply to proposals submitted or due, or awards made, on or after January 30, 2017. This document contains the full set of general guidelines for PIs, covering everything from proposal preparation to award reporting and close-out.

A summary explanation of the new PAPPG format and changes from the previous edition of the guide can be read here: https://www.nsf.gov/pubs/policydocs/pappg17_1/sigchanges.jsp

These revisions have minimal effect on the requirements for the upcoming DEB preliminary proposal deadline (since the PAPPG comes into force on Jan 30, 2017 – a week after the pre-proposal deadline).

The guidelines in PAPPG 17-1 will apply for invited full proposals (due next August), and other proposals you may be planning to submit to DEB or other NSF programs.

For instance, starting on Jan 30, 2017, any RAPID or EAGER proposals intended for DEB would list the NSF 17-1 PAPPG as the program announcement number on the proposal cover page.


[i] The old solicitation NSF 15-609 is no longer accepting new proposals.

[ii] The old solicitation NSF 16-500 is no longer accepting new proposals.

Preliminary Proposal Evaluation Survey Reminder

TL;DR

Check your inbox.

Check your spam folder.

Complete the survey!

End the reminder messages.

 

Background (if the above doesn’t make sense to you).

This is about the Preliminary Proposal system in use in both NSF BIO’s Division of Environmental Biology and Division of Integrative Organismal Systems.

We are in the midst of an external evaluation of the effects of this system on the merit review process.

We posted an initial notification letter about the stakeholder surveys, and copies of this letter were sent to everyone in the sample ahead of the formal invitations.

The formal survey invitations with the active survey links were sent out by mid-September from the evaluator, Abt Associates.

Reminder emails are also going out and will continue to do so at regular intervals while the survey remains open and incomplete.

If you have been receiving these messages, please complete the survey. If your colleagues have been receiving these messages and have not completed the survey, encourage them to do so.

If you received an invitation to take the survey,

  • Please take the 10 or so minutes to register your responses via the link in the email.
  • Remember that these are single-use individualized links.
  • Your response matters. This isn’t a census: your invitation is part of a stratified random sample selected for inference to the population.

Thank you for your participation!

DEB Numbers: Historical Proposal Loads

Last spring we posted on the per-person success rate and pointed out several interesting findings based on a decade of DEB data. We were seeing a lot of new PIs and, conversely, a lot of PIs who never returned after their first shot. And, the vast majority of PIs who managed to obtain funding are not continuously funded.

This post is a short follow-up to take a bigger picture look at submission rates.

Since preliminary proposals entered the scene, DEB really hasn’t seen much change in the submission pattern: 75% of PIs in any year submit one preliminary proposal and the other 25% submit two (and a small number submit three ideas in a year, if one also counts full proposals to special programs).

Before the preliminary proposals were launched, we ran some numbers on how often people tended to submit. The results were that, in the years immediately prior to preliminary proposals (~2008-2011), around 75% of PIs in a year were on a single proposal submission (25% on two or more). Fewer than 5% of PIs submitted more than two proposals in a year. Further, most PIs didn’t return to submit proposals year after year (either new ideas or re-working of prior submissions); skipping a year or two between submissions was typical. These data conflicted with the perceptions and anecdotes that “everyone” submitted several proposals every year and was increasing their submission intensity. Although recent data don’t support those perceptions, we still wondered if there might be a kernel of truth to be found on a longer time scale. What is the bigger picture of proposal load and submission behavior across BIO?

Well, with some digging we were able to put together a data set that lets us take a look at full proposal research grant submissions across BIO, going all the way back to 1991 when, it seems, the NSF started computerized record-keeping. Looking at this bigger picture of submissions, we can see when changes have occurred and how they fit into the broader narrative of the changing funding environment.

Total BIO full research grant submissions per year (line, right axis) and proportions of individuals submitting 1, 2, 3, 4, 5, or more proposals each calendar year from 1991 to 2014. (Note: 2015 is excluded because proposals submitted in calendar year 2015 are still being processed at the time of writing.)
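As an aside for the data-curious: the quantities in this figure are simple groupby arithmetic over a submission-level dataset. Below is a minimal, hypothetical sketch of that computation in Python; the file name and the `year`, `pi_id`, and `proposal_id` columns are illustrative assumptions, not NSF's actual schema.

```python
import pandas as pd

# One row per (proposal, PI) pairing; columns are assumed for illustration.
submissions = pd.read_csv("bio_full_proposals_1991_2014.csv")

# Number of proposals each individual appears on per calendar year.
per_pi = submissions.groupby(["year", "pi_id"]).size()

# Bin at "5 or more" and compute each bin's share of individuals per year.
shares = (
    per_pi.clip(upper=5)              # 1, 2, 3, 4, or 5+ proposals
    .groupby("year")
    .value_counts(normalize=True)     # fraction of individuals in each bin
    .unstack(fill_value=0.0)
)

# Total proposal load per year (the line plotted on the right axis).
totals = submissions.drop_duplicates("proposal_id").groupby("year").size()
print(shares.tail(), totals.tail())
```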

 

1990s: Throughout the 1990s BIO received about 4000 proposals per year. This period of relative stability represents the baseline for more than a decade of subsequent discussions of increasing proposal pressure. Interestingly, the proportion of people submitting two or more proposals each year grew over this period, but without seeming to affect total proposal load; this could result from either increasing collaboration (something we’ve seen) or a shrinking PI pool (something we haven’t seen). At this time NSF used a paper-based process, so the cost and effort to prepare a proposal was quite high. Then….

2000s: In 2000, FastLane became fully operational and everyone switched to electronic submission. BIO also saw the launch of special programs in the new Emerging Frontiers division. In a single year, it became easier to submit a proposal and there were more deadlines and target dates to which one could potentially submit. The new electronic submission mechanism and new opportunities likely both contributed to increased submissions in subsequent years.

Following the switch to FastLane, from 2001 to 2005, total annual submissions grew to about 50% above the 1990s average and stayed there for a few years. This period of growth also coincided with an increasing proportion of people submitting 2+ proposals. Increasing numbers of proposals per person had only a limited effect on the total proposal load because of continued growth in collaboration (increasing PIs per proposal). Instead, the major driver of proposal increases was the increasing number of people submitting proposals. This situation was not unique to BIO.

This rapid growth from 2001 to 2005 sparked widespread discussion in the scientific community about overburdening of the system and threats to the quality of merit review, as summarized in the 2007 IPAMM report.

Eventually, however, the community experienced a declining success rate because BIO budgets did not grow to match the 50% increase in proposal submissions. From 2005 to 2008, submissions per person seemed to stabilize, and total submissions peaked in 2006. We interpret this as a shift in behavior in response to decreasing returns for proposal effort (a rebalancing of the effort/benefit ratio for submissions). It would have been interesting to see if this held, but….

2009/2010: In 2009 and 2010, BIO was up another ~1000 proposals over 2006, reaching an all-time high of nearly 7000 proposal submissions. These were the years of ARRA, the economic stimulus package. Even though NSF was very clear that almost all stimulus funding would go toward proposals that had already been reviewed (from 2008) and that we wouldn’t otherwise be able to afford, there was a clear reaction from the community. It appears that the idea of more money (or less competition) created a perception that the effort/benefit relationship may have changed, leading to more proposals.

2011: We see a drop in 2011. It is plausible that this was the realization that the ARRA money really was a one-time deal, there were still many more good proposals than could be funded, and that obtaining funding hadn’t suddenly become easier. As a result, the effort/benefit dynamic could be shifting back; or, this could’ve been a one-time off year. We can’t know for sure because…

2012: Starting in 2012 IOS and DEB, the two largest Divisions in BIO, switched to a system of preliminary proposals  to provide a first-pass screening of projects (preliminary proposals are not counted in the chart). This effectively restricted the number of full proposals in the two largest competitions in BIO such that in 2012, 2013, and 2014 the full proposal load across BIO dropped below 5000 proposals per year (down 2000 proposals from the 2010 peak). The proportion of individuals submitting 2+ full proposals per year also dropped, consistent with the submission limits imposed in DEB, IOS, and MCB. PIs now submitting multiple full proposals to BIO in a given year are generally submitting to multiple programs (core program and special program) or multiple Divisions (DEB and [IOS or MCB or EF or DBI]) and diversifying their submission portfolios.

In summary, the introduction of online and multi-institutional submissions via FastLane kicked off a decade of change marked by growth in proposal submissions and per-PI submissions to BIO. The response, a switch to preliminary proposals in IOS and DEB, caused a major (~1/3) reduction in full proposals and also a shift in the proportion of individuals submitting multiple proposals each year. In essence, the pattern of proposal submission in BIO has shifted back to what it was like in the early 2000s. However, even with these reductions, it is still a more competitive context than the 1990s baseline, prior to online submissions via FastLane.

Spring 2016: DEB Preliminary Proposal Results

Notices

All PIs should have received notice of the results of their 2016 DEB Core Program preliminary proposals by now. Full proposal invitation notices were all sent out by the first week of May (ahead of schedule), giving those invited PIs a solid three months to prepare their full proposals. ‘Do Not Invite’ decisions began going out immediately thereafter and throughout the rest of May.

If you haven’t heard, go to fastlane.nsf.gov and log in. Then, select the options for “proposal functions” then “proposal status.” This should bring up your proposal info. If you were a Co-PI, check with the lead PI on your proposal: that person is designated to receive all of the notifications related to the submission.

If you are the lead PI and still have not heard anything AND do not see an updated proposal status in FastLane, then email your Program Officer/Program Director. Be sure to include the seven-digit proposal ID number of your submission in the message.

Process

All told, DEB took 1474 preliminary proposals to 10 panels during March and April of 2016. A big thank you to all of the panelists who served and provided much thoughtful discussion and reasoned recommendations. Note: if you’re interested in hearing a first-hand account of the DEB preliminary proposal panel process, check out this great post by Mike Kaspari.

Panelists received review assignments several weeks prior to the panels and prepared individual written reviews and individual scores. During the panel, each proposal was discussed by the assigned panelists and then presented to the entire panel for additional discussion and assignment to a rating category. Panels were presented two recommendation options for each preliminary proposal: Invite or Do Not Invite. Following discussion, the assigned panelists prepared a panel summary statement to synthesize the key points of the panel discussion and rationale for the assigned rating.

Both the individual written reviews and the panel summary statement are released to the PI of the preliminary proposal.

As we’ve discussed previously, the final decisions on the preliminary proposals are made by the programs with concurrence of senior management. These decisions take into account the panel recommendations, especially the substance of the discussions, as well as expectations for future award-making capacity based on the availability of funds, additional expected proposal load at the full proposal stage, and portfolio balance issues.

Results

Results by DEB cluster (the Invite / Do Not Invite / No Consensus columns are panel recommendations):

DEB Cluster   Total Reviewed   Invite   Do Not Invite   No Consensus   Total Invited   Invite Rate
SBS                 289            79         210              0              85           29%
EP                  440            94         346              0             101           23%
PCE                 439           122         315              2             110           25%
ES                  306            94         212              0              86           28%
DEB Total          1474           389        1083              2             382           26%

These numbers are consistent with our goal of inviting the most promising projects while targeting a success rate of approximately 25% for the resulting full proposals that will be submitted this summer.

Big Picture

Comparing to the previous rounds of preliminary proposals…

              2012   2013   2014   2015   2016
Reviewed      1626   1629   1590   1495   1474
Invited        358    365    366    383    382
Invite Rate    22%    22%    23%    26%    26%

…we see that the system has recovered somewhat from the initial flood of submissions. Moreover, the invite rate, and subsequent full proposal success rate, has stabilized in a range that reasonably balances against the effort required to produce each submission.

DEB Numbers: Success Rates by Merit Review Recommendation

We recently received a comment from a panelist (paraphrasing): how likely are good proposals to get funded? We’ve previously discussed differences between the funding rates we report directly to you from panels and the NSF-wide success rate numbers reported on our website.  But the commenter was interested in an even more nuanced question: to what extent do award decisions follow the outcomes of merit review? This is a great topic for a post and, thanks to our Committee of Visitors review last year, we already have the relevant data compiled. (So this is really the perfect data-rich but quick post for panel season.)

To address this question, we need to first define what a “good proposal” is.

In our two-stage annual cycle, each project must pass through review at least twice before being awarded: once as a preliminary proposal, and once as an invited full proposal.

At each stage, review progresses in three steps:

  • Three individual panelists independently read, review, and score each proposal prior to the panel. A single DEB panelist is responsible for reviewing an assigned subset of all proposals at the panel. This is the same for preliminary proposals and full proposals. Full proposals also receive several non-panelist “ad hoc” reviews prior to the panel.
  • The proposal is brought to panel where the panelists discuss the proposal and individual reviews in relation to each other and in the context of the rest of the proposals in the panel to reach a consensus recommendation. This is the same for preliminary proposals and full proposals.
  • The Program Officers managing the program take into consideration the reviews, the recommendations of the panel(s) that assessed the proposal, and their portfolio management responsibilities to arrive at a final recommendation. This is the same for preliminary proposals and full proposals.

In this case, since we are discussing the Program’s actions after peer review, we are defining as “good” anything that received a positive consensus panel recommendation. Initially, the label of “good” will be applied by the preliminary proposal panel. Then, at the full proposal panel it will receive a second label, which may or may not also be “good”. A “good” recommendation for either preliminary or full proposals includes any proposal not placed into the lowest (explicitly negative) rating category. The lowest category usually has the word “not” in it, as in “Do Not Invite” or “Not Fundable”. All other categories are considered “good” recommendations, whether there is a single positive category (e.g., “Invite”) or several ordinal options conveying varying degrees of enthusiasm (e.g., “high priority”, “medium priority”, “low priority”).
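To make that rule concrete, here is a toy sketch (our shorthand for this post, not actual DEB tooling) of the “good” test; it leans on the observation above that the lowest category usually contains the word “not”.

```python
def is_good(panel_rating: str) -> bool:
    """Heuristic: a rating is "good" unless it is the explicitly negative
    bottom category, which typically contains the word "not"."""
    return "not" not in panel_rating.lower().split()

for rating in ["Invite", "High Priority", "Medium Priority", "Low Priority"]:
    assert is_good(rating)
for rating in ["Do Not Invite", "Not Fundable", "Not Competitive"]:
    assert not is_good(rating)
```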

To enable this analysis, we traced the individual review scores, panel review recommendations, and outcomes for proposals from the first three years of the DEB preliminary proposal system (i.e., starting with preliminary proposals from January 2012 through full proposals from August 2014).

As we’ve reported previously, preliminary proposal invitation rates are between 20% and 30%, and between 20% and 30% of invited full proposals are funded, leading to end-to-end funding rates around 7%. But, as our commenter noted, that obscures a lot of information and your individual mileage will vary. So…

How likely are “good” proposals to get funded?

In the table below, you can see that the overall invitation rate for preliminary proposals is 23%, but the rate looks very different depending on how well a proposal performed in the panel[i].

Preliminary Proposal Outcomes by Panel Recommendation:

Pre-Proposal Panel Rating   % of Proposals Receiving Rating   Not Invited   Invited   Invite Rate
High (Good)                               19%                        22         879          98%
Low (Good)                                 5%                       100         141          59%
Do Not Invite                             76%                      3597          74           2%
Total                                    100%                      3719        1094          23%
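For the arithmetic-minded, the derived columns in this table reduce to two ratios per row. A small sketch in Python, with the counts transcribed from the table above:

```python
# (not_invited, invited) counts per preliminary panel rating, from the table.
counts = {
    "High (Good)": (22, 879),
    "Low (Good)": (100, 141),
    "Do Not Invite": (3597, 74),
}
grand_total = sum(n + i for n, i in counts.values())  # 4813 preliminary proposals

for rating, (not_inv, inv) in counts.items():
    row_total = not_inv + inv
    share = row_total / grand_total   # "% of Proposals Receiving Rating"
    invite_rate = inv / row_total     # "Invite Rate"
    print(f"{rating}: {share:.0%} of proposals, invite rate {invite_rate:.0%}")
```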

This stage is a major winnowing of projects. On the one hand, we tend to invite most of what the panels recommend. On the other hand, a preliminary proposal that isn’t well rated (and so falls outside our working definition of “good”) is highly unlikely to see the full proposal stage. There is a low, 2%, invite rate for proposals that the panels recommended as Do Not Invite. This is a measure of the extent to which program officers disagree with panelists and choose to take a chance on a particular idea or PI, based on their own knowledge of submission history and portfolio balance issues.

From these invitations, the programs receive full proposals. After review, programs award approximately 25% of the full proposals, but again the outcome is strongly influenced by the panel ratings.

Full Proposal Outcomes by Panel Recommendation:

Full Proposal Panel Rating   % of Proposals Receiving Rating   Declined   Awarded   Funding Rate
High (Good)                                17%                      30        122           80%
Medium (Good)                              23%                     115         98           46%
Low (Good)                                 21%                     165         21           11%
Not Competitive                            39%                     349          7            2%
Total                                     100%                     659        248           27%

Program Officers are faced with a greater responsibility for decision-making at the full proposal stage. Whereas preliminary proposal panels gave the nod (High or Low positive recommendations) to only ~23% of submissions, full proposal panels put 551 of 907 proposals into “fundable” categories (Low, Medium, or High). Since this is more than twice as many as the programs could actually fund,[ii] the work of interpreting individual reviews and panel summaries, and accounting for portfolio balance, plays a greater role in making the final cut. Also note that these are the cumulative results of three years of decision-making by four independently managed program clusters, so “divide by 12” to get a sense of how common any result is for a specific program per year.

Ultimately, the full proposal panel rating is the major influence on an individual proposal’s likelihood of funding and the hierarchy of “fundable” bins guides these decisions:

Success rates of DEB full proposals when categorized by preliminary proposal and full proposal panel recommendations.

While funding decisions mostly ignore the preliminary proposal ratings, readers may notice an apparent “bonus” effect in the funding rate for “Do Not Invite” preliminary proposals that wind up in fundable full proposal categories. For example, of 15 preliminary proposals that were rated “Do Not Invite” but were invited and received a “Medium” rating at the full proposal stage, 10 (67%) were funded compared to 45% and 42% funding for Medium-rated full proposals that preliminary proposal panelists rated as High or Low priority, respectively.  However, this is a sample size issue. Overall the numbers of Awarded and Declined full proposals are not associated with the preliminary proposal recommendation (Chi-Square = 2.90, p = 0.235).
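For reference, the association test quoted above is an ordinary chi-square test on a 3x2 contingency table (preliminary rating by award outcome). The sketch below shows the shape of that computation with scipy; the row totals (728, 117, and 62 full proposals; 248 awards overall) come from this post, but the per-row award split is a made-up placeholder, since the post does not report those cells.

```python
from scipy.stats import chi2_contingency

# Columns: [awarded, declined]; rows: preliminary panel rating.
# Row totals match the post (728 High, 117 Low, 62 Do Not Invite; 248
# awards overall), but the split of awards across rows is hypothetical.
observed = [
    [205, 523],  # High
    [28, 89],    # Low
    [15, 47],    # Do Not Invite
]
chi2, p, dof, _expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")  # dof = 2 for a 3x2 table
```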

 

Does Preliminary Proposal rating predict Full Proposal rating?

This is a difficult question to answer since there is nothing solid to compare against.

We don’t have a representative set of non-invited full proposals that we can compare to say “yes, these do fare better, the same as, or worse than the proposals that were rated highly” when it comes to the review ratings. What we do have is the set of “Low” preliminary proposals that were invited, and the small set of “Do Not Invite” preliminary proposals that were invited by the Program Officers against the panel recommendations. However, these groups are confounded by the decision process: these invites were purposely selected because the Program Officers thought they would be competitive at the full proposal stage. They are ideas we thought the panels missed or selected for portfolio balance; therefore, they are not representative of the entire set of preliminary proposals for which the panels recommended Low or Do Not Invite.

Distribution of Full Proposal Panel Ratings versus Preliminary Proposal Ratings (the last four columns are the full proposal panel rating):

Pre-Proposal Panel Rating   # Received as Full Proposals   High   Medium   Low   Not Competitive
High                                    728                 19%     24%    20%        37%
Low                                     117                 10%     21%    20%        50%
Do Not Invite                            62                  8%     24%    23%        45%

So, even given the active attempts to pick the best proposals out of the “Low” and “Do Not Invite” preliminary proposal categories, those invited based on “High” ratings were twice as likely to wind up in the “High” category at the full proposal stage as those invited from the Low or Do Not Invite preliminary proposal categories. And, those invited from the Low or Do Not Invite categories were somewhat more likely to wind up in Not Competitive. Moreover, the score data presented below provide additional evidence that this process is, in fact, selecting the best proposals.

 

What do individual review scores say about the outcomes and different panel ratings?

We expect the full proposal review stage to be a more challenging experience than the preliminary proposal stage because most of the clearly non-competitive proposals have already been screened out. Because of this, full proposals should present a tighter grouping of reviewer scores than preliminary proposals. The distribution of average proposal scores across the two stages is shown below. We converted the “P/F/G/V/E” individual review scores to a numerical scale from P=1 to E=5, with split scores as the average of the two letters (e.g., V/G = 3.5). As a reminder, the individual reviewer scores are submitted prior to the panel, without access to other reviewers’ opinions and with each reviewer seeing only a relatively small number of proposals. So the average rating (and the spread of individual scores for a proposal) is mostly a starting point for discussion and not the end result of the review[iii].

Distribution of mean review scores at different points in the DEB core program review process.
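The letter-to-number conversion described above is mechanical; here is a minimal sketch of it (the function names are ours, for illustration):

```python
SCORE = {"P": 1, "F": 2, "G": 3, "V": 4, "E": 5}

def score_to_number(raw: str) -> float:
    """Convert an individual review score such as "E" or "V/G" to a number;
    split scores are the average of the two letters."""
    letters = raw.strip().upper().split("/")
    return sum(SCORE[letter] for letter in letters) / len(letters)

def mean_score(review_scores: list[str]) -> float:
    """Average converted score of a proposal's individual reviews."""
    return sum(map(score_to_number, review_scores)) / len(review_scores)

assert score_to_number("V/G") == 3.5
print(round(mean_score(["E", "V", "V/G"]), 2))  # (5 + 4 + 3.5) / 3 = 4.17
```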

The preliminary proposal scores are distributed across the entire spectrum, with the average review scores for most in the 3 to 4 range (a Good to Very Good rating). That we don’t see much in the way of scores below 2 might suggest pre-selection on the part of applicants or rating inflation by reviewers. Invitations (and high panel ratings) typically go to preliminary proposals with average scores above Very Good (4). Only a few invitations are sent out for proposals between Very Good and Good or lower.

The average scores for full proposals are more evenly distributed than the preliminary proposal scores, with a mean and median around Very Good. The eventual awards draw heavily from the Very Good to Excellent score range, and none were lower than an average of Very Good/Good. And, while some full proposals necessarily performed worse than they did at the preliminary proposal stage, there are still roughly twice as many full proposals with average scores above Very Good as the total number of awards made, so there is no dearth of high-performing options for award-making.

So, what scores correspond to different panel ratings?

Average Review Score of Invited Full Proposals by Panel Recommendation (columns are the full proposal panel rating):

Pre-Proposal Panel Rating   High   Medium   Low    Not Competitive   Overall
High                        4.41    4.08    3.76        3.53           3.88
Low                         4.32    4.13    3.88        3.52           3.81
Do Not Invite               4.42    4.00    3.75        3.44           3.73
Overall                     4.40    4.08    3.78        3.53           3.87

There’s virtually no difference in average full proposal scores among groups of proposals that received different preliminary proposal panel ratings (rows, above). This further supports the notion that the full proposals are being assessed without bias based on the preliminary proposal outcomes (which are available to full proposal panelists after individual reviews are written). Across the columns, there is approximately a whole letter-score difference between the average scores of highly rated full proposals (E/V) and Not Competitive full proposals (V/G), and the average score for each rating is distinct.

 

About the Data:

The dataset used in this analysis was originally prepared for the June 2015 DEB Committee of Visitors meeting. We traced the review outcomes of preliminary proposals and subsequent full proposals over the first 3 cycles of proposal review. This dataset included the majority of proposals that have gone through the 2-stage review in DEB, but is not a complete record because preliminary proposal records are only tied to full proposals if this connection is successfully made by the PI at the time of full proposal submission. We discussed some of the difficulties in making this connection on DEBrief in the post titled “DEB Numbers: Per-person success rate in DEB”.

There are 4840 preliminary proposal records in this dataset; 1115 received invitations to submit full proposals. Of those 1115, 928 (83%) submitted full proposals and successfully identified their preliminary proposal. Full proposal records are lacking for the remaining 187 invitees; this is a combination of (1) records missing the necessary links and (2) a few dozen invitations that were never used within the window of this analysis. For full proposal calculations, we considered only those proposals that had links and had been processed to a final decision as of June 2015 (907 records), when the data were captured.

The records followed the lead proposal of collaborative groups/projects in order to maintain a 1 to 1 relationship of all records across preliminary and full proposal stages and avoid counting duplications of review data. The dataset did not include full proposals that were reviewed alongside invited proposals but submitted under other mechanisms that bypass the preliminary proposal stage such as CAREER, OPUS, and RCN.

Data Cleaning: Panel recommendations are not required to conform to a standard format, and the choice of labels, number of options, and exact wording vary from program to program and have changed over time in DEB. To facilitate analysis, the various terms were mapped onto a 4-level scale (High / Medium / Low / Not Invite (or Not Competitive)), which was the widest scale used by any panel in the dataset; any binary ratings were matched to the top and bottom of the scale. Where a proposal was co-reviewed in two or more panels, the most positive panel rating was used for this analysis.
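As an illustration of that mapping (the label strings here are examples, not the complete vocabulary used across DEB panels over the years), the cleaning step could look like:

```python
# Normalize panel vocabularies onto the common 4-level ordinal scale;
# higher numbers are more positive. Binary scales map to top and bottom.
LEVEL = {
    "high priority": 3, "invite": 3,
    "medium priority": 2,
    "low priority": 1,
    "do not invite": 0, "not fundable": 0, "not competitive": 0,
}

def normalized_rating(panel_labels: list[str]) -> int:
    """Map each co-reviewing panel's label onto the 4-level scale and
    keep the most positive rating, as described above."""
    return max(LEVEL[label.strip().lower()] for label in panel_labels)

assert normalized_rating(["Invite"]) == 3                           # binary top
assert normalized_rating(["Not Fundable", "Medium Priority"]) == 2  # co-review
```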

[i] Cases where a highly recommended preliminary proposal was Not Invited were typically because the project received funding (either we were still waiting on our budget from the prior year and the PI re-submitted, or the same work was picked up by another funding source). So, the effective invite rate for “high priority” recommendations is ~100%. The middle “Low” priority rating was used in only a limited set of preproposal panels in the first years of preproposals; at this point, all DEB preproposal panels use two-level “Invite or Do Not Invite” recommendations.

[ii] 248 is fewer than we actually funded from the full proposal panels: when CAREER, OPUS, RCN, and proposals that were not correctly linked to preproposal data are accounted for, we awarded a bit over 300 core program projects in FYs 2013, 2014, and 2015, or about 100 new projects per year.

[iii] If the program were to be purely conservative and follow the scoring exactly in making award decisions, there would have been no awards with an average score below 4.2 (Very Good+) and even then half of the proposals that averaged Very Good (4) or better would go unfunded.

Spring 2016 Progress Update

You may have noticed that it has been quiet on the blog lately.

We’re in the middle of processing the 1500 or so preliminary proposals we received at the end of January. After reviewing them for completeness and relevance to the program and sorting out the overlap between us and IOS, ~1450 proposals were accepted for review and assigned to 10 panels, which will run in March and April.

At this point in the review process, many panelists have received their individual review assignments and access to the proposals, with more to come. And, over the coming weeks, waves of panelists will begin descending upon NSF to meet and discuss the preliminary proposals. When the dust settles, Program Officers will meet together to compare notes and develop invite/do not invite recommendations.

As in prior years, Invite notices will receive first priority and be batched by program (4 program clusters = 4 batches). For example, everyone receiving an invitation from Population and Community Ecology (PCE) should hear back over a span of a few days, and everyone receiving an invitation from Systematics and Biodiversity Sciences (SBS) should hear back over a span of a few days. But the invites for PCE and for SBS are not likely to go out on the same days. This is to maximize the amount of time available to invitees to prepare a full proposal and minimize any delays in notification relative to others competing in the same cluster at the full proposal stage.

Do Not Invite notices will come out after the invites. These will also be batched, but less strictly, so the span of notification may not be as narrow. There is likely to be a week or more between the good-news and bad-news notices for a particular program.

We are targeting all good news to be delivered by mid-May and all bad news by the end of May.

As always, log in and check FastLane for updates first. Updates will show there even if you have a bad email address on file and the notice doesn’t reach you. If you are a Co-PI, talk to your lead PI first: they receive all the correspondence. And, if June 1 comes and you haven’t heard at all, that is when the lead PI should drop us an email.

Prepare Now! Don’t Panic Later. [Updated x2]

Questions and Reminders about DEB’s Preproposal Deadline

We’re starting to get calls in noticeable volume relating to the upcoming preproposal deadline, and a few common questions are sticking out. So, to make sure everyone has access to these answers, we’re posting them for all to see, below. If your question isn’t addressed here, please leave a comment or send us an email and we’ll respond to you and post the response here if it’s sufficiently generalizable.
