DEB Numbers: CAREER Redux


Last August we provided a short post looking at CAREER proposal submissions across BIO and what effects the preliminary proposal system may have had on those numbers. In response to a question on our FY 2014 Wrap-Up post, we’re taking this opportunity to share a little bit more about the CAREER program.

Please read the previous CAREER post if you need an intro to the program.

Overall CAREER Trends

As we discussed before, CAREER submissions in DEB increased markedly from FY 2007 to FY 2009, creating a new normal over the last 5 years; annual submissions are now roughly double what they were in the early 2000s.

At the same time, the annual number of CAREER awards in DEB has also increased. However, CAREER awards have always been a small part of the DEB portfolio: a single-digit percentage of total awards for many years.

CAREER2fig1

While these award numbers seem small, CAREER awards have increased since 2008 relative to the total number of awards made to our core programs.

CAREER2fig2

This pattern results from a combination of an increase in absolute numbers of CAREER awards and declining total numbers of other core program awards.

The funding rate (Number Awarded/Number Reviewed) for DEB CAREER proposals has been fairly flat over the long term, though with some wild swings concurrent with the recent high submission counts and budgetary changes. Over the 13-year period examined here, 914 CAREER proposals were reviewed in DEB and we made 108 awards, a funding rate of roughly 12%.
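As a quick check, the overall rate quoted above can be reproduced directly from the two counts (a minimal sketch; the totals are the ones given in this post):

```python
# Overall DEB CAREER funding rate, FY2002-FY2014 (counts from the post).
reviewed = 914  # CAREER proposals reviewed in DEB over 13 years
awarded = 108   # CAREER awards made over the same period

funding_rate = awarded / reviewed  # Number Awarded / Number Reviewed
print(f"{funding_rate:.1%}")       # prints "11.8%", i.e. roughly 12%
```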

CAREER2fig3

Even though the overall funding rate has been on a downward trend for the DEB Core Programs (see FY 2014 Wrap-up), CAREER proposals have suffered less and have closed the gap with regular proposals that was evident before FY 2009 (the spike in 2009 was due to ARRA, the stimulus funding). In the long run, the CAREER proposal funding rate is subject to the same external drivers as the overall success rate: submission counts and appropriated budgets. That adds a great deal of uncertainty to any prognostication based on current trends alone.

Under the preliminary proposal system (FYs 2013 and 2014), CAREER proposals fare about as well as regular proposals going through the entire two-stage process (7-8% funded). However, they fare much worse than the invited full proposals with which they share our fall panels. This backs up a frequent comment we hear from our fall panelists that the CAREER proposals don’t seem to do very well. As a whole, we would expect CAREER proposals to fare worse on average than the pre-screened set of invited full proposals in the same panel, because the pre-screening wasn’t applied to the CAREER projects. At the end of the entire process, though, the system is identifying and prioritizing for funding high-quality CAREER projects at the same rate as regular projects. This suggests that, on the whole, the project ideas in the body of CAREER proposals fare no better and no worse in the DEB review system than the ideas in regular proposals. Neither the myth that CAREERs are “easier” for eligible faculty to obtain nor the opposing claim that they are more prone to being trashed in review is supported by the funding trends.

Cluster Numbers

We can break down the numbers of CAREER proposals and awards by cluster. Below, we’ve put the proposals and awards on separate figures to allow more vertical detail and show the contribution of each cluster to the total counts of CAREER proposals and awards.

CAREER2fig4

Submission numbers for the CAREER solicitation tend to mirror the distribution of Core Program submissions in general: Population and Community Ecology (PCE) receives a plurality, followed closely by Evolutionary Processes (EP), and then Ecosystem Science (ES) and Systematics and Biodiversity Science (SBS). Prior to 2007, ES represented a larger portion of CAREER submissions than at present but has since been eclipsed by growth in submissions to PCE and EP. Our interpretation is that, because CAREER proposals are all single-investigator projects, they have less potential for growth in a community (ES researchers) that places a strong emphasis on collaborative projects.

CAREER2fig5

Award numbers also invite some interesting discussion, within the limits of such small counts. The growth in DEB CAREER awards is primarily due to increases in the number of awards made in the EP and PCE clusters. While the PCE growth has been consistent with the submission growth (12% funded prior to 2009, 12% funded since 2009), the award numbers in EP have grown more than the submissions (6% funded prior to 2009, 16% funded since 2009). There has been little to no change in ES CAREER award counts, matching their lack of submission growth, and the same can be said for SBS.

PI Numbers

Pivoting from the cluster view, we also wanted to present some numbers relating to who applies for and who receives CAREER awards.

Times (re)submitted, 2002-2014    PI Count    Awarded PIs    Success Rate
1                                      414             58             14%
2                                      156             32             21%
3                                       55             18             33%


The first thing to note is that a plurality (if not a majority[i]) of CAREER awardees were funded on their first submission. But a majority of applicants also try only once for a CAREER. The success rate for these one-timers is ~14%. However, the PIs who come back and try again do better (21%). While fewer PIs succeed at each resubmission, the number attempting resubmissions drops off much more quickly, meaning those PIs are funded at much higher rates: someone on a 3rd attempt at a CAREER is more than twice as likely to be funded as someone on their 1st attempt. (On a side note: to the extent we can do so, we observe a similar pattern in regular proposals too. But it’s much harder to accurately track “resubmissions” due to PIs juggling multiple proposals, and changes in personnel, titles, and programs; thus the magnitude of the effect for full proposals is uncertain.)
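The per-attempt rates in the table can be recomputed directly from the PI counts (a small sketch using the numbers above):

```python
# Per-attempt CAREER success rates, recomputed from the table above.
attempts = {  # attempt number: (PIs submitting, PIs awarded)
    1: (414, 58),
    2: (156, 32),
    3: (55, 18),
}

for n, (submitted, funded) in attempts.items():
    print(f"Attempt {n}: {funded / submitted:.0%}")
# Attempt 1: 14%
# Attempt 2: 21%
# Attempt 3: 33%
```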

Also, successful CAREER PIs are not necessarily those who are the newest faculty.

CAREER2fig6

Given the requirements of the CAREER solicitation, it can be advantageous to already have a lab up and running and to have gotten into the groove of being a researcher/educator. Also, prior experience with writing a successful grant helps when you’re attempting to write a successful grant for the challenging criteria of the CAREER program.

Closing Thoughts

One of the important messages here, tough to illustrate directly with charts and tables, is that CAREER Awards have a different purpose than regular research grant awards, and this distinction sometimes eludes PIs and reviewers. CAREER projects emphasize the integration of research and educational activities; this requirement is highly selective for a subset of PIs who are passionate about this professional nexus. In fact, these requirements can be so organizationally difficult to execute that we often advise the newest faculty against applying by this route when they are just getting their first lab together. A major failing of many declined CAREER proposals is inadequate integration of research and educational activities: it might have been a competitive regular proposal, but the PI chose a higher bar and didn’t clear it.

As hard as it is to conceive of and present a strong case for a CAREER proposal, it can also be difficult for reviewers to rate these proposals when they themselves are not experts on the educational integration side of things. That is a good argument for allowing CAREER proposals to come in directly as full proposals because they can benefit from selection of ad hoc reviewers with strengths in those aspects otherwise on the periphery of expectations for a regular proposal. And, journeying into the realm of speculation, one might wonder if the increase in submissions actually contributes to an increased funding rate because reviewers then have a larger pool on which to base their expectations for these projects. Perhaps we hit some critical mass of understanding by panelists that helps overcome misperceptions. Either way, supportive panel reviews are an important part of justifying CAREER award decisions and the charts above should make it clear that we fully expect CAREER awardees to be able to pass that (elevated) bar.

While they are a valuable facet of our award portfolio that we support and try to give every consideration, we don’t have quotas and don’t provide a free pass to funding for CAREER proposals. Any decision to apply for a CAREER should be based on weighing your own variables (personal preferences, professional preparation, research interests, practicability), not simply on someone telling you it would be “good to do”. To put it another way: CAREER proposals aren’t for everyone, and you only get three attempts. If this is something you both want to do and feel you can do, that’s great. But bowing to pressure to submit a CAREER project when it’s not a good fit for you, or before you’re ready, can distract from and delay pursuit of other potentially successful proposals.

Now that we’ve left you pondering whether it makes sense to submit at all, we want to try to close with something more positive for potential CAREER applicants to consider. Think about submitting a preliminary proposal before you submit a CAREER proposal. This is essentially a “no risk” proposition. Regardless of outcome, you will get some feedback on your ideas. This feedback may or may not be directly applicable to a CAREER proposal given the differences but you’ve gained useful experience. Even if you get a preliminary proposal rejection, you wind up with additional data that you can consider to make an informed (not reflexive) decision as to whether you can improve the work and present a strong case for an integrated educational component. And, you might just get an invitation for a full proposal. In that case you have not only gained feedback but a choice as to whether you want to pursue a full proposal or attempt to turn that idea around and transform the project into a full-fledged CAREER.

[i] Caveat: because we didn’t track individuals back before 2002, those who appear once during the early part of the data could be on a 2nd or 3rd submission.

New SEES Dear Colleague Letter: Food, Energy, Water UPDATED


2/12/15 UPDATE:

NSF is hosting a webinar for the FEW Dear Colleague Letter opportunity on February 25, 2015. This will include a live Q&A with Program Officers. Participants must register for the webinar no later than February 24, 2015. Please follow this link to the NSF event website for full details and registration: http://www.nsf.gov/events/event_summ.jsp?cntn_id=134146&WT.mc_id=USNSF_13&WT.mc_ev=click


Dear Colleague Letter: SEES: Interactions of Food Systems with Water and Energy Systems

Please check out the above link to learn about the new funding opportunity. This Dear Colleague Letter provides for Workshops and Supplements, as well as EAGERs for any of you applying to Math and Physical Sciences (MPS), addressing the intersection of food, energy, and water systems.

Information about other ongoing and past SEES investments can be found on the SEES homepage: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504707, and additional information about both SEES and the Food, Energy, Water nexus is available in the President’s FY2016 Budget Request to Congress, details here and here.


DEB Numbers: DEB Panelist Survey Results


What do panelists think of the preliminary proposal process?

Here we present the results of recent surveys of DEB panelists, starting with the first round of preliminary proposal panels in March/April 2012 through the fall 2014 full proposal panels.

This survey was meant to gauge panelist reactions to the preliminary proposal process during the first few years, in a case where baseline opinion data were lacking. Since many of the survey items relate to perceived differences between the new system and panelist experiences prior to 2012, the usefulness of these questions heading into the future will be limited as the pool of panelists continues to grow and memories of those prior experiences fade.

Quick Summary

Over the six panel seasons, more than 70% of panelists completed the survey. The respondent counts were higher for preliminary proposal rounds (~170 respondents per round) than for full proposal rounds (~80 respondents per round), consistent with the larger number of panelists required for preliminary proposals.

The majority of respondents reported an improved panel experience in both preliminary and full proposal panels compared to pre-2012 full proposal panels.

Preliminary proposal panelists overwhelmingly agreed that preliminary proposals are conducive to a quality panel review.

Most panelists perceived no change in the intellectual merit of highly rated proposals.

Panelist Experience

2014panelsurvey.1

A majority of respondents reported their panel experiences under the preliminary proposal system were better than in years prior. This is consistent across all preliminary proposal panels and the first two years of full proposal panels, with slightly more positive responses from preliminary proposal panelists. The latest full proposal panelists were less positive, with a larger proportion reporting “No change” but also a smaller proportion reporting “worse” than in the first two full proposal panel cycles.

Preliminary proposal panels represented a greater departure from prior panels in size, goals, and implementation than full proposal panels, so the potential for changes to panelist experience (better and worse) seemed greater there. This was borne out: a large majority of panelists reported a directional change, and among those, positive experiences greatly outweighed negative ones. For full proposal panels, the major difference from prior panels was that most reviewed proposals had already been through a preliminary proposal review and subsequent revision. We did not expect much change in the overall experience of full proposal panelists, so the extent of panelists reporting positive change is encouraging.

Written comments shed light on what full proposal panelists saw as positive and negative changes to their full proposal panel experience. On the plus side, a greater proportion of time and effort was spent on providing feedback to strong proposals. The flip side was that the invited full proposals necessitated difficult choices and highlighted the discrepancy between worthwhile science and available resources.

2014panelsurvey.2

In addition to their overall experience with the panel, we also asked preliminary proposal panelists about their preparations for the panel. In order to manage the review of the large volume of shorter preliminary proposals, DEB planned for an increased number of assignments per panelist to avoid increasing total panelist burden; this assumed a relationship between proposal length and review work.

The majority of preliminary proposal panelists reported a decreased time burden to prepare for panels, even though the number of proposals per panelist was increased. This indicated to us that we succeeded in balancing the volume/work-per-proposal trade-off. Comments from panelists also indicated a qualitative improvement in their panel preparation and individual review-writing experience. This was generally ascribed to a feeling that, with shorter proposals, less time was required simply to cover the entire proposal, so panelists could instead engage more deeply with the project description and literature. A minority of respondents reported that extra preparation time was needed, citing difficulty in adjusting to the new format and changing old review habits.

View of Preliminary Proposal Mechanism

Bar chart depicting DEB panelist survey response results related to the preproposal format.

Across the 3 rounds of preliminary proposals, we asked panelists to provide feedback on the preliminary proposal format. Over 90% of respondents agreed that the content of the preliminary proposals provides an adequate basis for evaluating the projects. The 2012 panel highlighted two issues for NSF to consider regarding the preliminary proposal content: reviewer expectations needed to be better aligned with the preproposal format, and low-cost projects might be disadvantaged relative to higher-cost competitors. The former was resolved, in part, through experience with a new process and by changes to panelist instructions. The latter provided support for the “small grants” track, adopted for the 2013 preliminary proposal submissions. Since then, panelists have been nearly unanimous in finding the content adequate for review.

We separately asked panelists to weigh in on the adequacy of a 4-page project description. Again, these results are overwhelmingly positive. Based on written comments, some reviewers suggested the page limit would be improved either by adding a page or by setting aside specific lengths for various sub-components to ensure PIs sufficiently address them (e.g., 1 page exclusively for broader impacts, 1 page for graphics, limiting the length of introductory material). Others felt that 4 pages was too long and that, if preliminary proposals are to stay, DEB should go “all in” with 1- or 2-page descriptions that leave only the main idea and no possibility for reviewers to expect or demand preliminary data and detailed experimental designs. And a few suggested that while the length was adequate for most work, the complexity of their own specialized research interests defied such succinct description and deserved special dispensation. These conflicting opinions, however, do not appear much different from concerns about proposal structure typical for the 15-page full proposals prior to the preliminary proposal system. For the vast majority of reviewers, the 4-page project description works for a preliminary proposal evaluation.

Perceived Changes in Proposal Content

The questions about proposal content were deliberately selective. The wording we used specified a perceived change in the quality of the “highest rated” or “top 20%” of panel ratings. The thought behind this when developing the questions prior to the first panel was that 1) we are primarily concerned with changes that might affect the eventual awards, and 2) the relative ease of preparing preliminary proposal packages might invite more off-the-cuff submissions, which would be screened out at that stage but also depress the perception of average quality without altering any actual award decisions.

2014panelsurvey.4

The results have been largely consistent with our expectations. The majority of respondents reported no change in intellectual merit of the top proposals for both preliminary and full proposals. During the preliminary proposal panels, respondents reporting a perceived change were fairly evenly split between the proposals being better and worse. Opinions were more positive during the full proposal panels; however, we aren’t putting much weight on that difference since the majority still reports no change. As for the positive response by full proposal panelists regarding improved quality of proposals, there are at least two non-exclusive explanations: 1) panelists didn’t respond to the question as asked and instead reflected their view of the entire body of full proposals, from which most non-competitive applicants had already been removed, and 2) full proposals actually improved in quality after feedback from the preliminary proposal review.

2014panelsurvey.5

Respondents’ perceptions of changes in broader impacts mirror those for intellectual merit, though they were more polarized. In all 3 preliminary proposal cycles, the majority reported no change to broader impacts in the top proposals. However, greater proportions reported both positive and negative changes. This seems to reflect a still-divided opinion in the community on what ought to be the emphasis for broader impacts. Comments suggested that the broader impacts were both improved and worsened through less detail in the preliminary proposal. Quite unexpectedly, few respondents thought broader impacts declined at the full proposal stage; far more panelists, even a majority in 2013, felt this component improved over prior panels. We’re not sure how to explain this response, although we note this coincides with the release (January 2012) and incorporation into NSF documents (January 2013) of the revised merit review criteria that sought to clarify the context for assessing broader impacts.

Synthesis

Through 3 cycles of two-stage review, the preliminary proposal process in DEB appears to be improving panelist workload and experience. Panelists also report a high degree of satisfaction with the preliminary proposal mechanism and, allowing for individual differences in formatting preferences, generally find that preliminary proposals supply sufficient evidence on which to base their evaluations. Further, few returning panelists perceived any decline in the quality of projects coming highly rated out of preliminary and full proposal panels. This supports a view that the preliminary proposal process is achieving positive changes and not adversely affecting the overall quality of the review process and award portfolio.

Supplemental Survey Background

DEB traditionally includes, toward the end of a panel, one or two drop-in sessions with DEB and BIO leadership for some general Q&A. As an informal discussion, it’s helpful for sharing information with panelists and for us to hear about external concerns. However, it’s not at all standardized: the topics jump around, much of the discussion depends on who is most willing to speak up first, and the take-away points are mainly just a general sense about several disparate issues. With the launch of the preliminary proposal process, we realized it would be helpful to collect better information from panelists. A survey that didn’t link individuals to their responses was thought to be a “safer” venue to encourage all panelists to voice their opinions. This would hopefully avoid the professional and interpersonal dynamics that may bias who is willing to speak up, how forcefully things are said, and what we ultimately interpret as important or common feelings. The downsides, of course, were that we were asking for subjective (perception and opinion) information and lacked an established baseline against which to measure any change.

Since the first preliminary proposal panels in April of 2012, we have been asking our panelists to answer a few questions about their perceptions of the preliminary proposal process. Within the limitations of the context, we asked respondents questions about 1) their views of the preliminary proposal mechanism, 2) their panel experience relative to prior participation, and 3) their perception of the proposal content relative to prior participation. A similar and shorter set of questions was used for full proposal panels. There have been minor changes to the survey from year to year as we received feedback that helped clarify wording, and some questions were added or removed as our specific needs changed. This post presents the responses to the core questions that continued across the lifetime of the survey in DEB.

DEB Numbers: FY2014 Wrap-Up


At the end of 2013, we presented DEB submission and award portfolio data examining the initial results of the preliminary proposal process, leading to DEB’s FY2013 awards. In this post, we provide a follow-up for the second round of awards funded under the preliminary proposal system in FY2014.

For a refresher, you can visit last year’s post.

The main takeaway from the 2014 data is that the following aspects of our portfolio do not exhibit changes associated with the new system.

FY2014 Summary Numbers

In FY2014, DEB awarded 136 core program projects (comprising 197 separate awards). The numbers and charts below all reflect counts of projects.

These projects were selected from 520 full proposals reviewed in DEB during October and November of 2013 by invitation under the Core Programs and LTREB solicitations, via the CAREER, OPUS, and RCN solicitations, or as co-review with another NSF program. The invited proposals had been selected from among 1629 preliminary proposals initially submitted in January of 2013.

Below, we present and discuss charts tracking the trends for several dimensions of our project demographics that were raised as concerns coming into the preliminary proposal system. The numbers are presented as proportions for comparison across the review stages. However, the progressive winnowing of total numbers from preliminary proposals to awards means each step becomes more sensitive to small absolute changes.

In all cases of award trends shown below, the absolute change from FY2013 to FY2014 was no more than 10 projects.
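To put those absolute changes in perspective, the same ten-project shift translates into very different percentage-point moves at each review stage (a quick sketch using the FY2014 counts quoted above):

```python
# Why later review stages are more sensitive to small absolute changes:
# a fixed shift of ten projects, expressed against each stage's FY2014 total.
change = 10
stages = {
    "preliminary proposals": 1629,
    "full proposals": 520,
    "awards": 136,
}
for stage, total in stages.items():
    print(f"{stage}: {change / total:.1%} of the total")
# preliminary proposals: 0.6% of the total
# full proposals: 1.9% of the total
# awards: 7.4% of the total
```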

Individual and Collaborative Projects

As seen in the figure below, there was little year-to-year change in the performance of single investigator projects; the small change is consistent with prior inter-annual variation. Most of the apparent jump in the proportion of single investigator awards between the preliminary proposal and full proposal stages is an artifact of the counting method. As we discussed last year, the primarily single-investigator proposals in the CAREER and OPUS categories are not subject to the preliminary proposal screen. They thus make up a relatively larger portion of the full proposals than in years prior to the system, and their absence depresses the single investigator proportion of the preliminary proposal counts relative to the historical full proposal baseline.

FY14.1

Growth in the proportion of collaborative proposals in our award portfolio continues the generally upward trend of the past several years. We would expect a plateau at some point, but where that might be isn’t clear.

FY14.2

Readers may notice that the year-to-year increase in collaborative project awards for FY2014 is a few percentage points larger than the decrease in single investigator awards shown above. This difference reflects an increase in multi-institutional teams (which meet the NSF definition of “collaborative”) relative to intra-institutional arrangements (intellectual collaboration to be sure, but not a collaborative project).

Gender & Predominantly Undergraduate Institution (PUI) Status

Female PIs experienced a sizeable year-to-year drop in their proportion of awards this year, although the proportion of submissions at both the preliminary and full proposal stages continues to increase. Such a drop is visually jarring, but not unprecedented. In absolute terms, this is a difference of eight projects across four clusters, each with 1 or 2 full proposal review panels: essentially noise in the signal.

FY14.3

In contrast, PUIs experienced a large proportional increase in awards this year. Once again this is presumably due to noise within the programs’ decision-making (a difference of only 9 awards) since submissions did not change appreciably.

FY14.4

These single-year changes in PUIs and female PIs appear to emerge from the full proposal review and program decision-making stage, not the preliminary proposal stage. This would seem to be a product of PO portfolio management, and such swings are an inevitable result of the numerous dimensions of a “balanced portfolio” that need to be tended with a relatively small number of awards.

Early Career Scientists

As we discussed in the FY2013 Wrap-up, there are several imperfect metrics of early career investigator performance, with the “Beginning Investigator” check-box on the cover page being the most immediately visible but also the most inconsistently applied identifier.

FY14.5

By the check-box identifier, beginning investigators continue to receive awards in proportion to full proposal submissions. A gap between preliminary and full proposal submission is expected because of the influx of proposals from the CAREER, OPUS, and RCN mechanisms, which tend to have lower rates of beginning investigator PIs in DEB. The proportion of checked boxes at the preliminary proposal stage may also be elevated since the box is commonly, but incorrectly, checked in reference to persons other than the PI; at the preliminary proposal stage, that could include persons from non-lead collaborator institutions.

The other identifier of career stage is the years since PhD of the PI.

With “Early-career” defined as <10 years post-PhD, “Mid-career” as 10-20 years post-PhD, and “Advanced-career” as >20 years, we can give a broader and more accurate overview of the PI population.

FY14.7

FY14.6

From 2013 to 2014, the proportion of submissions decreased slightly for Early-career PIs (-2 percentage points), increased for Mid-career PIs (+6 pts) and decreased for Advanced-career PIs (-4 pts). Even with these changes, the Early-career cohort still represents the largest portion of submissions at 39%.

With respect to awardees, the PI profile shifted prominently toward Mid-career PIs from 2013 to 2014. That cohort increased by 10 pts to 35% of awards, which matches their submission rate. Advanced-career PIs dropped 3 pts and now make up the smallest portion of the award portfolio (32%), though their proportion of awards is still above submission levels. Early-career PIs represented a smaller portion of the 2014 awards (-7 pts from 2013) and were somewhat underrepresented compared to submissions, constituting the remaining 33% of awards.

The changes in the awardee degree age profile from 2013 to 2014 resulted in a more even distribution between the three categories of Early-, Mid-, and Advanced-career but greater departures from their respective representation in submissions. However, it remains to be determined what distribution represents the “optimal” structure of the awardee population, or even on what criteria to judge optimality.

Success Rate

Fiscal Year              2007   2008   2009   2010   2011   2012   2012/2013  2013/2014  2014/2015
Preliminary Proposal*      –      –      –      –      –      –      22.0%      22.4%      23.0%
Full Proposal**          17.2%  15.3%  22.1%  13.5%  11.9%  16.8%   24.1%      26.2%      N/A
Overall***               17.2%  15.3%  22.1%  13.5%  11.9%  16.8%    7.3%       7.6%      N/A

*   = N_invited_full / N_submitted_preliminary
**  = N_awarded / (N_invited_full + N_direct_submission^)
*** = N_awarded / (N_submitted_preliminary + N_direct_submission^)
^ N_direct_submission = all proposals through 2012; after 2012, only CAREER, OPUS, RCN, co-review, and LTREB renewals taken to panel.
N/A = awaiting FY2015 budget.
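A minimal sketch of how the three metrics in the footnotes relate to one another; the counts below are hypothetical placeholders for illustration, not DEB’s actual totals:

```python
# Hypothetical counts to illustrate the three success-rate formulas above;
# these are NOT DEB's actual figures.
n_prelim = 1600    # preliminary proposals submitted
n_invited = 360    # full proposals invited from the preliminary round
n_direct = 160     # direct submissions (CAREER, OPUS, RCN, co-review, LTREB renewals)
n_awarded = 130    # projects awarded

prelim_rate = n_invited / n_prelim                # *   preliminary proposal rate
full_rate = n_awarded / (n_invited + n_direct)    # **  full proposal rate
overall_rate = n_awarded / (n_prelim + n_direct)  # *** overall rate

print(f"{prelim_rate:.1%} / {full_rate:.1%} / {overall_rate:.1%}")
# 22.5% / 25.0% / 7.4%
```

The gap between the full proposal rate and the overall rate is the point of the table: most of the attrition now happens at the preliminary proposal stage.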

As we noted last year, we don’t put much value on “success rate” as a metric of program performance because it is driven by factors exogenous to DEB: the budget passed by Congress and the number and size of submissions sent in by the research community. However, we recognize its importance to you as a signal of program health and accessibility. In that regard, we are breathing a slight sigh of relief for the first departure, outside of extraordinary circumstances, from the downward slide that has dominated the past decade.


Program Announcements: LTER postdoc opportunity and LTER National Communications Office


We’ve got two new opportunities we are happy to share with you related to the Long Term Ecological Research program:

First, directly from DEB, is a new solicitation for a Long Term Ecological Research National Communications Office:

NSF has recently announced an open competition for a Long Term Ecological Research (LTER) National Communications Office (NSF 15-535). The office will coordinate research, education, and outreach programs across the current 25 LTER projects, communicate these activities to diverse audiences, and provide centralized representation of the LTER network to the broad scientific community and the public. The Director of the Office will work with the LTER Science Council and the research community to develop and implement strategic goals and future initiatives, and the Office will serve as the primary point of contact for information about the LTER program. A single National Communications Office will be awarded; the competition is open to universities and colleges, non-profit, and non-academic institutions.

Second, we’d like to point you to the NSF-supported LTER Postdoctoral Synthesis Fellowships:

The National Socio-Environmental Synthesis Center (SESYNC), in collaboration with NSF’s Long Term Ecological Research (LTER) program, invites applications for two-year postdoctoral synthesis fellowships that will begin August 5, 2015. Synthesis of long-term data sets, ongoing experiments, and model results is an important goal of NSF’s LTER program. Successful Postdoctoral Fellowship applications will identify specific research questions and how they will be addressed using synthesis methods and long-term ecological data. Fellowships will engage and assist early-career investigators in the use and analysis of existing long-term data and in advanced computational methods to ask new questions and initiate new research collaborations. Fellows must identify long-term datasets that form the foundation for these syntheses. These must involve LTER data, but applicants are encouraged to include additional long-term data collected by projects outside the LTER network as well. Proposed projects can focus on ecological or interdisciplinary questions. For details, see http://www.sesync.org/opportunities/sesync-lter-synthesis-postdoctoral-fellowships.

Permanent Program Officer Searches in DEB, closing February 9, 2015


Two DEB clusters, Systematics and Biodiversity Science (SBS) and Ecosystem Science (ES), have initiated searches to fill openings for permanent Program Officers.

The vacancy announcements are available on USAJobs and are open until February 9, 2015.

The position description, qualifications, and application procedures for each opening are described at the respective links.

Farewell to NSF from Penny Firth


How does one say thank you after almost 24 years of opportunity and excitement? My time at NSF has been an extraordinary part of my life, and I am deeply grateful to my co-workers and our remarkable – inspirational – environmental biology community. Our successes have been forged through respect and common goals, and I sincerely appreciate your communications through the years. You helped guide me in so many ways: through things you said, through proposals you submitted, and of course through your research and education activities.

Thank you for conceiving keystone ideas and testing new and classic theoretical constructs. Thank you for participating in our gold-standard peer review system. Thank you for recognizing that broad participation is an imperative.

Over my decades at NSF I have practiced persistence, cooperative leveraging and trust. I’ve helped initiate interdisciplinary and integrative programs such as Water and Watersheds, Dynamics of Coupled Natural and Human Systems, and Dimensions of Biodiversity, and education initiatives like Vision and Change. I’ve been a big supporter of sustaining a robust core that can accept and review any fundamental research question in our fields. I have applauded the novelty and productivity of synthesis centers and the groundbreaking research of science and technology centers.

Thank you for asking questions across spatial scales and evolutionary time, and forecasting our shared future. Thank you for acknowledging the ubiquity of microbial symbionts in every biological frame of reference, and seeking to understand their secrets. Thank you for your attention to the dynamic networks of interactions that produce evolutionary innovations and ecological services.

I am immensely proud of each and every one of our staff here at NSF. Permanent staff and rotators from the community make up a whole that is much more than the sum of its parts. I am profoundly impressed by their talent, dedication, commitment, passion, and just plain hard work.

Thank you for sharing your knowledge, your skills and your data. Thank you for mentoring the next generation, and the former one too. Thank you for supporting broad career horizons, experiential learning, and concept literacy. Thank you for your devotion to real, working international partnerships.

It has been a special privilege to serve DEB and NSF. I’ve been motivated, humbled, and – just often enough – amused. Exciting times lie ahead, and I am confident that DEB and its community will continue to advance with a remarkable blend of creativity, opportunism, and entrepreneurship.

Thank you.

Penny Firth, Division Director, NSF/BIO/DEB
