Myths and FAQs about Project Reporting


Annual and final reports have changed quite a bit over the years. Twenty years ago, annual reports and final reports were distinct requirements (see section 340) for which you printed out a form, filled it in, and mailed it back. In the early 2000s, reporting moved online to FastLane; this allowed a degree of integration with other electronic systems and also brought about a single template that applied to both annual and final reports. Some people took this to mean that the final report was just another annual report, while others started treating each annual report like a cumulative final report. However, since 1) most reports during the FastLane era were uploaded as unstructured PDF narratives, which could be reviewed but weren't useful for systematic analysis (data mining), and 2) both approaches provided the necessary record of activity, there was no pressure to enforce a standard either way on whether the content was annual or cumulative.

The most recent move, to Research.gov, came with a more structured, web-form-based report template that enables further integration of reporting requirements into companion systems. The structured report data improves compatibility between grant data systems across agencies (e.g., NASA data in Research.gov search results) and adds records to a database of project results that can automatically compile the cumulative outcomes for any project without the PI having to regurgitate the same items over and over in an ever-expanding list. Once we get beyond the rough period of adjusting to the change, structured report data should be easier to enter and provide greater value for analyses of all types, including those that make the case for future investments in research.

The purpose of this post is to help you shorten the learning curve of Research.gov reporting, answer some of the common questions we hear from you, and debunk the persistent myths and old habits that no longer fit with current practice.

  1. What does DEB do with project reports?

Reports are required because we need to maintain a record regarding how our investments in research are spent. We use them to make the case to various audiences that funds are being spent on good and important work. Your Program Officers (POs) are responsible for overseeing those funds and need to be able to document that they were used for the approved purposes in an appropriate fashion. Because failure to show that grant money was used productively makes it very difficult to justify sending you more money, POs review each report and will request additional information if critical aspects are weak or missing. Failure to report at all places a flag on your file for missing a requirement of your grant that will block further funding until it is cleared. In short, reports are a necessary component of financial stewardship.

With structured data available via the Research.gov reporting form, we are also gaining the ability to better quantify research outcomes and ask new questions about our portfolio, our decision-making, and the effects of different approaches to supporting science.

  2. What should I put in my project report, and how much of each thing?

This is a really common question and a big part of the change from FastLane reporting to Research.gov reporting. The old "upload a PDF" days imposed no limitations on reporting, and inconsistent requests for more or different information from POs in various programs encouraged a narrative-intensive, "everything and the kitchen sink" approach to the report. This cost PIs a lot of time and effort to compile and present all that information, and cost POs a lot of time wading through it to check whether the critical pieces were all there.

Please take this to heart: The volume of report text is not an indicator of the quality of the work. A report is also not a journal article.

What we want is an efficient description of what happened/was accomplished during the reporting period. A lot of this comes down to discrete counts of the inputs, activities, and outputs of the project and providing enough detail about each that we could verify the count if needed. The Research.gov template seeks to encourage this by providing clear fields to generate list-style report items with consistent detail requirements.

There are some places with paragraph-sized text boxes for narrative explanations. Some are optional and some are required. Responses to any of them are most helpful when clear and direct. Research.gov imposes character limits on the text boxes where narratives are required or allowed; this is by design.

  3. What are the most common problems that cause POs to return a report for revision?
  • Failure to list participants and collaborators who are mentioned/described in narrative sections of the report text (See Questions 4, 5, and 6).
  • Multiple typographical errors; apparent cut and paste errors (incomplete sentences or paragraphs).
  • Listing items that fall outside of the reporting period.
  4. Who should I list in the Participants section? The other collaborators section?

Across the three "Participants/Organizations" sections, please list everyone who has been engaged in the project within the previous 12 months. This includes students, volunteers, and those paid through other sources. As long as their activities were related to the objectives (Intellectual Merit or Broader Impacts) of your award, they "count". A rule of thumb for deciding which section to report under: individual "participants" carried out the work of the objectives, "organizational partners" directly enabled the work done by the participants, and "other collaborators or contacts" include indirect supporters or beneficiaries of the work (e.g., schools at which your student conducted a demonstration). Please note that "other collaborators and contacts" are entered into a plain narrative text box; it doesn't impose any specific structure or data requirements.

If participants worked less than a total of ~100 hours (about 2.5 work weeks), enter "0" under Nearest Person Month Worked. (Yes, zeros count, too.) If they worked far less than 100 hours, use your own judgment about whether their participation was meaningful enough to list them as participants or whether they might be better listed as an "other collaborator or contact".

  5. I have multiple sources of funding and people in my lab often work on overlapping projects. In the Participants section, what should I enter for Funding Support?

If a participant was paid from a funding source other than your current NSF award, please list that source of support. Do not enter anything if the participant was paid solely from your current NSF award or if they were a volunteer.

  6. I have an RCN or workshop award (or any other type of award that may involve dozens of participants). Do you really want them all listed as Participants?

Yes. The list of participants provides an increasingly valuable database that NSF can use to quantify the impact of its investments. A common alternative to listing participants individually in the Participants section has been to upload a PDF document under "Supporting Files". Please keep in mind that NSF cannot search the data provided in those documents except by opening them one by one, and they will generally be ignored in official analyses and comparisons. Hence, we always prefer that participants be entered one by one in the Participants section.

  7. I have a collaborative award. How should my reports differ from those of my collaborators?

Obviously, you and your collaborators will have at least some shared objectives and impacts; some overlap in reports is expected. Your report should focus on the components of the project and the personnel unique to your institution.

  8. Are Annual Reports cumulative? Is the Final Report cumulative?

No and no. Current NSF Policy and the Research.gov reporting system instruct PIs to report only on the previous year of work, including for the final report. Except for “Major Goals” and “Impacts”, there should be little or no overlap from one report to the next. The Final Report should be written as an Annual Report – there’s nothing special about it other than it being the last report on a given project.

You may have done it differently in the past, or received outdated advice from a colleague, because the rules used to be different and old habits are tough to shake. However, the rules were clarified, and the system changed to enforce them, when reporting moved to Research.gov.

  9. What is the Project Outcomes Report?

The Project Outcomes Report is a third reporting requirement (in addition to annual and final reports) that is due at the same time as your final report. It is a counterpart to the "public abstract" of your award. The abstract was written at the start of the project and explained what you planned to do and why it was important. The Project Outcomes Report summarizes the overall goal(s) and accomplishments of the project upon its completion. It is provided directly to the general public as a permanent record and justification of our investment of taxpayer dollars in your research. Please write it carefully, paying particular attention to the major accomplishments and general significance of your work; avoid jargon and grammatical errors. Do not cut and paste text from your Annual or Final Reports, because you wrote those for a very different audience.

  10. What happens if I don't submit my report?

You and any Co-PIs will not be allowed to receive any new funding (e.g., annual increments, supplements, or new grants) or process any other actions (e.g., no-cost extensions, PI changes) until the report is submitted and approved. Your annual report becomes due starting 90 days before your award anniversary; your final report is due within 90 days after your award end date. After either of those 90-day windows, the report is considered "overdue" and the block is automatically put in place. Even if you aren't overdue when you submit a report, waiting until late in the 90-day window risks delaying timely release of annual funds and possibly going overdue before we've had a chance to review the report, receive any needed corrections, and approve it.

  11. I submitted my Annual Report, but there's still a block in FastLane preventing me from receiving new funds from NSF. Why?

It’s most likely that your report still needs to be approved by the managing Program Officer; new money cannot go out the door until reports have been submitted and approved. If your report has been languishing, it’s appropriate to ask the managing Program Officer to take a look at it. (Although we enjoy learning about your discoveries, annual reports can pile up when our priorities must be placed elsewhere.)

  12. Can I submit a proposal if I have an overdue report?

Yes. Be aware, however, that every time we attempt to access any of your proposals (submitted or already awarded), we'll be redirected to a warning message on a separate screen that tells us we cannot approve any money for the proposal because of the missing or overdue report(s). We're required to acknowledge this by clicking a "Continue" button before we're allowed to see any of the proposal contents. The effect of those irksome messages on Program Officers is worth keeping in mind.

  13. If one of my collaborators has an overdue report on an award that I'm not associated with, what are the consequences for me?

If that collaborator is a PI/co-PI (i.e., listed on the cover page) of a proposal or award on which you are also a PI/co-PI, you will be blocked from receiving any new funds from NSF or processing any other actions on that shared proposal/award. Any proposal that shares a cover-page person with a proposal or award that has a missing or overdue report is subject to the block.

  14. Why am I being asked to submit my report in June or July when it's not overdue until August or September (or later)?

Because we don't want you to miss your annual funding increment. If you received an award in the last quarter of our fiscal year (the fiscal year runs from October 1 to September 30, so July, August, or September) and are scheduled to receive that grant in annual increments, then you have likely encountered this situation: a Program Officer calls you up and says "hey, can you get your report in this week?", but when you look at Research.gov it says the report won't be past due for a month (or two or three).

All annual reports are due 90 days before the anniversary of the award: this provides the time to review and process everything in order to get your annual increment released to you by the actual anniversary. Frequently, reports are submitted much closer to the anniversary or even late. This pushes the start of the approval process later and often pushes the release of money to after the anniversary. But, if that anniversary date is late in the fiscal year, any sort of delay — even within the allowed “reporting window” — can push back the processing time over the year-end deadline, at which point the money is no longer available to be released. That’s not a happy state of affairs for you or for us! So if your award was made and started on September 1, the report “due date” would be June 1 of the next year and your PO would probably be hounding you by July 1 to make sure you don’t lose your funding.

This is a little bit annoying, but it generally makes sense when the project begins immediately upon receipt of the award. However, some awardees request a "start date" for the project that is well after the actual award is made (someone receiving money in September might schedule a start date of December or January). At this point things get complicated. Following the September 1 award / January 1 start example: we need an approved report in order to release funds for the second year by the September anniversary of the award being made; otherwise we run out of time within the fiscal year to actually distribute the money. But the reporting system is, for whatever reason, blind to this: it calculates the reporting window from the "start date", so the very beginning of the "due" period falls in October, after the point at which our ability to send you the money tied to that report has lapsed.
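
To make the date arithmetic concrete, here is a minimal sketch of the reporting-window calculation (a Python illustration using hypothetical dates for the example above; it is not an NSF tool, and the exact dates NSF systems use may differ):

    from datetime import date, timedelta

    def report_window_opens(anniversary: date) -> date:
        # Annual reports become due 90 days before each anniversary date.
        return anniversary - timedelta(days=90)

    # Hypothetical example mirroring the post: award made September 1, 2014,
    # but the PI requests a project start date of January 1, 2015.
    award_anniversary = date(2015, 9, 1)    # first anniversary of the award itself
    start_anniversary = date(2016, 1, 1)    # first anniversary of the requested start date
    fiscal_year_end   = date(2015, 9, 30)   # second-year funds must move by this date

    print(report_window_opens(award_anniversary))   # 2015-06-03: leaves time for review and release
    print(report_window_opens(start_anniversary))   # 2015-10-03: window opens after the FY cutoff
    print(report_window_opens(start_anniversary) <= fiscal_year_end)   # False

In other words, basing the window on a January start date means the system will not even ask you for a report until after the fiscal year in which the next increment had to be released.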

So, two lessons here: 1) don’t ask for a start date way after your award is available, especially if doing so crosses the Sept./Oct. dividing line, and 2) if your PO calls and asks you to submit a report RIGHT NOW, please do it; we’re trying to give you the money we promised and not doing it can really muck things up.

 

Additional Reporting Resources

More detailed answers to many of these questions, with accompanying screenshots from Research.gov, are available in a PDF guide here (click to follow, but caveat emptor: there appear to have been some additional updates since the file was posted).

If you're really into this, a long list of guides, tutorials, templates, and demonstrations related to Project Reports is available here.

FYI: AAAS Science & Technology Policy Fellowships


For several years DEB has hosted AAAS Science & Technology Policy Fellows. We have reaped the benefits of this excellent Fellowship program and think it will be of interest to some of the faculty, students, and post-docs in the DEB community.

Are you looking for a sabbatical or to explore new ways to utilize your scientific training?

Want to learn about federal policy from an inside perspective?

Perhaps you are considering opportunities for non-academic science careers.

Then you may wish to consider applying for a Science & Technology Policy Fellowship with AAAS (the American Association for the Advancement of Science). AAAS Fellows are individuals with doctoral-level training who gain insight into the US federal enterprise during a one- to two-year post-graduate experience. Fellows can come from a wide array of disciplines and from any career stage; AAAS Fellows' ages have spanned five decades, ranging from the late 20s to the early 70s! AAAS S&T Fellows contribute to their offices in myriad ways, but their specific roles are often dependent on the mission of the agency and the needs of the office. In DEB, AAAS S&T Fellows have a unique opportunity to engage in international science policy, contribute to Division- and program-level strategic planning, and gain insight into the merit review process.

AAAS S&T Fellows not only make valuable contributions to their offices but also engage in professional development training throughout the experience. These trainings range from 'the essentials of science communication' to 'developing a negotiation toolkit' to technical workshops on 'text mining big data using R code'.

Furthermore, one of the greatest benefits of the AAAS S&T Fellows program is being inducted into a highly connected network of science professionals. Many AAAS alumni continue working in government after their fellowship, but others have gone on to influential positions throughout academia, industry, and the non-profit sector. The breadth of the Fellows' network is truly impressive.

 

What are people saying about the AAAS S&T Policy Fellows?

In a 2014 PNAS article on graduate education and postdoctoral training, authors Bruce Alberts et al. gave the AAAS S&T Policy Fellows program a ringing endorsement by saying:

“…. the AAAS Science and Technology Fellowships for 40 y has allowed carefully selected scientists and engineers with advanced degrees to work in the US government in Washington, DC. Historically, approximately half of these Fellows have remained in policy positions, occupying critical positions that greatly benefit the nation….”

(Full disclosure, of course: Bruce Alberts has previously served as editor-in-chief of AAAS's main publication, Science.)

In addition, the National Science Board, the policy-making body for NSF, recognized the AAAS S&T Fellowship program with its Public Service Award in 2014. #nobigdeal #kindofabigdeal

 

Not convinced yet that this is a unique and special program? Check out some of these notable alums:

Honorable Rush D. Holt: AAAS CEO, former U.S. House of Representatives Congressman

Frances A. Colón: Acting Science and Technology Adviser to the Secretary of State, U.S. Department of State

Rosina Bierbaum: Professor and former Dean, University of Michigan School of Natural Resources & Environment

Steven Buchsbaum: Deputy Director, Discovery at Bill & Melinda Gates Foundation

 

The AAAS S&T Fellowship offers placement in seven different program areas. Check to see if you are eligible to apply and read testimonials from former Fellows.

Applications for the 2016-2017 Fellowship cycle are open from now until November.

DEB Summer 2015: Where to find us at Professional Meetings


We’ve got a busy summer meeting schedule and are offering numerous opportunities to hear the latest NSF updates, meet your Program Officers, and learn about funding opportunities in-person. Since it is so close this year, there will be a fairly large contingent of us heading up to Baltimore for the ESA meeting. But, we’re not forgetting the other side of the house; we’ll have representatives at both of the big, international evolution conferences. We’ll also be at the IALE Congress and the joint Botany meeting, and we were already at ASM earlier this season.

And remember, if you can’t make it to our lunchtime brown-bag sessions to hear the latest from DEB, you can always email one of the attending Program Officers to set up another meeting time, catch us in the poster hall, or drop by our information tables (where available).

 

26 – 30 June, 2015: Guarujá, Brazil. Evolution 2015

Featuring: Simon Malcomber, David Mindell, Sam Scheiner, Kelly Zamudio

Presentation Followed by Q & A (NSF Update)
Sunday 28 June, 12:00 – 13:30 (during lunch break), Meeting Room Diamantina

 

6 – 9 July, 2015: Portland, OR. 9th International Association for Landscape Ecology (IALE) World Congress

Featuring: George Malanson

Panel Discussion: Funding Opportunities for Landscape Ecology at the US National Science Foundation
Monday 6 July, 5:30 PM – 6:30 PM, See Final Schedule for Location

Additional Notes: Tuesday eve poster session is a good time to meet up with George.

 

25 – 29 July, 2015: Edmonton, Alberta. Botany 2015

Featuring: Joe Miller

With Special Appearances: Roland Roberts, Judy Skog

NSF Information Booth (Exhibitor #114)
All Days, Staffed during poster sessions, by appointment and whenever we can be there, Hall D.

NSF Outreach Presentation and Discussion
Wednesday 29 July, noon, location TBA (check final program).

 

9 – 14 August, 2015: Baltimore, MD. Ecological Society of America 2015

Featuring: Henry Gholz, Doug Levey, Sam Scheiner, Alan Tessier, Alan Wilson, George Malanson, Diane Pataki, John Adamec, Shannon Jewell

With Special Appearances by: Matt Kane (TBD), Betsy Von Holle (W, Th, F), Emily Leichtman (Su, M)

NSF Information Booth (Exhibitor #438)
Monday 10 – Thursday 13 August, all day, Baltimore Convention Center Poster/Exhibit Hall.

Special Session (SS 2): Ecology on the Runway: An Eco-Fashion Show and Other Non-Traditional Public Engagement Approaches
Monday 10 August, 11:30 AM-1:15 PM, Baltimore Convention Center 310.

Special Session (SS 10): New Frontiers: Bridging the Gaps Between Continental and Global-Scale Research Networks, A Special AGU-ESA Event and Evening Social
Monday 10 August, 8:00 PM-10:00 PM, Baltimore Convention Center 309.

Workshop (WK 53): Federal Agency Networking Session (Come and meet your Program Officers from NSF and beyond!)
Thursday 13 August, 11:30 AM-1:15 PM, Baltimore Convention Center 316.

 

9 – 15 August, 2015: Lausanne, Switzerland. European Society for Evolutionary Biology

Featuring: George Gilchrist and Leslie Rissler

Presentation followed by Q & A (NSF Update)
Thursday 13 August, noon, location TBA.

Additional Notes: This will be the same program as presented at Evolution 2015 (if you’re like us and had to choose one or the other, we’ve got you covered!)

Draft of revisions to NSF-wide grant and proposal policies up for public comment


Each year or so, NSF releases an updated version of its agency-wide guidance for proposals and grants, called the Proposal and Award Policies and Procedures Guide (PAPPG). This big document consists of two parts: instructions for proposers (the GPG, or Grant Proposal Guide) and instructions for awardees (the AAG, or Award Administration Guide).

The PAPPG sets the ground rules for NSF programs. Solicitations, like the DEB Core Programs solicitation, exist to enumerate specific variances from the basic rules, for example the format and contents of a preliminary proposal. Solicitations, however, also refer back to the PAPPG and follow its ground rules for everything except those specific variances. A good example: the requirements for proposal font size are detailed in the PAPPG, and we have no reason to repeat or modify them in the DEB Core Programs solicitation, but they apply to both preliminary and full proposals.

Changes to the PAPPG trigger new proposal preparation requirements for all NSF programs and may require you to do something differently in your next submission to the DEB Core Programs (or anywhere else), but changes to the PAPPG do not override anything explicitly described in our solicitation.

 

Right now, a draft version of the changes has been made available to the public for comment through 20 July, 2015. The wording of this public version indicates that the revised rules are expected to come into force in January 2016; this is right around our next preliminary proposal deadline. Between July and October, the comments will be reviewed and the final revised version prepared for public posting. Based on the experience of prior years, the final version will probably be published at some point in October, so that you have fair warning of the rule changes and will be expected to follow them beginning on the TBD January date.

We’re mentioning this here because there are proposed revisions that are likely relevant to you and we want you to be aware of them as early in the process as possible.

The official notice of the request for comments is available in the Federal Register: http://www.gpo.gov/fdsys/pkg/FR-2015-05-19/pdf/2015-12086.pdf

This includes an explanation of the request, how to submit comments, and the comment deadline.

The actual draft is hosted on the nsf.gov website here: http://www.nsf.gov/bfa/dias/policy/papp/pappg16_1/fedreg/draftpappg_may2016.pdf

 

There are numerous small and several more substantial changes noted in the draft. The online document is conveniently marked up with highlights for new or edited text and comments noting where material was removed.

Here are a few revisions that we noted that might be of particular interest to our readers:

On page 20 (of the PDF), it notes that your research office/organizational representative needs to complete organizational sign off before a proposal can be submitted (even for preliminary proposals); this might require modifications to your preparation timeline. (Organizational sign-off is mentioned/added in many other spots throughout the document too.)

On page 25, there's a re-emphasized note about an issue we've mentioned here before: each individual should have only one FastLane ID.

On page 30, the GPG is (finally!) addressing the issue of your collaborator (aka conflict) lists being too long for the 2-page Biosketch by moving them into a separate Single Copy Document.

On page 33, and in a few other places, there are new requirements for reporting "dual use research of concern" (e.g., work with certain pathogens and toxins) via your proposal Cover Page.

Pages 34 – 37 include several changes/clarifications relevant to the written components of your proposals: a stronger requirement to enter a Project Summary in the FastLane forms (instead of uploading a PDF), a prohibition against hyperlinks in your Project Description, a template for Letters of Collaboration (if you've submitted to the DEB core programs recently, you've already been doing this), the revised Biosketch format (sans collaborators and other affiliations), and a requirement that each Biosketch be uploaded as a separate file (no more bundling as a single file).

There are a couple of changes with respect to budget preparation, the most notable (at least to us) being a requirement that sub-awards include overhead at the sub-awardee’s federally negotiated rate (or a de minimis rate of 10%).

On page 44, the instructions for current and pending (C&P) support are also changed to require a separate document (no bundling as a single file) for each of the senior personnel on the proposal, and the definition of C&P is expanded to include "internal institutional support".

 

The important outcome here is to make yourself aware of the proposed changes and their timeline, and to make sure that your research administration officials are also aware of them, so that this fall you will be able to follow the correct version of the GPG for our preliminary proposal deadline.

Are small grants doing well in review?


In contrast to the trend of decreasing numbers of preliminary proposals, we have seen a rapid increase in the category of Small Grant preliminary proposals (these are also included in the total counts in our previous post).

DEB Small Grants 2012 2013 2014 2015
Submitted N/A 83 95 126
Invited N/A 20 25 29
Invite Rate N/A 24% 26% 23%

 

We attribute this to a growing awareness of this option to submit preliminary proposals with total budgets under $150K. Small grants came about in the second year of the preliminary proposal system, in response to a long-standing desire, expressed by numerous voices in our communities, for some sort of "small" category. DEB realized it was particularly appropriate for the preliminary proposal system, so that reviewers could adjust their expectations for the scope of a project relative to its expense without requiring the extensive preparation of a full budget. We added the category to our solicitation for the 2013 preliminary proposal deadline.

We’ve had lots of positive feedback on this option, but also recognize that awareness still needs to be improved among both applicants and reviewers. This year, 8% of all preliminary proposals were identified as small grants.

Small Grants are found in all four clusters and are generally on the increase, but we also think feedback, such as this post, is necessary to successfully integrate this idea into our communities and maintain enthusiasm for the option. We would not be surprised to see these numbers grow to the point where SGs make up as large a share of the preliminary proposal pool as (or a larger share than) proposals from Predominantly Undergraduate Institutions or Beginning Investigators.

Since 2013, we’ve funded 22 awards based on invited full small grants (9 of 18 in 2013, 12 of 24 in 2014, and 1 of 1 in 2015 thus far[1]), for a 51% success rate at the full proposal stage. This is roughly twice the success rate of full proposals without the SG designation.

 

[1] Not everyone who received an invitation eventually submitted a full proposal (individual reasons vary). Also, we already have an award based on a 2015 preliminary proposal: instead of inviting a full proposal, DEB determined the project was appropriate for the EAGER mechanism and invited the team to submit an EAGER proposal, allowing for quick turnaround of an award.

DEB Spring 2015 Panel Update


At this point, everyone should have heard back on their DEB Preliminary Proposals from the spring panels. If you have not:

1) Log in to your FastLane account. The information should be accessible there, but also make sure your contact email is correct because a typo there would prevent you from receiving updates and notifications.

2) If you were a co-PI, check with the lead PI on the preliminary proposal. The lead PI should have the results of the review.

3) Did it wind up in your spam folder?

4) If you have exhausted all of the above options and have had no other contact with your DEB Program Officer, then it’s probably a good time to send us an email.

 

Preliminary Proposal Panel Results

DEB panels reviewed 1495 preliminary proposals; in consideration of the reviews and panel discussion, DEB Program Officers extended 383 invitations to submit full proposals for the August 2015 deadline. The Division-wide invitation rate for the preliminary proposals was 26%. Below, we detail the results of preliminary proposal review by programmatic cluster.

Cluster Invited Not Invited Total Invite Rate
Systematics and Biodiversity Science 87 221 308 28%
Evolutionary Processes 105 331 436 24%
Population and Community Ecology 107 320 427 25%
Ecosystem Science 84 240 324 26%
Grand Total 383 1112 1495 26%

 

This is the fourth round of preliminary proposal review for the DEB core programs since the system started in 2012. DEB extended more invitations and achieved a higher invitation rate than in prior years.

2012 2013 2014 2015
Reviewed 1626 1629 1590 1495
Invited 358 365 366 383
Invite Rate 22% 22% 23% 26%

 

As we discussed in our recent post on per-person success rate, the launch of the preliminary proposal system drew in a large number of "new" applicants. We believe we are now seeing this wave of applicants pass; this is reflected in the decrease in the number of preliminary proposals reviewed in DEB as our communities realize that preliminary proposals do not make grants easier to get.

At the same time, the number of invitations has gone up. The increase is primarily a result of program management decisions, as we have been able to refine our expectations for the number of proposals that will come to the full proposal panel through other mechanisms (CAREER, OPUS, RCN, and co-review).

Research.gov Update: Returned Project Reports


A big thank you to the several PIs who let us know they were having trouble finding the PO comments when reports were sent back from review with requests for revision. We passed those concerns along to the Research.gov team, and it looks like we now have a response.

The most recent update to the Research.gov platform includes changes to the project reporting interface that should make it easier to find and view the PO comments. The screenshot below from the Research.gov online Help guide provides the new details.

[Screenshot from the Research.gov online Help guide showing where to find PO comments on a returned report; click to open a larger version.]

The automatic email you receive when a report is returned should also (now or soon) have a better explanation of how to find these comments, but we haven’t seen that yet.