ExpectMore.gov


Detailed Information on the
EPA Ecological Research Assessment

Program Code 10001135
Program Title EPA Ecological Research
Department Name Environmental Protection Agency
Agency/Bureau Name Environmental Protection Agency
Program Type(s) Research and Development Program
Assessment Year 2007
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 70%
Program Management 88%
Program Results/Accountability 53%
Program Funding Level
(in millions)
FY2007 $86
FY2008 $79
FY2009 $68

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Increase the transparency of budget, program, and performance information in budget documents.

Action taken, but not completed ORD separated the Ecological and Human Health Research budget dollar amounts in the FY 2010 Congressional Justification and better indicated how dollars would result in progress toward long-term goals. ORD plans to work with OMB through the next budget submission process to ensure appropriate levels of transparency.
2007

Develop and publish a revised multi-year research plan clearly demonstrating how the program's research supports the EPA mission and avoids duplication with other research programs.

Action taken, but not completed The program expects to receive peer review feedback on its strategy and MYP in spring 2008, and will revise the documents as needed.
2006

Develop a program specific customer survey to improve the program's utility to the Agency.

Action taken, but not completed The program has drafted a survey and plans to distribute it to partners in late 2008/early 2009. The survey has been delayed so that it better corresponds with the timing of the program's BOSC reviews.
2008

Reassess meaningfulness of current efficiency measure in light of recent National Academy of Sciences (NAS) report on efficiency measurement.

Action taken, but not completed Milestones: October 2008, explore the feasibility of tracking savings resulting from the AEP effort as an ORD-PART efficiency measure; December 2008, continue interagency dialogue regarding NAS recommendations; June 2009, reach agreement on approach and update measures in PART Web during the Spring Update.
2008

Identify appropriate targets for bibliometric analysis measures by benchmarking with other agencies.

Action taken, but not completed In 2008, the program will begin investigating the "h-index" for benchmarking with other agencies, and will use the resulting benchmarking information to negotiate meaningful targets by 2009. A sketch of the h-index computation appears below.
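For context, the h-index is the largest number h such that h of a body's publications have each received at least h citations. The following is a minimal illustrative sketch; the citation counts are hypothetical, not ERP data.

```python
# Minimal sketch of the h-index computation (hypothetical citation counts).
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([42, 17, 9, 6, 3, 1]))  # prints 4
```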

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Refine the questions used in independent scientific reviews to improve EPA's understanding of program utility and performance in relationship to environmental outcomes.

Completed Using feedback from a workgroup composed of OMB, ORD, and the BOSC, the program developed a set of questions that an independent panel could use to rate the program's progress and success. The program collected initial long-term measurement data during its 2007 mid-cycle BOSC review, earning an overall progress rating of "meets expectations" (http://www.epa.gov/osp/bosc/pdf/dwmc082007rpt.pdf). The program will collect formal long-term measurement data during its comprehensive BOSC review in 2009.
2006

Link budget resources to annual and long term performance targets by requesting and reporting Human Health Research and Ecosystem Research funding separately.

Completed In order to improve the linkage between budget resources and long-term performance targets, the Agency created sub-program-projects in the FY 2008 budget to allow for better distinction between the ecosystems and human health research programs.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Utility of ORD's causal diagnosis tools and methods for States, tribes, and relevant EPA offices to determine causes of ecological degradation and achieve positive environmental outcomes.


Explanation: This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating (and BOSC recommendations) as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/ecomc0708rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/ecomc0801resp.pdf.

Year Target Actual
2010 Exceeds Expectations
2014 Exceeds Expectations
Long-term Outcome

Measure: Utility of ORD's environmental forecasting tools and methods for States, tribes, and relevant EPA offices to forecast the ecological impacts of various actions and achieve positive environmental outcomes.


Explanation: This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating (and BOSC recommendations) as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/ecomc0708rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/ecomc0801resp.pdf.

Year Target Actual
2010 Exceeds Expectations
2014 Exceeds Expectations
Long-term Outcome

Measure: Utility of ORD's environmental restoration and services tools and methods for States, tribes, and relevant EPA offices to protect and restore ecological condition and services to achieve positive environmental outcomes.


Explanation: This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating (and BOSC recommendations) as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/ecomc0708rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/ecomc0801resp.pdf.

Year Target Actual
2010 Exceeds Expectations
2014 Exceeds Expectations
Long-term Outcome

Measure: States use a common monitoring design and appropriate indicators to determine the status and trends of ecological resources and the effectiveness of programs and policies.


Explanation: Data reflect the number of States with which the program has worked collaboratively to assist in using a common monitoring design and developing appropriate indicators.

Year Target Actual
2008 35
2011 50
Annual Output

Measure: Percentage of Ecological Research publications rated as highly-cited publications.


Explanation: This metric provides a systematic way of quantifying research performance and impact by counting the number of times an article is cited within other publications. The "highly cited" data are based on the percentage of all program publications that are cited in the top 10% of their field, as determined by "Thomson's Essential Science Indicator" (ESI). Each analysis evaluates the publications from the last ten-year period, and is timed to match the cycle for independent expert program reviews by the Board of Scientific Counselors (BOSC). This "highly cited" metric provides information on the quality of the program's research, as well as the degree to which that research is influencing the scientific community. As such, it is an instructive tool both for the program and for independent panels, such as the BOSC, in their program reviews. To best establish ambitious and appropriate targets in the future, ORD will collect benchmarking information by conducting an analysis of bibliometric measures used in R&D programs outside of EPA. An illustrative sketch of this calculation follows the table below.

Year Target Actual
2005 baseline 19.4%
2007 20.4% 21.1%
2009 21.4%
2011 22.4%
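The calculation behind the "highly cited" percentage reduces to a simple share: the fraction of program publications whose citation counts clear their field's top-10% threshold. A minimal sketch follows; the field thresholds and publication records are hypothetical stand-ins, not ESI data.

```python
# Illustrative sketch of the "highly cited" share (hypothetical data, not ESI).
def highly_cited_share(publications, top10_thresholds):
    """publications: (field, citation_count) pairs.
    top10_thresholds: field -> minimum citations to rank in the field's top 10%."""
    hits = sum(1 for field, cites in publications
               if cites >= top10_thresholds[field])
    return 100.0 * hits / len(publications)

pubs = [("ecology", 120), ("ecology", 15), ("toxicology", 60),
        ("ecology", 90), ("toxicology", 8)]
thresholds = {"ecology": 80, "toxicology": 50}
print(f"{highly_cited_share(pubs, thresholds):.1f}%")  # prints 60.0%
```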
Annual Output

Measure: Percentage of Ecological Research publications in "high-impact" journals.


Explanation: This measure provides a systematic way of quantifying research quality and impact by counting those articles that are published in prestigious journals. The "high impact" data are based on the percentage of all program articles that are published in prestigious journals, as determined by "Thomson's Journal Citation Reports" (JCR). Each analysis evaluates the publications from the last ten-year period, and is timed to match the cycle for independent expert program reviews by the Board of Scientific Counselors (BOSC). This "high impact" metric provides information on the quality of the program's research, as well as the degree to which that research is influencing the scientific community. As such, it is an instructive tool both for the program and for independent panels, such as the BOSC, in their program reviews. To best establish ambitious and appropriate targets in the future, ORD will collect benchmarking information by conducting an analysis of bibliometric measures used in R&D programs outside of EPA.

Year Target Actual
2005 baseline 19.3%
2007 20.3% 20.8%
2009 21.3%
2011 22.3%
Annual Output

Measure: Number of States using a common monitoring design and appropriate indicators to determine the status and trends of ecological resources and the effectiveness of programs and policies.


Explanation: Data reflect the number of States with which the program has worked collaboratively to assist in using a common monitoring design and developing appropriate indicators. The actual number for 2007 is an interim value. We expect that by the end of 2007 the target will be met.

Year Target Actual
2005 20 22
2006 25 25
2007 30 30
2008 35
2009 40
2010 45
2011 50
Annual Output

Measure: Percentage of planned outputs delivered in support of State, tribe, and relevant EPA office needs for causal diagnosis tools and methods to determine causes of ecological degradation and achieve positive environmental outcomes.


Explanation: At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2005 100% 100%
2006 100% 86%
2007 100% 100%
2008 100%
2009 100%
2010 100%
Annual Output

Measure: Percentage of planned outputs delivered in support of State, tribe, and relevant EPA office needs for environmental forecasting tools and methods to forecast the ecological impacts of various actions and achieve positive environmental outcomes.


Explanation: At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2005 100% 83%
2006 100% 100%
2007 100% 100%
2008 100%
2009 100%
2010 100%
Annual Output

Measure: Percentage of planned outputs delivered in support of State, tribe, and relevant EPA office needs for environmental restoration and services tools and methods to protect and restore ecological condition and services to achieve positive environmental outcomes.


Explanation: At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2005 100% 50%
2006 100% 100%
2007 100% 100%
2008 100%
2009 100%
2010 100%
Annual Efficiency

Measure: Percent variance from planned cost and schedule.


Explanation: This measure captures the ability of the program to increase cost effectiveness based on the extent to which it delivers annual research outputs relative to the amount of funds spent. Using an approach similar to Earned Value Management, the data are calculated by: 1) determining the difference between planned and actual performance and cost for each Long-Term Goal, 2) adding these data together to generate program totals, and 3) dividing the Earned Value of all work completed by the Actual Cost of all program activities. 100 percent or above represents an ideal level of cost effectiveness. A worked sketch of this calculation follows the table below.

Year Target Actual
2004 Baseline -8.1%
2005 N/A -15.6%
2006 -13.6% -1.2%
2007 -11.6% Data Lag
2008 -9.6% Data Lag
2009 -7.6%
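A minimal sketch of the Earned-Value-style calculation described above, assuming the variance is reported as the percentage by which earned value departs from actual cost; the goal-level figures are hypothetical, not program data.

```python
# Sketch of the EVM-style cost/schedule variance (hypothetical figures).
def percent_variance(goals):
    """goals: (earned_value, actual_cost) per Long-Term Goal, same units.
    Returns percent variance; 0% or above meets the planned cost/schedule."""
    total_ev = sum(ev for ev, _ in goals)
    total_ac = sum(ac for _, ac in goals)
    return 100.0 * (total_ev / total_ac - 1.0)

# Three Long-Term Goals: earned value vs. actual cost, in $ millions
print(f"{percent_variance([(30, 33), (25, 26), (20, 22)]):+.1f}%")  # -7.4%
```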

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Ecological Research Program (ERP) is to provide the scientific understanding to measure, model, maintain, and/or restore, at multiple scales, the integrity and sustainability of highly valued ecosystems now and in the future. This purpose is described in the program's Multi-Year Plan, and illustrated in the logic model (1, 2). Specifically, the ERP provides 1) monitoring of ecosystem condition that verifies whether ecosystems are improving or degrading as a result of management actions; 2) methods and models that can diagnose the cause of degradation to ecosystems when they are not improving, and forecast the future conditions that will occur under alternative management strategies to achieve improvements; and 3) restoration strategies that are cost-effective and stakeholder-driven in the event such action is required. In its 2005 review of the program, EPA's Board of Scientific Counselors (BOSC) reported that "the program is developing major tools for measuring environmental health, and these tools are being adopted in the field" (3). The research conducted by the program is specifically authorized under a number of statutes, including the Clean Water Act, Clean Air Act Amendments, and Toxic Substances Control Act (4-6). In addition, Executive Order 12866 states that "In deciding whether and how to regulate, agencies should assess all costs and benefits of available regulatory alternatives, including the alternative of not regulating. Costs and benefits shall be understood to include both quantifiable measures (to the fullest extent that these can be usefully estimated) and qualitative measures of costs and benefits that are difficult to quantify, but nevertheless essential to consider. Further, in choosing among alternative regulatory approaches, agencies should select those approaches that maximize net benefits (including potential economic, environmental, public health and safety, and other advantages; distributive impacts; and equity), unless a statute requires another regulatory approach." (7) A key purpose of the ERP is to provide information needed to evaluate the benefits to the environment of regulations covered by this Executive Order and Executive Order 13422 (8), which expands the previous Executive Order to include guidance documents.

Evidence: 1. Ecological Research 2003 Multi-Year Plan (Pages 6-11) 2. Logic model 3. Ecological Research Program BOSC Review (2005), page 2 4. Clean Water Act (CWA) Title I (33 USC 1251-1271) pages 3, 6-8 5. The Clean Air Act Amendments, Section 109 (highlighted), pages Sect. 231(i)(3) - "research on preventing, measuring and controlling emissions [from ethanol use in diesels] and evaluating associated health and environmental risks."Section 231 (m)(1)(C)(4) - [conduct research and improve monitoring] on deposition monitoring methods [related to atmospheric deposition to Great Lakes and Coastal Waters]. 6. Toxic Substances Control Act, Section 2609 (highlighted) 7. Executive Order 12866, http://www.epa.gov/fedrgstr/eo/eo12866.htm 8. Executive Order 13422

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: EPA's 2003 Draft Report on the Environment found that the data used to determine the condition of the nation's surface waters were scientifically inadequate to verify that surface water quality was improving at regional and national scales (1,2). This indicated a need for greater accountability (3). Similar work by the Heinz Center also found that there are significant data gaps that prevent effective reporting on key indicators of the condition and use of ecosystems, which limit the capacity for informed decision making (4,5). A review by GAO stated "GAO recommends that the Chair of CEQ develop institutional arrangements needed to ensure a concerted systematic and stable approach to the development, coordination, and integration of environmental indicator sets. Moreover, GAO recommends that the EPA Administrator establish clear lines of responsibility and accountability and identify specific requirements for developing and using indicators. CEQ and EPA generally agreed with GAO's recommendations." (6) To address these challenges and support EPA's strategic goals, the ERP provides scientific information and tools for use by all EPA programs that consider ecosystems in policy decisions (7). An independent expert peer review panel (BOSC) supported this approach in its 2005 report, noting that "To achieve this broad goal, EPA must devise scientifically valid measurements of the health of the environment and this requires an intensive program of ecological research" (8). The ERP meets this specific need in a number of ways. For example, its work in developing the Environmental Monitoring and Assessment Program (EMAP), which included indicators and a statistically based monitoring design, provided the only data available for statistically defensible estimates, regionally and nationally, of the current condition and future trends of the effectiveness of surface water management policies. Through the Office of Water's adoption of the research outputs, the Agency will be able to verify that its actions are having the desired effect (9). The ERP research also addresses specific environmental needs at the state and local level. For example, the ERP's Regional Vulnerability Assessment Program (ReVA) provided the States of North and South Carolina, and the City of Charlotte and surrounding counties, with the results of alternative development strategies that allowed them to make management decisions to improve both air and water quality (10-12). This region is facing a severe test of its natural and man-made resources. The challenge for area planners is to guide growth with a regional perspective that sustains the environment and provides quality of life for residents. As another example, EPA Regions and other local communities are also applying the ReVA program to address their specific environmental needs (13,14). In the Science Advisory Board's (SAB) review of ReVA, the SAB commented "it is the opinion of the SAB that the suite of tools in ReVA can be exceptionally useful to local and regional resource managers for assessments of current and future regional conditions" (15). Another example is a project to identify the best methods for mapping seagrass distribution and estimating seagrass abundance in turbid estuaries of the Pacific Northwest. This project was initiated in response to a specific need identified by the EPA Region 10 office (16).
Given the uncertainty of the effects of global climate change, loss of habitat to urban sprawl, invasive species, non-point source pollution, increased water temperatures, wetland encroachment, and the cumulative and interactive effects of all of these on water, air, and land condition and services, environmental managers at all levels retain a clear interest in and need for tools to optimize their management decisions both financially and environmentally. The program's logic model shows how the ERP has focused, and will continue to focus, research on meeting these needs. (17)

Evidence: 1. 2003 Draft Report on the Environment, http://www.epa.gov/indicators/roe/html/roeEcoCha.htm 2. Limits of 305b data, OEI memo 3. Ecological Research 2003 Multi-Year Plan (Pages 6-12). 4. The State of the Nation's Ecosystems, Chapter 1 http://www.heinzctr.org/ecosystems/intro/reporting.shtml 5. Filling the Gaps Report, Executive Summary 6. GAO report http://www.gao.gov/new.items/d0552.pdf 7. EPA Strategic Plan, Introduction http://www.epa.gov/ocfo/plan/2006/04_cfo_introduction.pdf 8. Ecological Research Program BOSC Review (2005), pg 3 9. NHEERL EMAP Uses 10. PART 1.5 NERL request from OAQPS for ReVA support in SEQL 11. PART 1.5 NERL request for ReVA support in SEQL by Charlotte 12. SEQL website http://www.seql.org/ 13. NERL R5 Support Letter 14. PART 1.5 NERL request for ReVA in Reg 4 15. NERL ReVA SAB Advisory, page vii, Executive Summary 16. NHEERL RARE report - addresses a specific need identified by Region 10 (Executive Summary, pages v-vii) 17. Draft logic model for revised MYP 2009 to 2013

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The ERP is designed to be non-duplicative of other ecological research efforts and produce research that directly supports the EPA's mission and needs. This is achieved through extensive coordination and collaboration internally at EPA and with other federal R&D programs to provide unique research that fills gaps in scientific knowledge and tools.

Evidence: 1. Charter, Subcommittee on Ecological Systems, Committee on Environment and Natural Resources, National Science and Technology Council http://www.ostp.gov/NSTC/html/committee/chartercenr.html 2. MOA w/ NOAA for NCA, pg 1 - "Through this agreement, NOAA and EPA will combine assessment and monitoring efforts to prevent duplication of efforts, and bring together complementary resources" 3. NERL MOU NOAA 4. NERL MLRC IAG 5. NERL MRLC Sect 1.3 6. NERL NOAA MOA 7. NERL SWGAP IAG 8. NHEERL IAG NOAA (western Pilot) 9. NHEERL IAG NOAA (agreement) 10. NHEERL IAG w/USGS 11. IAG w NPS for NCA (see page 5-6 on duplication of effort) 12. IAG w USGS for NCA 13. EPA v USGS v FWS 14. NRMRL Ecosystem Restoration Program Collaboration 2000-2006 15. NERL MRLC Memo 16. NERL SWGAP IAG Memo 17. NHEERL Decision Memo on IAG w/ USGS 18. Distinctions - Eco & WQ

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The 2005 BOSC review concluded that the overall program design is sound. Specifically, the report states "The three LTGs are well designed and follow a multiple-scale, hierarchical framework that facilitates addressing environmental issues at the national, state, and local levels. LTGs are well articulated and are not only relevant, but also crucial, to the overall mission of EPA." (1) It also states that "The ERP has a logical and comprehensive design, which is adequate for ORD's planning process and for demonstrating progress toward its overall goals." (2) and "... the ERP has developed a probability-based design and sampling framework that is solid in theory and efficient in practice." (3) At the project/laboratory and center level, the EPA laboratories and centers conduct independent reviews of the programs. While there are always specific recommendations for improvements, the results of these reviews generally show individual components of the program are well designed (5-11). For example, the review of ReVA concluded that "the suite of tools in ReVA can be exceptionally useful to local and regional resource managers for assessments of current and future regional conditions." The report noted a number of limitations and suggested improvements, but no major design flaws (10). The Peer Review of CADDIS concluded that "CADDIS is a necessary tool to advance the practice of stressor identification." While the reviewers identified improvements and additional case studies that would be useful in improving the utility of CADDIS, there were no major flaws identified (11). Further evidence supporting a flaw-free design is the significant number of publications in all parts of the program, which undergo internal and external review to identify just such problems. To this end, the program has published over 250 articles in peer-reviewed journals in the last two years. The Program will be proposing changes to its goals and directions in 2008 and beyond. These changes, as per ORD policy, will undergo several levels of review, beginning with an SAB review of any redirection in September 2007.

Evidence: 1. BOSC Review 2005, (p. 12, para 4) 2. BOSC Review 2005, (p.12, para 2) 3. BOSC Review 2005, (p. 6, para 3) 4. AED Peer Review 5. GED Peer Review 6. MED Peer Review 7. WED Peer Review 8. GWERD 2005 Peer Review 9. GWERD 2006 Re-Review 10. NERL ReVA SAB Advisory 11. CADDIS Peer Review

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: To ensure the intended beneficiaries are reached, the Program interacts directly with all major EPA Program Offices in the development of its overall and annual research planning (1). The National Program Director also meets with senior representatives of the Program Offices and Regions to verify high priority needs for both the immediate and longer-term future. Many offices and regions provide priority lists of their needs for consideration (2-7), and those are factored into the development of the MYP. The program also responds to specific needs that fall within the scope of the program as they are identified. This type of coordination within the planning process has resulted in improved research products that provide direct benefits to program clients. A key example is the use of EMAP results. In the Draft 2007 Report on the Environment, many of the indicators rely on EMAP-related databases (8). As another example, EPA Regions identified specific needs to measure environmental outcomes that were met by the ERP through the Regional Vulnerability Assessment program (9-12). Other examples of the ERP's research reaching its intended beneficiaries are individual tools, such as the Causal Analysis/Diagnosis Decision Information System (CADDIS), that have directly contributed to the work of State and Local partners (13-18). Further evidence that the ERP's design is providing direct benefits to its clients are the client testimonials provided to the expert peer panel (BOSC) as part of its program review (19-24). The ERP also uses a variety of ways to transfer its research and technology to intended beneficiaries, including peer-reviewed articles, seminars, training, consultations, web/video conferences, desktop versions, and web-based dissemination (25).

Evidence: 1. Ecological Research 2003, Multi-Year Plan, pg 3 list of authors and contributors 2. BOSC review of Ecological Research Program (2005) 3. NRSC SPC presentation 4. OWOW New eco research areas 5. OWOW Priorities 2004 6. OWOW top ten 7. FY2007 Research Needs 8. Use of EMAP for the 2007 ROE 9. NERL Region3 RA Letter - This need was met by NERL ReVA Program Toolkit development. 10. PART 1.5 NERL request for ReVA in Reg 4 11. PART 1.5 NERL request for ReVA support in SEQL by Charlotte 12. PART 1.5 NERL request from OAQPS for ReVA support in SEQL 13. CT CADDIS letter "Identification of the cause of impairment is critical to developing a TMDL or mitigation strategy... The SI procedure developed by ORD has proven to be a particularly effective tool for that purpose." 14. IA CADDIS letter "IDNR believes that additional SI technical guidance and resources, particularly the Causal Analysis/Diagnosis Decision Information System, will be very useful in the agency deal with SI needs" 15. CADDIS program quotes 16. NHEERL EMAP Uses 17. NERL_WDT_Support_Letter "The WDT will be very helpful in our projects and management scenario evaluations as it provides coverage of the entire geography and projects loading reductions from current authorities under the CAA and the CAIR." 18. NERL_CMAC_UCD_support_letter "Results of the CMAQ-UCD model will provide resource managers in Tampa Bay with 1) greatly improved estimates of both direct and indirect deposition of nitrogen to the bay and watershed..." 19. BOSC - CSU presentation 20. BOSC - Centalina Presentation 21. BOSC - MD DNR presentation 22. BOSC - OAR presentation 23. BOSC - OW presentation 24. BOSC - WI DNR presentation 25. Table of projects - "Transfer of Information" column on "Forecasting Measure" tab and "Restoration-Eco Services" tab.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has four long-term performance measures focused on outcomes, and all were previously established in consultation with OMB during the 2005 PART review. The first three measure the utility of ERP outputs in contributing towards the program's clients' achievement of positive environmental outcomes (1). Specifically, these measures assess the extent to which States, tribes, and relevant EPA offices have improved their ability to 1) determine causes of ecological degradation through the application of recently developed ORD causal diagnosis tools and methods; 2) forecast the ecological impacts of various actions through the application of recently developed ORD environmental forecasting tools and methods; and 3) protect and restore ecological condition and services through the application of recently developed (past five years) ORD environmental restoration and services tools and methods. Progress towards achieving each outcome will be rated by an independent expert peer review panel (ORD's Board of Scientific Counselors) using a standardized methodology of well-defined adjectives in addition to a summary narrative providing context and rationale for the chosen rating (2). This methodology was developed in partnership with OMB and independent experts. The fourth measure assesses the number of States using a common monitoring design developed by the Program to determine status and trends of any ecosystem chosen for study. These measures are meaningful because they evaluate the utility of the program's products to its clients in meeting the needs discussed in Question 1.2. New long-term measures are being established to assess the percentage of program publications that are highly cited and published in high-impact journals (3,4). These measures have been standardized for ORD programs, in consultation with OMB during the 2006 PART process, because they evaluate the extent to which the outputs of ORD programs are being used by the larger scientific community, which is an indication of both the quality and relevance of the program. These measures, which will capture data on a biennial basis, replace one of the program's annual measures that was previously established during the 2005 PART review to capture similar data.

Evidence: 1. PART long-term goals and measures (PART-Web "Measures" tab) 2. BOSC Subcommittee Handbook, Appendix B, pg 17-18 3. Performance Measurement: Standards for ORD Bibliometric Analysis 4. Performance Measurement: Methodology for Bibliometric Analysis

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The ERP has set a target of "Exceeds Expectations" for its first three long-term outcome measures, as measured by an independent expert peer review panel (1). This means the program is committed to meeting all of its key annual research goals while exceeding expectations in scientific quality and/or speed (2). The BOSC mid-cycle review in May will provide a single program rating on its progress. Although this rating will not be sufficient to establish formal baseline data for the 3 measures, it should be sufficient for establishing future targets. For the fourth outcome measure, the long-term target is for all 50 States to use a common monitoring design by 2011, with an interim target of 35 by 2008. This is an ambitious increase of 5 States per year, starting in 2005, because there is no requirement or financial incentive for States to make the necessary changes to begin using the common monitoring design. For the new long-term measures, 2005 data show that 19.4% of the Ecological Research Program publications were highly-cited papers and that 19.3% of the papers were published in high-impact journals. The targets for these measures are a 1% increase over each 2-year cycle (3). This is ambitious because, based on the standardized methodology, which examines the relative impact of publications over a 10-year window, only a small portion of the ERP publications included in the biennial analyses are newly released. That means a higher proportion of the newest publications must be ranked as "highly cited" or "high impact" to achieve these increased targets on a near-term basis; a worked example of this arithmetic appears below.
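A back-of-the-envelope illustration of that ambitiousness, using hypothetical publication volumes (not program data): under a rolling ten-year window, only the newest papers can move the overall percentage, so they must be highly cited at a rate well above the target itself.

```python
# Hypothetical volumes: why a 1% gain per 2-year cycle is demanding
# under a rolling 10-year publication window.
window_years, papers_per_year = 10, 100      # assumed publication volume
baseline_rate, target_rate = 0.194, 0.204    # 2005 baseline -> 2007 target

window = window_years * papers_per_year      # 1,000 papers in the window
turnover = 2 * papers_per_year               # 2 years roll out, 2 roll in

kept_hits = baseline_rate * (window - turnover)  # highly cited papers retained
needed_hits = target_rate * window               # hits needed to meet the target
required_new_rate = (needed_hits - kept_hits) / turnover
print(f"new papers must be highly cited at {required_new_rate:.1%}")  # ~24.4%
```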

Evidence: 1. PART long-term goals and measures (PART-Web "Measures" tab) 2. BOSC Subcommittee Handbook - Methodology for Rating, Appendix B pg 17-18 3. Methodology for bibliometric analysis 4. Standards for ORD Bibliometric analysis

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The ERP annually tracks the number of States using a common monitoring design. This measure was established during the 2005 PART review and provides an annual indicator of progress towards the long-term target of 50 States using a common monitoring design and indicators to determine ecological status and trends. The program previously established an annual measure that tracks citations of ORD research in peer-reviewed journals (2), but these data are now captured biennially in the program's long-term measures. Therefore, the ERP has established four new annual measures, one for each of the program's four long-term goals, that track progress towards completion of key research outputs (3,4). Tracking completion of these key outputs on an annual basis provides a better indicator of progress toward achieving the program's long-term outcomes.

Evidence: 1. PART measures tab. 2. Methodology for measure on completion of APMs 3. ORD's performance measurement handbook - DRAFT 4. ORD's performance measurement policy - DRAFT

YES 10%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The program has established baselines and ambitious targets for all of its annual measures (1). The annual target for the measure of States using the common monitoring design is an increase of 5 States per year. For the new annual measures that track the completion of key research outputs, the program has set ambitious targets of annually completing 100% of its key research outputs under each of the four long-term goals (2). These targets are ambitious because program planning assumes full utilization of available resources, and these conditions may change with time (e.g., key personnel may leave for other positions or resource limitations may affect specific programs). Additionally, unexpected reductions to the appropriation after the start of a fiscal year can also result in delayed or cancelled research activities. The annual targets for the measure that previously tracked citations of ERP research in peer-reviewed journals were an increase of 1% every two years, which conforms to the targets for the new long-term measures tracking the impact of publications (3).

Evidence: 1. PART measures tab 2. ORD's performance measurement handbook 3. Standards for ORD bibliometric analysis

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: EPA Program Offices and Regions all contribute towards the development of the ERP's long-term and annual goals (1,2). The Ecological Research Program also partners with contractors and other government agencies. For contractors, the Statements of Work (SOW) generally describe the goals of the program and how the planned work supports those goals (3-6). Agreements with other government partners also identify how the work supports the EPA goals (7-12). By signing these agreements the partners commit to supporting the goals of the program. These agreements also contain performance reporting requirements (13-14). However, the specific long-term goals of the program are not explicitly mentioned, nor could the program demonstrate regular performance reporting on progress made by program partners towards its long-term goals.

Evidence: 1. MYP 2003, pg 3 list of authors 2. Ecological Discussion Group 3. NERL_EERD_SOW 4. NHEERL MED SOW 5. NHHERL SOW from MED 6. NHHERL SOW from MED on Lake Superior 7. NHEERL IAG w/USGS 8. NHEERL IAG w/USGS (PCEIS) 9. NRMRL USGS IAG for Mine Bank Run: Routing Checklist, p 2 10. NRMRL On-Site Technical Support Contract Routing Checklist, p2 11. NRMRL USGS IAG for Mine Bank Run (NRMRL), p 5 12. 3-24 Co-Op linking to Program Goals with EMAP, box 2a 13. NRMRL On-Site Technical Support Contract, Progress Report (1) 14. NRMRL On-Site Technical Support Contract Prog Reports (2)

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: ORD is using the Board of Scientific Counselors (BOSC) to advise it on overall program performance and progress (1). External, independent evaluations of the overall Ecological Research Program are now conducted by the Board of Scientific Counselors every 4-5 years, with a mid-cycle review. The first such review was completed on March 7-9, 2005 in Research Triangle Park, NC (2). The BOSC Ecological Research subcommittee is composed of a distinguished body of scientists drawn from academia, industry, non-EPA government or state agencies, and the environmental field. BOSC designated federal officers use FACA procedures to develop this subcommittee and to manage the review process (3,4). As special government employees, panelists are required to take ethics training and to submit financial information to maintain independence and minimize/prevent conflicts of interest (5). The next BOSC review will take place in May 2007 and will assess progress made by the ERP on completing its recommendations since the 2005 review (6). Additionally, the EPA Science Advisory Board (SAB) conducts targeted reviews of elements of the Ecological Research Program on an as-needed basis; all of the major programs within the Ecological Research Program (i.e., EMAP, ReVA, CADDIS, RePLUS, STAR) have received an SAB review or some other external program element review in the last 5 years (7,8). The review of ReVA concluded that "the suite of tools in ReVA can be exceptionally useful to local and regional resource managers for assessments of current and future regional conditions." The report noted a number of limitations and suggested improvements (8). The Peer Review of CADDIS concluded that "CADDIS is a necessary tool to advance the practice of stressor identification." While there were no major flaws identified, the reviewers identified improvements and additional case studies that would be useful in improving the utility of CADDIS (9). Internally, independent evaluations are regularly conducted to inform program planning and evaluate the appropriateness of its direction (10-16).

Evidence: 1. BOSC Subcommittee Handbook, Appendix B Charge Questions, pg 17-18 2. BOSC review of Ecological Research Program (2005) 3. OSP BOSC Website (www.epa.gov/OSP/bosc) 4. OSP BOSC fact sheet 5. Confidential Disclosure for form Special Government Employees Serving on Federal Advisory Committees at the U.S. Environmental Protection Agency 6. Out-year BOSC schedule 7. Science Advisory Board (SAB) (http://www.epa.gov/sab/pdf/epec02001.pdf) 8. NERL SAB Advisory 9. CADDIS Peer Review Report, Executive Summary 10. NRMRL Division Reviews SOP, p. 9 (for scope, independence, etc.) 11. NRMRL GWERD 2005 Peer Review(2) 12. NRMRL GWERD 2006 Re-Review on Ecology Restoration Program 13. AED Peer Review Report 14. GED Peer Review Report 15. MED Peer Review Report 16. WED Peer Review Report

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Ecological Research Program budgets at the long-term goal level, clearly defining the relationship between long-term performance targets and resources on an annual basis. Additionally, the detailed budget requests are aligned with the program priorities and outputs identified in the Multi-Year Plan. The program's budget presentation in the annual congressional justification contains information on how funding shifts or reductions affect the program's performance. In an effort to better integrate budget and performance, the program recently adopted an efficiency measure, "percent deviation from planned cost and schedule," against which progress will be reported in future Congressional Justifications. Additionally, the program is working to provide more transparent discussions in budget documents regarding how shifts in resources are based on program performance and feedback from the BOSC, and the extent to which resource levels are expected to affect future performance.

Evidence: 1. FY 2008 Congressional justification http://www.epa.gov/ocfo/budget/2008/sciencetech.pdf (pages 104-113) 2. MYP 2003 (pages 47-80) 3. PART Measures Tab - Efficiency Measure Explanation

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The BOSC, the primary review body for the Ecological Research Program, completed its review on March 7-9, 2005. The BOSC suggested numerous changes that might improve the effectiveness of the program (1). In its response to the BOSC review, EPA identified specific actions that would be taken to address the deficiencies identified (2). Using the results of the BOSC review, as well as the early PART review, the Ecological Research Program has taken a number of steps to improve the program. The program has developed a progress report and a summary table for the upcoming BOSC mid-cycle review which demonstrate how the proposed changes were implemented (3, 4). For example, in response to the BOSC recommendation to improve integration across the long-term goals, ORD instituted a series of teams to develop cross-lab programs in place-based goals, accountability, diagnostics and forecasting, ecological forensics, and ecological services. For ecological services, the ERP is focusing its efforts not only on the development of this concept as one of the long-term goals, but also on the infusion of this concept into many of its research efforts under the other long-term goals. The mid-cycle review will potentially highlight other possible improvements, and the same process will be repeated. Additionally, research divisions within the program also undergo peer reviews to evaluate strategic planning. These reviews are done every five years, with a mid-cycle evaluation completed to assess progress made on addressing the review's recommendations. Generally, the mid-cycle reports found significant progress had been made since the first evaluation (5-12). For example, the follow-up evaluation of the Ground Water and Ecology Division stated "...it was clear that exceptional progress had been made. Most impressive were the efforts to articulate a coherent strategic plan and to implement the salient features of the plan into action in the very short time of just over one year" (12). Additionally, the Office of Research and Development is preparing to issue new policies and procedures for extramural awards that will require award documents to be more transparent about the extent to which the program's partners are committed to the program goals.

Evidence: 1. BOSC Review of the Ecological Research Program, 2005, page 2 2. ORD response to the BOSC review 3. Draft progress report information provided to the BOSC for the May 2007 review (under development) 4. Summary table of BOSC recommendations (under development) 5. NRMRL GWERD 2006 Re-Review on Ecology Restoration Program - "...it was clear that exceptional progress had been made. Most impressive were the efforts to articulate a coherent strategic plan and to implement the salient features of the plan into action in the very short time of just over one year" - pg 2 6. AED Peer Review 7. AED Mid-Cycle Report - "Importantly, significant strides have been made by the division since its initial review... there have been substantial changes in research direction, and the enthusiasm shown by division staff is evident" - pg 2 8. GED Peer Review 9. GED Mid-Cycle Report - "As recommended, the division made many adjustments in its priorities, consolidating and even eliminating certain areas of research. This is commendable" 10. MED Peer Review 11. MED Mid-Cycle Report - "The reviewers were pleased with the progress made by MED in addressing the issues raised during its 2002 peer review" 12. WED Peer Review - "WED scientists have made excellent progress on both continuing research and major initiatives since the last review. Much of this progress has been made in cutting edge research such as the work on gene flow or in areas that are not typically associated with EPA-ORD such as salmon ecology" - pg 5 13. NERL ReVA SAB Response 14. Integrating Partner Performance with Program Goals - Draft Policy

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: The program uses its Multi-Year Plan development process and contingency planning process to assess and compare the benefits of efforts within the program (1, 2). This process involves soliciting input from client offices and determining where the program can have the greatest impact (3, 4). However, despite the high level of collaboration and coordination with other agencies to avoid duplication of work, the program has not formally compared the potential costs and benefits of its research efforts relative to those of other agencies.

Evidence: 1. MYP 2003, pages 3-11 2. OWOW New eco research areas 3. OWOW Priorities 2004 4. OWOW top ten 5. FY2007 research needs

NO 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: ORD has a process to prioritize current and planned research activities as well as additional research that could be undertaken (1). ORD's research planning process determines the research that needs to be accomplished in order to meet Annual Performance Goals (APGs) and Long-Term Goals (LTGs). To the extent that budget reductions prevent the program from completing all research needs within the originally anticipated timeframe and require resource allocation adjustments, program managers continually identify lower and higher priority research through active discussion and participation with Program Offices and Regions (2-5). These discussions ensure that budget requests and funding decisions will allow the most critical research to be completed in a timely manner.

Evidence: 1. MYP 2003 2. OWOW New eco research areas 3. OWOW Priorities 2004 4. OWOW top ten 5. FY2007 research needs

YES 10%
Section 2 - Strategic Planning Score 70%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The ERP regularly collects performance information from within the program and from its partners. The program tracks the status of and progress on its key research outputs quarterly (1-5). The ERP also collects performance data on both intramural and extramural scientific publications through bibliometric analyses that provide quantitative indicators of scientific quality and impact (6-7). Both contractors and recipients of assistance agreements provide the ERP with regular reports on progress made toward program deliverables and outputs (8-15). Consistent with NAS recommendations on evaluating research performance, the overall program, specific research divisions, research areas, and projects within the ERP all undergo regular independent peer review to systematically evaluate the scientific credibility and quality of research projects and products (16). Program partners are an integral part of the program. To improve determinations of status and trends for ecological resources, the ERP tracks how many states have implemented a common monitoring design and appropriate indicators (17). However, even though the program collects and tracks performance information, it is not apparent how the program incorporates it into management decisions and actions such as budget requests.

Evidence: 1. Quarter 1 APM Performance Tracking Memo 2. Quarter 2 APM Performance Tracking Memo 3. Quarter 3 APM Performance Tracking Memo 4. Quarter 4 APM Performance Tracking Memo 5. Ecosystem's APM list & status 6. 2005 Bibliometric Analysis 7. 2007 Bibliometric Analysis 8. EPA Policy on Compliance, Review and Monitoring, pg 3 9. NHEERL Performance Results under Assistance Agreement I 10. NHEERL Performance Results under Assistance Agreement II 11. NHEERL Performance Results under Assistance Agreement III 12. NHEERL REMAP Assistance Agreement Performance Results 13. NRMRL - On-site technical support services progress report I 14. NRMRL - On-site technical support services progress report II 15. Grantee Progress Report 16. Implementing the Government Performance and Results Act for Research (National Academy of Sciences), pg 37, "The most effective technique for evaluating research programs is review by panels of experts..." 17. FY 2005 Ecological Research PART, Measures tab

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Key managers within the Ecological Research Program include the National Program Director, Lab/Center Directors, Lab/Center Associate Directors, and Division Directors (1). The ERP incorporates program performance into personnel performance evaluation criteria. In their mid-year and end-of-year performance reviews, senior managers are held accountable for specific performance standards relating to quality and program goals, including progress toward achieving the targets and timelines described in the multi-year plan (2-5). Contractors are explicitly held accountable for quality, deliverables, costs, and schedules in evaluation criteria and in the statements of work (6-14). Contractors provide the program with monthly progress reports that describe progress made on deliverables and costs incurred to date (15-16). Additionally, the program's project officers numerically rate contractors on their performance in areas such as cost control, timeliness, business relations, and customer satisfaction (17-19). Past performance is considered in contract awards and renewals (20). Additionally, other assistance agreement partners also provide progress reports describing performance and costs, and quality management plans that specify requirements for rectifying situations should deficiencies be uncovered (21-30). The program's project officers are responsible for seeing that agreements are awarded and managed according to government regulations.

Evidence: 1. List of key managers 2. NHEERL SES Performance Agreement 3. ALD PARS Agreements 4. NERL Division Director Performance Standard 5. PI PARs 6. NHEERL QA Plan - "Requirements for Extramural Research", .pdf pg 23 (pg 18 hard copy) 7. 3-24 IAG w/USGS for NCA - Agreement on QAPP 8. QAPP Adherence Report 9. Performance Audit 10. EPA Contracts Management Manual Section 15.2 pg 275 pdf (pg 3 hard copy) 11. Project Officer Evaluation Template for Contracts 12. Project Officer Checklist for Invoice Review 13. Invoice Review Process in EPA Contract Management Manual 14. Management: EASYLite System for Invoice Payments_ORMA 15. NRMRL - On-site technical support services progress report I 16. NRMRL - On-site technical support services progress report II 17. NERL_PerfRep 18. NERL_Report 19. Template Contractor Performance Report 20. EPA Contracts Management Manual - Use of performance information section 21. NHEERL Assistance Agreement Terms and Conditions II 22. NHEERL Assistance Agreement Accountability - Details requirements for costs, performance, and schedule from an Assistance Agreement 23. NHEERL COOP Conditions and Agreements for Accountability 24. NHEERL Performance Results under Assistance Agreement I 25. NHEERL Performance Results under Assistance Agreement II 26. NHEERL Performance Results under Assistance Agreement III 27. NHEERL REMAP Assistance Agreement Results 28. NCER Grantee Site Visit 1 29. NCER Grantee Site Visit 2 30. NCER Grantee Site Visit 3

YES 12%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: The ERP develops a yearly operating plan and obligates its funds by object class (1). Resource tracking is also done at the long-term goal level. Both the program and its partners take resource needs into account when establishing schedules for obligations. For the past three years, the percentage of funds obligated was 101.1%, 102.7%, and 91.6%, with seven months remaining on the FY06 two-year appropriation (1). Obligations and expenditures are tracked against the Operating Plan in the Agency's Integrated Financial Management System (IFMS). Fund transfers between program objectives in excess of Congressionally established limits and/or program direction require Congressional notification and/or approval. Monthly progress reports from contractors are reviewed by the program's project officers and contracting officer's representatives to compare costs against actual progress made (3-5). Example protocols applied to contracts, IAGs, assistance agreements, and purchase cards show the procedures in place for preventing erroneous payments (6-9). Additionally, post-award monitoring of assistance agreements includes monitoring the draw-down of funds against progress on workplan tasks and deliverables; this monitoring ensures that recipients are spending the funds designated to each program area for the intended purpose (10). EPA received a clean audit opinion and had no material weaknesses in its latest financial statement audits (11-13).

Evidence: 1. Finance Report showing BOC Breakout 2. EPA Contract Management Manual - Invoice Review Process 3. EASYLite System for Invoice Payments 4. NRMRL - On-site technical support services progress report I 5. NRMRL - On-site technical support services progress report II 6. NERL Extramural Protocol 7. NERL IAG Protocol 8. NERL Purchase Card protocol 9. NERL Assistance Agreements Protocol 10. EPA Policy on Monitoring Assistance Agreements 11. Information on EPA's Budget Automation System 12. Information on EPA's Resources Management Directive 13. OIG Audit Opinions and EPA Financial Statements FY04 - FY06

YES 12%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The ERP has an efficiency measure with baseline data and targets to track the percentage variance from the planned cost and schedule for delivering key program outputs that contribute toward achieving long-term outcomes (1). This measure incorporates elements of cost and performance for important annual decision points. Additionally, the ERP works to promote cost effectiveness through its research in several ways. For example, probability designs developed by the program for assessing the status of ecological resources are more scientifically defensible, cheaper, and faster than conventional monitoring designs (2). The program also leverages expertise and benefits for its own work through collaborations with other research and development programs. These collaborations, such as the Multi-Resolution Land Characteristics Consortium (MRLC), increase the cost effectiveness of operations by providing the ERP greater access to extensive data sets produced by other Federal programs at a fraction of the total cost to produce them (3-7). As a major program under the Office of Research and Development (ORD), the ERP continuously participates in efforts to achieve operational and administrative efficiencies across ORD. For example, program savings achieved through information technology and administrative efficiencies are redirected toward mission-critical research and other needs to improve program performance. ORD's Information Technology Improvement Project achieved a savings of $2 million in FY 2007 by investing in a more powerful, shared platform for high-performance computing and reducing storage costs (8). ORD's Total Cost of Ownership Initiative created a standard desktop platform, established a centralized Call Center, and consolidated aspects of ORD's core computer infrastructure and maintenance to achieve an annual savings of $2 million starting in FY 2005 (9). These savings were reinvested in computational toxicology and human health risk assessment research, two high-priority areas for ORD. Finally, in 2005 ORD held five streamlined competitive sourcing competitions involving 22 administrative FTE (10-11).
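The cited PART measures documentation defines the efficiency measure precisely; as a minimal sketch, assuming the conventional definition of percentage variance against plan (the symbols and sign convention below are assumptions, not the program's published formula), the cost and schedule variance for a key output would be computed as

\[ V_{\text{cost}} = \frac{C_{\text{actual}} - C_{\text{planned}}}{C_{\text{planned}}} \times 100\%, \qquad V_{\text{schedule}} = \frac{T_{\text{actual}} - T_{\text{planned}}}{T_{\text{planned}}} \times 100\%, \]

where C is the cost and T the elapsed time to deliver the output. Under this illustrative convention, a variance of 0% means an output was delivered exactly on planned cost and schedule.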

Evidence: 1. PART Measures Tab - Efficiency Measure 2. ORD's Environmental Monitoring and Assessment Program, slide 11 3. MRLC IAG Memo, "That leveraging enables EPA-ORD to produce lynchpin data for meeting APMs at about 10% of the cost the EPA-ORD would incur if it tried to solely fund national land-cover mapping" (pg 4). 4. MRLC Sect 4.3 Explanation 5. NHEERL Decision Memo on IAG w/ USGS pg 6. NHEERL IAG NOAA (Agreement on Western Pilot), pg 9 of pdf discusses benefits to ERP through this collaboration 7. NHEERL IAG w/ USGS, pg 5 discusses benefits to ERP through this collaboration 8. IT Savings Report 9. TCO Progress Report 10. Competitive Sourcing Summary Sheet 11. Completed Competition Results

YES 12%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: An expert peer review panel (BOSC) stated in its 2005 report that the ERP "has a superb track record of collaboration with a variety of partners" (1). These collaborations include a variety of projects with Federal, State, and academic partners (2-15). One example of direct collaboration with other agencies is the National Coastal Assessment program (LTG 1), where reporting on the condition of the nation's coastal waters is a collaborative effort among EPA (ORD and Office of Water), NOAA (National Ocean Service and National Marine Fisheries Service), and DOI (US Geological Survey and US Fish and Wildlife Service). These six Federal groups bring specific, non-redundant monitoring element results into the production of a single National Coastal Condition Report (16-18). As part of the BOSC review, the Ecological Research Program demonstrated that it had engaged in nearly 450 non-STAR grant collaborations with other Federal agencies, state resource agencies, tribes, and academia in the period 2000-2005. In addition, the STAR grants program dispensed several hundred grants to academic institutions, many of which became direct collaborations with EPA researchers. Other major collaborations include working with federal agencies such as the Department of Energy, USGS, the Army Corps of Engineers, and USDA on issues related to multimedia environmental models, and participating in the Multi-Resolution Land Characteristics Consortium (MRLC), which combines land-cover data from a number of Federal agencies (19-21).

Evidence: 1. BOSC Eco program review, pdf page 27 (pg 23 hard copy) 2. NERL MOU NOAA 3. NERL NOAA MOA 4. NERL SWGAP IAG Memo 5. NERL USDA IAG Memo 6. NHEERL Decision Memo on IAG w/ USGS 7. NHEERL IAG with USDA Forest Service 8. NHEERL IAG NOAA (Western Pilot) 9. NHEERL IAG w/ USGS (PCEIS) 10. NRMRL - USGS IAG for Minebank Run 11. NRMRL Ecosystem Restoration Program Collaboration 12. BOSC - CSU presentation 13. BOSC - Centalina presentation 14. BOSC - MD DNR presentation 15. BOSC - WI DNR presentation 16. IAG w/ USGS for NCA 17. MOA w/ NOAA for NCA 18. IAG w NPS for NCA 19. Ecological Research Multi-Year Plan, pg 18-19, 21, and 32-33 20. NERL MRLC Memo 21. NERL MRLC Sect 3.5

YES 12%
3.6

Does the program use strong financial management practices?

Explanation: The ERP follows EPA's financial management guidelines for committing, obligating, reprogramming, and reconciling appropriated funds. Agency officials have a system of controls and accountability (EPA's Resources Management Directives System), based on GAO, Treasury, and OMB guidance as well as generally accepted accounting practices, to minimize improper payments (1-6). Protocols and procedures are also in place that strengthen financial management (7-10). The program is served by Funds Control Officers (FCOs) who have documented experience and/or training in EPA's budget execution and financial management systems (11). The program has no material weaknesses as reported by the Office of the Inspector General (OIG) and has procedures in place to minimize erroneous payments (12-17).

Evidence: 1. EPA 2006 Annual Plan 2. EPA Records Schedule BAS 3. Project Officer Checklist for Invoice 4. EPA Electronic Approval System for EASYLite Manual 5. ORD Acquisition Memorandum on Extramural training 6. EPA Contracts Management Manual Invoice Review 7. NERL Extramural Protocol 8. NERL IAG Protocol 9. NERL Purchase Card protocol 10. NERL Assistance Agreements Protocol 11. ORBIT Training Completion 12. EPA Annual Reports and Financial Statements 13. NERL FMFIA Letter 14. NHEERL FMFIA Letter 15. NCER FMFIA Letter 16. NCEA FMFIA Letter 17. NRMRL FMFIA Letter

YES 12%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: In the past, one of the program's shortcomings was the lack of permanent appointed program leadership. An acting director was appointed in the spring of 2005 until the position could be permanently filled. In October 2006 the ERP hired a national program director with sufficient authority and responsibility to manage the budget, establish a research agenda, develop measures of accountability, conduct program evaluations, and take the actions necessary to improve quality management. This served to strengthen existing management of the ERP and provide leadership in defining user-oriented research products (1-2). All labs and centers that are part of the ERP conduct systematic annual FMFIA reviews to identify and correct material weaknesses (3-7). Deficiencies noted in these reviews are corrected. Both overall program management and division-level management are peer reviewed by independent expert panels on a regular cycle. The overall program was peer reviewed in 2005 and has addressed all recommendations from the BOSC (8, 9). The program will receive a follow-up evaluation in May 2007 by the same panel to rate progress made in addressing its recommendations. The research divisions within the Ecosystems program also undergo peer reviews that cover a variety of management topics. Each division prepares a letter responding to each of the recommendations from the peer review committee (10-13). To assess progress made on these recommendations, each division has a follow-up review within two years. Generally speaking, the peer review committees have noted satisfaction with the progress made in responding to their recommendations (14-17). For example, in a 2006 follow-up report on the Ground Water and Ecosystems Restoration Division, an independent panel noted that "exceptional progress" had been made by the division in correcting deficiencies noted in its earlier 2005 report (18, 19).

Evidence: 1. NPD Memo, May 2005 2. E-mail on Ecology NPD, 10/16/06 3. NERL FMFIA Letter 4. NHEERL FMFIA Letter 5. NCER FMFIA Letter 6. NCEA FMFIA Letter 7. NRMRL FMFIA Letter 8. BOSC Eco Program Review 9. BOSC ORD Response 10. AED Response Letter 11. GED Response Letter 12. MED Response Letter 13. WED Response Letter 14. AED Mid-Cycle Report 15. GED Mid-Cycle Report 16. MED Mid-Cycle Report 17. WED Mid-Cycle Report 18. NRMRL GWERD 2005 Review 19. NRMRL GWERD 2006 Follow-Up Review on Ecology Restoration Program, pg 2

YES 12%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: In its 2005 report, the BOSC panel stated that the overall quality of the program's research was "superior" and that stringent quality assurance practices are followed (1). The different labs and centers producing research within the ERP have unique capabilities justifying their funding (2-10). Research projects within the ERP develop a "Quality Assurance Project Plan" before the project is initiated, and all resulting findings and products are peer reviewed for scientific quality and credibility according to Agency standards (11-15). Additionally, all contractors and grantees performing research are required to develop and adhere to a quality assurance plan (16-19). Internal funding is allocated to high-priority areas as determined by ORD's planning process and internal programmatic reviews (20). Extramural funding allocations are also made to other Federal programs with unique capabilities and expertise through vehicles such as IAGs (21).

Evidence: 1. BOSC Eco Program Review 2. BOSC Review NCEA 2003 3. BOSC Review NCER 2003 4. BOSC Review NERL 2003, "The risk paradigm is the critical tool for integrating NERL's work with that of other EPA Laboratories and Centers and key customers. NERL places itself at the "source-to-dose" end of the risk paradigm. At the "source" point, NERL interfaces with the National Risk Management Research Laboratory (NRMRL) concerning source characterization of pollutants and stressors. At the dose end, a major interface is with the National Health and Environmental Effects Research Laboratory (NHEERL), where NERL focuses on pathways of exposure to dose, and NHEERL on dose-response relationships" 5. NERL EERD Website 6. NERL ERD Website 7. BOSC Review NHEERL 2003 8. Atlantic Ecology Division Website 9. BOSC Review NRMRL 2003 10. NRMRL GWERD Website 11. NHEERL Quality Management Plan 12. EPA Peer Review Guidelines 13. NHEERL LMMB Peer Review Charge - Example of peer review 14. NHEERL Peer Review Questions for Lake Michigan Project 15. NHEERL LMMB Peer Review Report - Example of peer review 16. NHEERL SOW from MED - Example of Quality Assurance project plan - "Modeling quality assurance plans shall be developed for all modeling projects consistent with the MED-D Quality Assurance Guidelines for Modeling" (pg 2). 17. NHEERL SOW from MED on Lake Superior - Example of Quality Assurance project plan - "Modeling quality assurance plans shall be developed for all modeling projects consistent with the Quality Assurance Guidelines for Modeling" (pg 4). 18. CRG Audit 19. MEDQ QAPP Adherence Report 20. FY 2007 research needs 21. NHEERL Decision Memo on IAG w/ USGS

YES 12%
Section 3 - Program Management Score 88%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The ERP has made measurable progress toward meeting its 2008 target of 35 states using a common monitoring design and appropriate indicators to determine the status and trends of ecological resources. All incremental targets over the last two years have been met, and the program is on track to meet the 2007 target of 30 states by the end of the year (1, 2). As of September 2006 the number of states was 28, and an additional two states are expected to be added by the end of 2007. The ERP has also demonstrated improvements over its 2005 baselines on its two new long-term measures. The percentage of program publications rated as "highly cited" increased to 19.9% from its 2005 baseline of 19.4%; while this does not meet the target, it is an improvement over the baseline. "High impact" publications also increased to 20.2% from a 2005 baseline of 19.3%, just shy of the target of 20.3% (3, 4). An expert independent peer review panel (BOSC) will rate the progress of the program in meeting its long-term research goals within the next two years using well-defined rating criteria (5). The program has set a target rating of "Exceeds Expectations" for progress made in achieving its long-term goals. In preparation for this review, the ERP has developed a progress report and a table showing how the program's research is being used by its intended recipients to achieve positive environmental outcomes (6, 7). One example is the Automated Geospatial Watershed Assessment, which is widely used for a number of different activities: TMDL planning and development, CWA section 404/402 permit oversight, environmental impact assessments, post-fire assessment and rehabilitation planning, real-time flash flood warning, rapid watershed assessment, hydraulic design, research, and education.

Evidence: 1. States using EMAP as of September 2006 2. PART long-term goals and measures (PART-Web "Measures" tab) 3. Bibliometric analysis 2005 4. Bibliometric analysis 2007 5. BOSC Subcommittee Handbook - Methodology for Rating, Appendix B pg 18 6. Draft progress report information provided to the BOSC for the May 2007 review 7. Table of Projects

SMALL EXTENT 7%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The ERP has met each annual target set to measure progress on the number of states using a common monitoring design and appropriate indicators to determine the status and trends of ecological resources. The program is on track to meet the 2007 target of 30 states by the end of the year (1, 2). As of September 2006 the number of states was 28, and an additional two states are expected to be added by the end of 2007. The program has added four new annual measures to provide quantitative indicators of progress made in achieving its annual goals (1). Each of these measures sets a goal of delivering 100% of its key research outputs on an annual basis. In the 2005 baseline year, only one of the four 100% targets was met; however, in 2006 the program increased the number of targets met to three out of four, and it is on track to meet three of the four targets again in 2007 (3). On the previously established annual measure of publication citation rates, the 2007 bibliometric analysis found that 88% of program publications had been cited within the wider scientific literature, an increase of two percentage points over the 2005 baseline. This exceeds the target of 87% identified in the previous PART review. Although these data are now captured in the two new long-term measures, which are better indicators of quality and impact, they clearly show annual progress toward long-term impacts (4, 5).

Evidence: 1. PART measures tab 2. States using EMAP as of September 2006 3. 3-24 Ecosystem's APM List & Status 4. Bibliometric analysis 2005 5. Bibliometric analysis 2007

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The ERP has established a new efficiency measure that tracks cost and schedule variance for its key research outputs (1, 2). This measure will serve as a tool for managers to monitor progress made in achieving key research outputs relative to the amount of funds spent. The program's 2004 and 2005 baselines are -8.1% and -15.6%, respectively. Final data for 2006 will be available at the end of fiscal year 2007, due to the program's two-year appropriation. The ERP leverages additional expertise and benefits for its own work through collaborations with other research and development programs. For example, a collaboration of 10 agencies within the Multi-Resolution Land Characteristics Consortium (MRLC) has increased the cost effectiveness of operations by providing the ERP access to extensive data sets produced by other Federal programs at a fraction of the total cost to produce them. Contributing to this collaboration has effectively saved the ERP, as well as other participating federal programs, millions of dollars (3, 4). Methodology designed by the program for determining ecosystem condition (EMAP) reduces costs, better prioritizes important areas and stressors, and provides a common framework to aggregate data from local up to national levels. Comparisons between conventional methodology and EMAP show that applying EMAP costs significantly less while providing more and better information; the amount of data that must be collected to reach the same conclusions as conventional methodology is also significantly reduced. For example, EMAP characterized the trophic status of northeastern U.S. lakes using only 344 lakes, a savings in both time and cost over a conventional study requiring a census of 2,756 lakes. Increased use of this methodology by other Federal partners and states will continue to increase cost savings (8). As a major program under the Office of Research and Development (ORD), the ERP continuously participates in efforts to achieve operational and administrative efficiencies across ORD. For example, program savings achieved through information technology and administrative efficiencies are redirected toward mission-critical research and other needs to improve program performance. ORD's Information Technology Improvement Project achieved a savings of $2 million in FY 2007 by investing in a more powerful, shared platform for high-performance computing and reducing storage costs (9). ORD's Total Cost of Ownership Initiative created a standard desktop platform, established a centralized Call Center, and consolidated aspects of ORD's core computer infrastructure and maintenance to achieve an annual savings of $2 million starting in FY 2005 (10). These savings were reinvested in computational toxicology and human health risk assessment research, two high-priority areas for ORD. Finally, in 2005 ORD held five streamlined competitive sourcing competitions involving 22 administrative FTE (11-12).
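To illustrate why a probability design needs far fewer sites than a census, the standard sample-size formula for estimating a proportion, with a finite population correction, lands near the scale of the EMAP lakes example above. This is a minimal sketch only: the 2,756-lake population and 344-lake sample come from the text, while the 95% confidence level, the plus-or-minus 5 percentage-point margin, and the worst-case proportion of 0.5 are illustrative assumptions, not EMAP's actual design parameters.

import math

def sample_size_for_proportion(N, margin=0.05, z=1.96, p=0.5):
    # Infinite-population sample size for estimating a proportion p
    # to within +/- margin at the confidence level implied by z.
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction: fewer samples are needed when the
    # population itself (N lakes) is finite and modest in size.
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(sample_size_for_proportion(N=2756))  # ~338 lakes, the same order as the 344 EMAP sampled

The point of the sketch is that a defensible statewide or regional estimate requires a sample whose size grows very slowly with the population, which is the source of the time and cost savings the program cites.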

Evidence: 1. PART measures tab - Efficiency Measures Explanation 2. Methodology for efficiency measure 3. MRLC IAG Memo, "That leveraging enables EPA-ORD to produce lynchpin data for meeting APMs at about 10% of the cost the EPA-ORD would incur if it tried to solely fund national land-cover mapping" (pg 4). 4. MRLC Sect 4.3 Explanation 5. NHEERL Decision Memo on IAG w/ USGS 6. NHEERL IAG NOAA (Agreement on Western Pilot), pg 9 of pdf discusses benefits to ERP through this collaboration 7. NHEERL IAG w/ USGS, pg 5 discusses benefits to ERP through this collaboration 8. About EMAP presentation 9. IT Savings Report 10. TCO Progress Report 11. Competitive Sourcing Summary Sheet 12. Completed Competition Results

SMALL EXTENT 7%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The recent BOSC review of the Ecological Research Program reported the research in this program to be "very good at what it does - high quality scientific research," "The major source conducting ecosystem research in the U.S.," and a "Very fine science organization" (1). A joint peer review of the Environmental Monitoring and Assessment Program (EMAP) by the Ecological Society of America and the American Statistical Association stated, "We know of no other means by which direct evidence of the consequences of environmental protection actions can be gathered… R-EMAP demonstration programs have put EMAP at the forefront of having solid data from both probability and GIS-based design" (2). EMAP is also the only statistically valid approach to determining state and national aquatic ecosystem condition, and it is now being applied by a variety of Federal, State, and local partners for determining ecosystem condition (3). The ERP also conducts a formal bibliometric analysis to compare its scientific publications against other publications within the same field using recognized quantitative indicators of quality and significance (4). In the 2005 analysis, ERP publications were "more highly cited than the average paper," and nearly 20% of the papers were published in the top 10% of journals, indications of greater impact and quality (5).
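The cited analyses define the actual indicators used; as a rough sketch of the kind of comparison described (every input figure below is a hypothetical placeholder, not a value from the 2005 analysis), two common bibliometric indicators can be computed as follows.

def relative_citation_impact(program_citations, program_papers, field_cites_per_paper):
    # Ratio of the program's citations-per-paper to the field average;
    # a value above 1.0 means the portfolio is "more highly cited than
    # the average paper" in the field.
    return (program_citations / program_papers) / field_cites_per_paper

def top_journal_share(papers_in_top_decile_journals, total_papers):
    # Fraction of the portfolio published in the top 10% of journals.
    return papers_in_top_decile_journals / total_papers

# Hypothetical example values, for illustration only:
print(relative_citation_impact(5200, 1000, 4.0))   # 1.3 -> above the field average
print(top_journal_share(199, 1000))                # ~0.2, i.e. roughly 20%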

Evidence: 1. Results of BOSC Review of the Ecological Research Program, 2005 2. Peer Review of EMAP by ESA and ASA, pg 1 3. NHEERL EMAP Uses 4. "A Toolkit for Evaluating Public R&D Investment" (NIST) - "The more other scientists cite a research paper or patent, the greater its assumed relevance, impact, quality, and dissemination, other things being equal.", pg 47-48. 5. ERP Bibliometric analysis 2005

LARGE EXTENT 13%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Since the last PART review in 2005, the ERP has had a number of reviews, including a mid-cycle review by the Board of Scientific Counselors held in May 2007, which found that the program met expectations and had made "significant progress" since the 2005 review (1). ORD holds regular independent external reviews of its research programs to assess quality, relevance, and progress toward achieving the long-term goals identified in the MYPs, and will use feedback from these reviews to improve its program design, measurement, and management. Programs will be assessed every three to four years on a rotating basis (2-3). The recent BOSC programmatic review of the Ecological Research Program reported the research in this program to be "very good at what it does - high quality scientific research," "The major source conducting ecosystem research in the U.S.," and a "Very fine science organization" (4). The EPA Science Advisory Board (SAB) conducts targeted reviews of elements of the Ecological Research Program on an as-needed basis; however, all of the major programs within the Ecological Research Program (i.e., EMAP, ReVA, CADDIS, RePLUS) have received an SAB review or some other external program element review in the last five years. For example, the SAB reviewed ReVA and concluded that "the suite of tools in ReVA can be exceptionally useful to local and regional resource managers for assessments of current and future regional conditions" (5). In addition, each laboratory and center conducts independent reviews. These reviews also show that the individual research divisions that make up the ERP are achieving results (6-13). For instance, in the peer review of the Mid-Continent Ecology Division (MED), the panel found that "Overall, the research program at the MED laboratory appears to be very strong, at the leading edge of science and contributing to both the Agency's programmatic and long-term core needs" (8). Finally, the GAO examined the EMAP program and its contributions toward ecological indicators, noting that "EPA regions will benefit as well from consistent and comparable environmental data as a result of the EMAP approach" (14).

Evidence: 1. Board of Scientific Counselors Mid-Cycle Review Ecological Research Program - Final Report 2. OSP BOSC Website (www.epa.gov/OSP/bosc) 3. OSP BOSC fact sheet 4. BOSC review of Ecological Research Program (2005) 5. NERL ReVa SAB Advisory 6. AED divisional peer review 7. GED divisional peer review 8. MED divisional peer review, .pdf pg 5 (pg 1 hard copy) 9. WED divisional peer review 2001 10. WED divisional peer review 2006 11. NRMRL GWERD 2005 Peer Review(2) 12. NRMRL GWERD 2006 Re-Review on Ecology Restoration Program 13. NRMRL Division Reviews SOP, p. 9 (for scope, independence, etc.) 14. Environmental Information - Status of Federal Data Programs that Support Ecological Indicators, Government Accountability Office

LARGE EXTENT 13%
Section 4 - Program Results/Accountability Score 53%


Last updated: 09062008.2007SPR