ExpectMore.gov


Detailed Information on the
EPA Human Health Research Assessment

Program Code 10004373
Program Title EPA Human Health Research
Department Name Environmental Protection Agency
Agency/Bureau Name Environmental Protection Agency
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2005
Assessment Rating Adequate
Assessment Section Scores
Section Score
Program Purpose & Design 80%
Strategic Planning 70%
Program Management 91%
Program Results/Accountability 40%
Program Funding Level
(in millions)
FY2007 $61
FY2008 $62
FY2009 $57

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2008

Identify appropriate targets for bibliometric analysis measures by benchmarking with other agencies.

Action taken, but not completed In 2008, the program will begin investigating the "h-index" for benchmarking with other agencies, and will use the resulting benchmarking information to negotiate meaningful targets by 2009.
2007

Implement follow-up recommendations resulting from the Human Health Subcommittee Board of Scientific Counselors (BOSC) mid-cycle review. Follow-up actions are those actions committed to in the Human Health Research program's formal response to the BOSC.

Action taken, but not completed The program has completed a draft response to the BOSC mid-cycle review, outlining its plan for addressing recommendations.
2007

Establish formal baselines for the program's BOSC-informed long-term measures at the next comprehensive BOSC review.

Action taken, but not completed At its 2007 mid-cycle BOSC review, the program received a "meets expectations" rating for its progress. At its next comprehensive review, the program will obtain formal baseline data as to its progress on each long-term goal.
2007

Increase the transparency of budget, program, and performance information in budget documents.

Action taken, but not completed ORD separated the Eco and HH budget dollar amounts in the FY 2010 Congressional Justification and better indicated how dollars would result in progress toward long-term goals. ORD plans to work with OMB through the next budget submission process to ensure appropriate levels of transparency.
2008

Reassess meaningfulness of current efficiency measure in light of recent National Academy of Sciences (NAS) report on efficiency measurement.

Action taken, but not completed Milestones: • October 2008: Explore the feasibility of tracking savings resulting from the AEP effort as an ORD-PART efficiency measure. • December 2008: Continue interagency dialogue regarding NAS recommendations. • June 2009: Reach agreement on approach. Update measures in PART Web during the Spring Update.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Improve ability to link budget resources to annual and long-term performance targets by requesting and reporting Human Health research and Ecosystem research funding as separate program-projects.

Completed In order to improve the linkage between budget resources and long-term performance targets, the Agency created sub-program-projects in the FY 2008 budget to allow for better distinction between the ecosystems and human health research programs.
2006

Develop ambitious long-term performance targets that clearly define what outcomes would represent a successful program.

Completed The program collected initial long-term measurement data during its mid-cycle BOSC review in early 2007. Based on this data, the program has set ambitious long-term performance targets. The program will collect formal long-term measurement data during its comprehensive BOSC review scheduled for fall 2008.
2006

Implement follow-up recommendations resulting from external expert review by the Human Health Subcommittee of the Board of Scientific Counselors (BOSC). Follow-up actions are those actions committed to in the Human Health Research program's formal response to the BOSC in September 2005.

Completed In early 2007, the program underwent a BOSC mid-cycle review, in which the BOSC assessed the extent to which the program had taken planned actions in response to recommendations resulting from the previous review. In its mid-cycle report, the BOSC noted that the program's response "included an initial response letter (http://www.epa.gov/osp/bosc/pdf/hh0509resp.pdf) and then specific actions including revising the MYP and changes in the HHRP scope and activities consistent with the BOSC review."

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Utility of ORD's methods and models for risk assessors and risk managers to evaluate the effectiveness of public health outcomes


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating, along with BOSC recommendations, as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/hhmc0707rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/hhmc0801resp.pdf.

Year Target Actual
2008 Exceeds Expectations
2012 Exceeds Expectations
Annual Output

Measure: Percentage of planned outputs delivered in support of the public health outcomes long term goal


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2002 100% 100%
2003 100% 100%
2004 100% 100%
2005 100% 100%
2006 100% 100%
2007 100% 100%
2008 100%
2009 100%
Long-term Outcome

Measure: Utility of ORD's methods, models, and data for risk assessors and risk managers to characterize aggregate and cumulative risk in order to manage risk of humans exposed to multiple environmental stressors


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating, along with BOSC recommendations, as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/hhmc0707rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/hhmc0801resp.pdf.

Year Target Actual
2008 Exceeds Expectations
2012 Exceeds Expectations
Long-term Outcome

Measure: Percentage of peer-reviewed EPA risk assessments in which ORD's characterization of aggregate/cumulative risk is cited as supporting a decision to move away from or to apply default risk assessment assumptions.


Explanation:Percentage is calculated by dividing the number of externally peer-reviewed EPA risk assessments in which ORD's research avoids or confirms the use of default assumptions by the total number of externally peer-reviewed risk assessments produced by EPA during that period. For the purposes of this calculation, ORD's products include both EPA-authored and EPA-funded reports.

Year Target Actual
2005 Baseline 5%
2009 5.5%
2013 6%
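The percentage measure described above is a simple count-based ratio. The sketch below illustrates the calculation under that assumption; the 2005 baseline of 5% comes from the table, but the counts used in the example are hypothetical, since the underlying numbers of assessments are not given in this document.

```python
# Sketch of the measure calculation described above. Assumption: a simple
# count-based ratio of citing assessments to all externally peer-reviewed
# assessments in the period; the example counts are hypothetical.

def default_assumption_citation_rate(citing_assessments, total_assessments):
    """Percent of externally peer-reviewed EPA risk assessments that cite
    ORD's research when applying, or departing from, default assumptions."""
    if total_assessments == 0:
        raise ValueError("no peer-reviewed assessments in the period")
    return 100.0 * citing_assessments / total_assessments

# Hypothetical example: 2 citing assessments out of 40 reviewed -> 5.0%,
# matching the 2005 baseline in the table above.
rate = default_assumption_citation_rate(2, 40)
print(f"{rate:.1f}%")  # 5.0%
```

The same calculation applies to the mechanistic-information and susceptible-subpopulation measures later in this section, which differ only in which body of ORD research is counted in the numerator.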
Annual Output

Measure: Percentage of planned outputs delivered in support of the aggregate and cumulative risk long term goal


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2000 100% 80%
2001 100% 83%
2002 100% 80%
2003 100% 87%
2004 100% 88%
2005 100% 86%
2006 100% 100%
2007 100% 100%
2008 100%
2009 100%
Long-term Outcome

Measure: Utility of ORD's methods, models, and data for risk assessors and risk managers to use mechanistic (mode of action) information to reduce uncertainty in risk assessment


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating, along with BOSC recommendations, as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/hhmc0707rpt.pdf.

Year Target Actual
2008 Exceeds Expectations
2012 Exceeds Expectations
Long-term Outcome

Measure: Percentage of peer-reviewed EPA risk assessments in which ORD's mechanistic information is cited as supporting a decision to move away from or to apply default risk assessment assumptions.


Explanation:Percentage is calculated by dividing the number of externally peer-reviewed EPA risk assessments in which ORD's research avoids or confirms the use of default assumptions by the total number of externally peer-reviewed risk assessments produced by EPA during that period. For the purposes of this calculation, ORD's products include both EPA-authored and EPA-funded reports.

Year Target Actual
2005 Baseline 15%
2009 16.5%
2013 18%
Annual Output

Measure: Percentage of planned outputs delivered in support of mechanistic data long term goal


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2000 100% 100%
2001 100% 100%
2002 100% 100%
2003 100% 100%
2004 100% 100%
2005 100% 93%
2006 100% 92%
2007 100% 100%
2008 100%
2009 100%
Long-term Outcome

Measure: Utility of ORD's methods, models, and data for risk assessors and risk managers to characterize and provide adequate protection for susceptible subpopulations.


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research in this area. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating, along with BOSC recommendations, as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2008 mid-cycle report can be found at: http://www.epa.gov/OSP/bosc/pdf/hhmc0707rpt.pdf. The program's formal action plan can be found at http://www.epa.gov/OSP/bosc/pdf/hhmc0801resp.pdf.

Year Target Actual
2008 Exceeds Expectations
2012 Exceeds Expectations
Long-term Outcome

Measure: Percentage of peer-reviewed EPA risk assessments in which ORD's methods, models or data for assessing risk to susceptible subpopulations is cited as supporting a decision to move away from or to apply default risk assessment assumptions.


Explanation:Percentage is calculated by dividing the number of externally peer-reviewed EPA risk assessments in which ORD's research avoids or confirms the use of default assumptions by the total number of externally peer-reviewed risk assessments produced by EPA during that period. For the purposes of this calculation, ORD's products include both EPA-authored and EPA-funded reports.

Year Target Actual
2005 Baseline 3%
2009 3.5%
2013 4%
Annual Output

Measure: Percentage of planned outputs delivered in support of the susceptible subpopulations long term goal


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet EPA and other partners' needs. To ensure the ambitiousness of its annual output measures, ORD has better formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps to ensure the ambitiousness of outputs on the basis of partner utility. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2000 100% 100%
2001 100% 100%
2002 100% 100%
2003 100% 93%
2004 100% 98%
2005 100% 100%
2006 100% 100%
2007 100% 100%
2008 100%
2009 100%
Annual Outcome

Measure: Percentage of Human Health program publications rated as highly cited papers (top 10% in field) in research journals


Explanation:This metric provides a systematic way of quantifying research performance and impact by counting the number of times an article is cited within other publications. The "highly cited" data are based on the percentage of all program publications that are cited in the top 10% of their field, as determined by Thomson's Essential Science Indicators (ESI). Each analysis evaluates the publications from the last ten-year period, and is timed to match the cycle for independent expert program reviews by the Board of Scientific Counselors (BOSC). This "highly cited" metric provides information on the quality of the program's research, as well as the degree to which that research is impacting the science community. As such, it is an instructive tool both for the program and for independent panels, such as the BOSC, in their program reviews. To best establish ambitious and appropriate targets in the future, ORD will collect benchmarking information by conducting an analysis of bibliometric measures used in R&D programs outside of EPA.

Year Target Actual
2005 Baseline 24%
2006 N/A 25%
2008 25.5%
2010 26.5%
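The "highly cited" metric above reduces to a share calculation: the fraction of program publications whose citation counts reach the top-10% threshold for their field. The sketch below illustrates that logic under stated assumptions; in practice the per-field thresholds come from Thomson's Essential Science Indicators, and all names and numbers here are hypothetical.

```python
# Sketch of the "highly cited" bibliometric described above. Assumptions:
# we know each publication's field and citation count, plus the citation
# threshold marking the top 10% of each field (in practice taken from
# Thomson's Essential Science Indicators). All figures are hypothetical.

def highly_cited_share(publications, top10_thresholds):
    """Percent of publications whose citation counts meet or exceed the
    top-10% threshold for their field."""
    if not publications:
        return 0.0
    hits = sum(
        1 for field, citations in publications
        if citations >= top10_thresholds[field]
    )
    return 100.0 * hits / len(publications)

# Hypothetical data: (field, citation count) pairs.
pubs = [("toxicology", 120), ("toxicology", 30),
        ("epidemiology", 95), ("epidemiology", 10)]
thresholds = {"toxicology": 100, "epidemiology": 80}
print(f"{highly_cited_share(pubs, thresholds):.0f}%")  # 50%
```

Note that the real analysis covers a rolling ten-year publication window and is timed to the BOSC review cycle, so a production version would also filter publications by date.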
Annual Efficiency

Measure: Average time (in days) to process research grant proposals from RFA closure to submittal to EPA's Grants Administration Division, while maintaining a credible and efficient competitive merit review system (as evaluated by external expert review)


Explanation:Improvements in processing time will enable grantees to receive funding more quickly; allow important, relevant research to begin sooner; and result in more expeditious delivery of research products and data.

Year Target Actual
2003 Baseline 405
2004 N/A 350
2005 N/A 340
2006 323 277
2007 307 254
2008 292
2009 277

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The overall purpose of EPA's Human Health (HH) Research Program is to address limitations in human health risk assessment with a focus on biological modes of toxicity, aggregate and cumulative risk, susceptible subpopulations, and evaluations of public health outcomes resulting from risk management decisions. The program provides broad fundamental scientific information that will improve understanding of problem-driven human health issues arising from risk assessment in EPA's Program and Regional Offices, other Federal agencies, international health organizations, the regulated community, and the academic community.

Evidence: • 2003-2008 EPA Strategic Plan (Introduction, p. 2, 102) www.epa.gov/ocfo/plan/plan.htm • Human Health Research Strategy (p. 1-1) www.epa.gov/ord/htm/researchstrategies.htm • Human Health Multi-Year Plan (p. 2-3) www.epa.gov/osp/myp.htm

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The HH Research Program is designed to address specific needs identified by the Presidential/Congressional Commission on Risk Assessment and Risk Management, which was mandated as part of the Clean Air Act Amendments of 1990. Specifically, the Commission noted that: 1) there is a need for a common metric for risk assessment of both carcinogens and non-carcinogens; 2) mechanistic knowledge will improve interpretation of margins of exposure and protection; 3) knowledge of mode or mechanism of action is needed to evaluate interactions of chemicals in mixtures; 4) there is a need to account for multiple and cumulative exposures in risk assessment; and 5) risk assessments should identify subpopulations especially susceptible to specific chemical exposures. The program has aligned its long-term research goals to address these needs.

Evidence: • Risk Assessment and Risk Management in Regulatory Decision-Making. The Presidential/Congressional Commission on Risk Assessment and Risk Management. Final Report Volume 2, 1997 (pp. ii, 44-46, 68-69, 71-72). www.riskworld.com/nreports/1996/risk_rpt/ • Human Health Research Strategy (p. 3-1) www.epa.gov/ord/htm/researchstrategies.htm • Summary Report: Environmental Public Outcomes Workshop Proceedings. August 2002. Office of Research and Development, EPA/625/R-03/001 (p. 1-1).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The program identifies several agencies with similar human health goals, but there is no evidence that EPA has performed the necessary due diligence to justify and coordinate any duplication that exists with current efforts. A recent program review by EPA's Board of Scientific Counselors noted evidence of collaboration but also recommended that the program track and collaborate with specific research programs actively engaged in relevant research.

Evidence: • Report from Human Health Subcommittee of the ORD's Board of Scientific Counselors on Peer-Review, February 28-March 2, 2005 • Other agencies with related goals include USDA, NCTR, Agency for Toxic Substances and Disease Registry (ATSDR), CDC, and NIH.

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence of major flaws that would limit the program's effectiveness or efficiency. The program has an unambiguous, focused design, as evidenced by the Human Health Research Strategy. The Human Health research program uses a combination of in-house and competitive grants to carry out research. This leverages the knowledge of both the agency and other institutions. Findings from a recent independent expert panel review by EPA's Board of Scientific Counselors indicate no major design flaws.

Evidence: • Human Health Research Strategy www.epa.gov/ord/htm/researchstrategies.htm • Report from Human Health Subcommittee of the ORD's Board of Scientific Counselors on Peer-Review, February 28-March 2, 2005

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: There is evidence that the program is designed so that its outputs (methods, models, and data) directly address the program purpose and are relevant to customer needs. The Board of Scientific Counselors (BOSC) review found evidence that clients regularly participate in research planning and design, and that the program's research outputs had been used by clients and stakeholders to address the program's purpose. The program's extramural research grants are also aligned with the program purpose and long-term goals. ORD scientists and their clients among EPA Program and Regional Office staff work together to generate Requests for Applications (RFAs) for the human health program to ensure appropriate scientific focus and programmatic relevance. The BOSC review noted that there was effective and strategic coordination between the intramural and extramural components of the program.

Evidence: • Program Design Model. • Report from Human Health Subcommittee of the ORD's Board of Scientific Counselors on Peer-Review, February 28-March 2, 2005 (p. 5, 43-45). • National Center for Environmental Research 1999-2000 RFAs for STAR Grants and Cooperative Agreements • Documentation package for RFA demonstrating Program Office involvement and review.

YES 20%
Section 1 - Program Purpose & Design Score 80%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has two sets of long-term performance measures that are meaningful because they focus on the four major areas of research presented in the program's multi-year plan: 1) the use of mechanistic data in risk assessment; 2) improved characterization of aggregate and cumulative risk; 3) characterization of susceptible subpopulations; and 4) evaluation of public health outcomes. Each research area has two measures. The first set of measures relies on expert panel reviews conducted every 4 years to assess use of program outputs in risk assessments, and the second set of measures specifically focuses on the use of program outputs to move away from the default risk assumptions that are frequently used in risk assessments. The program's purpose is to provide scientific information to improve risk assessment (ultimately improving protection of human health), so both sets of measures are outcome-oriented. EPA will also use a bibliometric analysis to determine citation counts and cites per paper scores, which are indicators of overall influence and impact of EPA's Human Health research.

Evidence: • 2005 PART measures (Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report) • Human Health Research Strategy (p. 1-1) www.epa.gov/ord/htm/researchstrategies.htm • Human Health Multi-Year Plan (p. 2-3) www.epa.gov/osp/myp.htm

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The targets for future reviews by the Board of Scientific Counselors (BOSC) fail to define what outcomes would represent a successful program. Therefore it is impossible to determine if the targets provided are ambitious, and it is unclear that these targets will promote continued improvement. It is also unclear that the BOSC will hold the program accountable for deviations from multi-year research schedules. Though quantitative, verifiable baselines are in place for the set of default assumption measures, it is also unclear whether the targets (10 percent increase every 4 years) are ambitious.

Evidence: • 2005 PART measures (Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report) • Report from Human Health Subcommittee of the ORD's Board of Scientific Counselors on Peer-Review, February 28-March 2, 2005 (p. 9-42, 47-49)

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Human Health research program conducts basic research, and the process through which research products ultimately inform risk assessments makes annual assessment of program outcomes impractical. Therefore the program assesses its annual progress toward its long term measures by measuring and reporting on its success in completing planned research outputs. These annual measures tie directly to the program's longer term research milestones because the output objectives are determined through the multi-year planning process and are published in the program's Multi-year Plan. Progress against these measures is monitored quarterly and reported annually. EPA also is in the process of developing and implementing an annual survey to assess client satisfaction over time and the impact of ORD methods, models and data.

Evidence: • 2005 PART measures (Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report) • Draft Client Survey measure implementation plan

YES 10%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The baselines and targets for the annual milestone measures are created through the multi-year planning process. A limited number of outputs (20-25) per year is negotiated with regional and program office clients. The Multi-Year Plan (MYP) is updated every 3-4 years, with new outputs put into place for out-years (3-4 years in advance). The measure is calculated in a given year as the percent completed relative to the number proposed in the MYP (previous commitments are not dropped from the MYP, even in cases of delays or non-completion). Targets of 100% are ambitious because output negotiations assume full utilization of available resources, conditions may change with time (e.g., RFAs for grants may be delayed for a year, key personnel may leave for other positions, or resource limitations may affect specific programs), and significant coordination across all organizational elements is required to meet all outputs on schedule. A baseline for the annual survey measure should be available in FY 2006.

Evidence: • Human Health Multi-Year Plan www.epa.gov/osp/myp.htm • 2005 PART measures (Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report)

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: There is evidence that partners (e.g., STAR grantees, contractors) commit to goals that address limitations in human health risk assessment with a focus on use of mechanistic data in risk assessment, aggregate and cumulative risk, susceptible subpopulations, and evaluation of public health outcomes. Requests for Applications (RFAs) for grants are generated with input from Program/Regional Office stakeholders and are explicitly associated with the program's long-term goals. Proposals are reviewed for relevancy to program goals during internal review and while undergoing outside expert peer review. Projected outputs from grants and contracts are evaluated by the Research Coordinating Teams (RCTs) annually. Partners report on their performance toward annual and long-term goals through ORD's Integrated Resource Management System. The BOSC review found that the program successfully utilizes the grants program to advance its research agenda, suggesting that grantees work toward the long-term goals.

Evidence: • http://es.epa.gov/ncer/grants/ (see Previous Solicitations (Archive) for 1999 to 2004) and http://es.epa.gov/ncer/rfa/ for Current Funding Opportunities (A table provided by the Agency shows that RFAs for NCER STAR Grants and Cooperative Agreements are aligned by long-term goal). • Summary of Peer Review Process for ORD Solicitation No. NERL/HEASD/001 (Demonstrates that relevancy to LTGs is considered in peer review process) • Summary Report on the In-House Review of Proposals Submitted in Response to NCER Request for Applications 2003-STAR-E1 (LTG2). • NAS Review of STAR Grants Program: "STAR's procedures for incorporating missions relevance into its...selection of proposals to fund exceed those practiced by most other agencies." • Integrated Resource Management System reports for Human Health program.

YES 10%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: EPA's HH Research Program, consisting of the intramural, extramural STAR, and collaborative research programs, was recently externally reviewed by a subcommittee of ORD's Board of Scientific Counselors (BOSC), and will continue to be assessed every 4-5 years. Each BOSC subcommittee is a distinguished body of scientists and engineers who may be drawn from academia, industry, non-EPA government or state agencies, and environmental communities. The BOSC Designated Federal Officers (DFOs) use FACA procedures to search for subcommittee candidates who are recognized leaders in their fields and are free of both real and perceived conflicts of interest. The charge for the review--including a clear objective, background information, and detailed questions aligned with the R&D criteria--ensures that the review is rigorous. The BOSC HH Subcommittee was asked to evaluate the program using 24 questions related to the R&D Investment Criteria with respect to Relevance, Quality, Performance, and Scientific Leadership.

Evidence: • Report from the Human Health Subcommittee of ORD's Board of Scientific Counselors on Peer Review, February 28-March 2, 2005. • Charge to the Board of Scientific Counselors (p. 2 shows reviews are to occur every 4-5 years).

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The program has a process of prioritization within the long-term performance goals, and there is some evidence that the planning process is tied to performance, but budget materials do not clearly link the program's budget to annual and long-term performance targets. To earn a yes, the program must improve its ability to link budget resources to annual and long-term performance targets.

Evidence: • ORD FY2005-2006 Final Report of the Human Health Research Planning Workgroup. • ORD FY2005-2006 Contingency Plan. • ORD FY04 Final Report of the Human Health Working Group (pp. 4, 5, 7). • ORD FY07 Summary Report of the Human Health Working Group.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Prior to beginning preparations for the PART review, the program had not established performance measures with associated targets and baselines. The program has since developed and adopted long-term and annual performance measures that demonstrate progress toward achieving long-term goals. During the BOSC review, several strategic planning deficiencies were identified, and program staff have developed an action plan to address each recommendation and then discussed this plan with the BOSC Executive Council. This document includes a timetable for changes, and future BOSC reviews will assess progress against this timetable.

Evidence: • 2005 PART measures (the Agency has committed to including this measure in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report). • Human Health Multi-Year Plan www.epa.gov/osp/myp.htm • Report from the Human Health Subcommittee of ORD's Board of Scientific Counselors on Peer Review, February 28-March 2, 2005 (p. 3). • Draft Response to the Human Health Subcommittee of the Board of Scientific Counselors review.

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation:  

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: The program follows an internal prioritization process to guide budget requests and funding decisions. External advisory bodies such as the EPA Science Advisory Board and ORD's Board of Scientific Counselors are consulted and their recommendations are considered in priority-setting. Prioritization is the basis for the program's Multi-Year Plan (MYP), which guides programmatic research and resource allocation. The program is able to identify a set of current priorities that is a subset of activities covered in the MYP. Though the program satisfies the criteria for this question, it should make an effort to communicate and justify this subset of priorities more clearly in budget justifications.

Evidence: • Fiscal Year 2006 Justification of Appropriation Estimates for the Committee on Appropriations. • Strategic Plan of the Office of Research and Development (pp. viii, 29-32). • 1997 Update to ORD's Strategic Plan (p. 29). • Human Health Multi-Year Plan www.epa.gov/osp/myp.htm • ORD FY2005-2006 Final Report of the Human Health Research Planning Workgroup. • ORD FY2005-2006 Contingency Plan. • ORD FY07 Planning Summary from the Human Health Working Group.

YES 10%
Section 2 - Strategic Planning Score 70%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The HH program collects information on output milestones quarterly. The milestones are derived from the multi-year research planning process and are linked to the long-term goals. ORD's Deputy AA for Management is provided with this information each quarter and uses it to inform the annual planning process as well as to update the MYP. For example, in the FY07 contingency planning, work on population-based assessment of dioxin was recommended for disinvestment because the work was not being done in a timely fashion and program and regional offices did not feel it was a high priority. STAR grantees are required to report annual progress and final results, including significant accomplishments, which are posted on a public website. This information is used to provide the status of performance measures for STAR grants.

Evidence: • ORD Integrated Resources Management System (provides mechanism for quarterly data collection). • National Center for Environmental Research website http://es.epa.gov/ncer/grants (reports from STAR grantees). • ORD FY04 Final Report of the Human Health Working Group. • ORD FY07 Summary Report of the Human Health Working Group (Appendix B, page 2).

YES 9%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: The HH Research Program incorporates program performance into personnel performance evaluation criteria. Laboratory/Center Directors are identified as the managers who are responsible for program results. These senior managers are held accountable for specific performance standards related to program goals through mid-year and end-of-year performance reviews conducted by ORD's Deputy Assistant Administrator for Management. The research program also monitors progress against performance targets. Contractors and grantees are explicitly held accountable for deliverables, costs, and schedules in evaluation criteria and in the statements of work. ORD project officers are responsible for seeing that agreements are awarded and managed according to government regulations.

Evidence: • Sample personnel performance evaluations, which incorporate program performance into personal performance criteria. • Sample Contract With Evaluation Criteria and Prior Performance Information. • Technical Evaluation Criteria for contract assessment, which require contract awards to consider past performance.

YES 9%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Prior to the beginning of the fiscal year, the program develops an operating plan to allocate resources down to the program and object class level. Operating plans are then adjusted to reflect actual appropriated levels. EPA's budget and annual Operating Plan are aligned with the Agency's Strategic Plan and approved by OMB and Congressional Appropriations Committees. During execution, obligations and expenditures are tracked in the Agency's Integrated Financial Management System (IFMS) against the Operating Plan. Fund transfers between program objectives in excess of Congressionally established limits require Congressional notification and/or approval. As of March 31, 2005, the Human Health Research Program had obligated 82% of its total two-year FY2004/2005 resources. Partner funds are tracked according to EPA's Policy on Compliance, Review, and Monitoring, which requires submission of annual progress reports and compliance with federal requirements. Grant project officers receive an annual budget report from grantees and have access to the EPA funding system, which permits them to review the complete expense records of any EPA grant at any time. Through Research Progress Evaluation Reports, ORD reviews grantees' expenditures and other financial issues.

Evidence: • EPA's Annual Plan for FY2005 (pp. IV-7, IV-12): www.epa.gov/ocfo/budget/2005/2005ap/goal4.pdf • EPA's Annual Reports and Financial Statements: www.epa.gov/ocfo/finstatement/finstatement.htm • EPA's Policy on Compliance, Review, and Monitoring [EPA 5700.6; advanced post-award monitoring (i.e., on- and off-site grantee review) reports, documentation of post-award monitoring in assistance agreement files, grantee financial status reports]: www.epa.gov/records/policy/schedule/sched/183.htm • Sample Research Progress Evaluation Reports, EPA ORD National Center for Environmental Research.

YES 9%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The program has an efficiency measure (with baselines and targets) in place to track the average time it takes to process research grant proposals from RFA closure to submittal to EPA's Grants Administration Division. Improvements in processing time will enable grantees to receive funding more quickly; allow important, relevant research to begin sooner; and result in more expeditious delivery of research products and data. The HH program also uses the annual planning process, managed through the HH Research Coordination Team (RCT), to identify areas for improved efficiency or cost effectiveness. Determining the extent to which parts of the program could be combined or eliminated, thereby increasing cost-effectiveness, is a criterion used in the RCT process. ORD has also implemented the Total Cost of Ownership (TCO) initiative to consolidate computer infrastructure and maintenance.

Evidence: • 2005 PART measures (the Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report). • The Total Cost of Ownership (TCO) initiative builds upon guidance from the Federal Chief Information Officer's (CIO) Council and best industry practices to develop more effective, efficient ways to provide quality desktop and server management. The three components of this initiative are Desktop Replacement, Network Operations Center, and Consolidated Call Center. Expected benefits include reduced IT expenditures, better cost control, enhanced resource sharing, and other service improvements.

YES 9%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: There is evidence that the HH program collaborates meaningfully with related research programs. Through its grants program, EPA has published two Requests for Applications (RFAs) with the National Institute of Environmental Health Sciences (NIEHS) to co-fund university centers that will conduct multidisciplinary basic and applied research, in combination with community-based research projects, to support studies on the causes and mechanisms of children's developmental disorders. EPA is also a member of a consortium involving the Centers for Disease Control and Prevention and the National Institute of Environmental Health Sciences in the planning and implementation of the National Children's Study. Federal experts, including EPA personnel, have been enlisted into Working Groups representing over 35 universities and 27 organizations, including industry. EPA has also collaborated with the Food and Drug Administration and the Centers for Disease Control and Prevention in the National Human Exposure Assessment Survey Program to address the limitations of single chemical exposure route studies. The BOSC commented that presentations by HH research program scientists showed clear evidence that collaboration is occurring between Agency scientists and scientists from other governmental agencies (e.g., the National Institute of Environmental Health Sciences).

Evidence: • National Center for Environmental Research RFA announcements on Centers for Children's Environmental Health and Disease Prevention Research: es.epa.gov/ncer/rfa/archive/grants/01/kidscenter01.html and es.epa.gov/ncer/rfa/current/2003_child_health.html • National Children's Study; House Appropriations Committee Response on the National Children's Study. • National Children's Study webpage nationalchildrensstudy.gov • Human Exposure Measurements: National Human Exposure Assessment Survey www.epa.gov/heasd/edrb/nhexas.htm • Report from the Human Health Subcommittee of ORD's Board of Scientific Counselors on Peer Review, February 28-March 2, 2005.

YES 9%
3.6

Does the program use strong financial management practices?

Explanation: The program follows EPA's financial management guidelines for committing, obligating, reprogramming, and reconciling appropriated funds. Agency officials have a system of controls and accountability (EPA's Resources Management Directives System), based on GAO, Treasury, and OMB guidance as well as generally accepted accounting principles (GAAP), to ensure that improper payments are not made. At each step in the process, the propriety of the payment is reviewed. EPA trains individuals to ensure that they understand their roles and responsibilities for invoice review and for carrying out the financial aspects of program objectives. EPA received an unqualified audit opinion on its FY04 financial statements and had no material weaknesses associated with the audit. EPA is taking steps to meet the new accelerated due dates for financial statements. The Human Health Research Program has no material weaknesses as reported by the Office of the Inspector General (OIG) and has procedures in place to minimize erroneous payments.

Evidence: • EPA's Annual Plan for FY2005 (pp. IV-7, IV-12): www.epa.gov/ocfo/budget/2005/2005ap/goal4.pdf • EPA Records Schedule 299-Budget Automation System (BAS) www.epa.gov/records/policy/schedule/sched/299.htm • EPA's Annual Reports and Financial Statements (unqualified audit opinions on EPA FY04 and FY03 financial statements): www.epa.gov/ocfo/finstatement/finstatement.htm

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The HH program identifies and corrects deficiencies in at least three ways. First, the BOSC reviews management at the program level, and a number of the panel's recommendations are being used in FY07 planning. Second, on-going research in the Labs/Centers is evaluated by an external peer review process. For example, the 5 Health Divisions in the National Health and Environmental Effects Research Laboratory are reviewed every 3-4 years, with a less formal evaluation conducted mid-way through the cycle to track progress. External peer reviewers submit a detailed report to Laboratory management. Findings and recommendations are used by senior management for research planning, resource allocation, and strategic direction. Divisions respond to the report, indicating where changes have been made in the program to accommodate reviewers' recommendations. Third, the HH Research Coordination Team (RCT) meets routinely to evaluate the status of on-going research. Deficiencies noted by the HH RCT are transmitted for remediation to the Assistant Laboratory Directors of ORD Labs/Centers serving on the HH RCT.

Evidence: • Report from the Human Health Subcommittee of ORD's Board of Scientific Counselors on Peer Review, February 28-March 2, 2005 (pp. 9-11). Program staff have developed an action plan to address BOSC recommendations and have discussed this plan with the BOSC Executive Council. This document includes a timetable for implementation, and future BOSC reviews will assess progress against this timetable. • Peer review of the Reproductive Toxicology Division, National Health and Environmental Effects Research Laboratory, October 2001. • The following is an example of a change made as a result of this type of review: the peer review of the Reproductive Toxicology Division indicated that the Division should expand efforts to promote extrapolation between animal and human studies. In response, the Division outlined steps taken to develop research projects collaboratively with the Human Studies Division. Review of the Reproductive Toxicology Division, NHEERL, October 22-24, 2001 (Executive Summary, p. 2); Response to the October 2001 Peer Review of the Reproductive Toxicology Division, NHEERL, November 2, 2002 (Executive Summary, p. 4).

YES 9%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: All of the Human Health research grants are awarded through ORD's competitive STAR extramural grants program, using external scientific peer reviewers to rate applications based on scientific merit. EPA's process for soliciting and selecting proposals is clearly articulated on its website. To attract new investigators, research solicitations are announced in the Federal Register, posted on the NCER website for at least 90 days, emailed to institutions and individuals that have indicated an interest in receiving them, distributed at scientific conferences, and disseminated to researchers by other federal agencies. Renewals are rarely granted and are governed by a revised competition order; 84% of grantees received STAR grants for the first time. EPA uses a broadly competitive, merit-based process for allocating resources to R&D programs other than competitive grants. EPA uses contracts to support large programs such as the National Children's Study, Human Environmental Epidemiologic Studies, and Exposure to Dose Model Development. The process is competitive and involves development of an RFA, review of applications by an external review panel or board, and awarding of funds based on merit.

Evidence: • The program's award process for STAR grants is as follows: external scientific peer reviewers rate applications based on scientific merit. Only applications rated as excellent or very good (usually 10-20% of proposals) are then considered for funding based on relevance to EPA programmatic priorities. Further information on the STAR program is available on the National Center for Environmental Research website http://es.epa.gov/ncer/ • Sample procurement milestones for competitive full & open award process. • National Academy of Sciences review, "The Measure of STAR: Review of the U.S. Environmental Protection Agency's Science to Achieve Results (STAR) Research Grants Program," pp. 30, 34, 152.

YES 9%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The research program designates grant project officers to monitor grantee performance, including submission of annual progress reports and compliance with federal requirements. Prior to award, project officers closely review the budgets and often request additional explanations or modifications. Grantees provide a list of publications, presentations, and other activities annually and at the end of their grant period. Project officers also conduct formal site visits to monitor the progress of active grants. EPA offices are required to conduct evaluative reviews of at least 10% of active recipient institutions each year. The STAR program sponsors workshops that bring together the HH grantees with ORD scientists and representatives from across the Agency to inform one another of scientific results and upcoming policy considerations. In addition to the scientific progress reports, project officers receive an annual budget report from grantees and have access to the EPA funding system, which permits them to review the complete expense records of any EPA grant at any time. ORD's oversight practices for cooperative agreements are similar to those for grants in that project officers monitor performance through progress reports and are provided lists of publications, presentations, and other activities annually and at the end of the agreement. Because EPA researchers work much more collaboratively with cooperative agreement partners than with grantees, there is significantly greater involvement in, and oversight of, their activities. Contractors interface with the EPA Project Officer and Work Assignment Managers in the assignment and management of work as defined in the contract's Statement of Work.

Evidence: • EPA's Policy on Compliance, Review, and Monitoring [EPA 5700.6; advanced post-award monitoring (i.e., on- and off-site grantee review) reports, documentation of post-award monitoring in assistance agreement files, grantee financial status reports]: www.epa.gov/records/policy/schedule/sched/183.htm • NCER website for annual reports, publications, activities, and Workshop Proceedings (www.epa.gov/ncer) • Project Officer Grant Review Template.

YES 9%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: An annual progress report is submitted by each grantee and posted on the EPA NCER website. Reports are distributed to EPA staff to disseminate to interested parties. These reports include summaries of progress in relation to project objectives as well as publications of research results. Grantees also present results at a multitude of national and international scientific conferences held annually. Progress reports and publication information are posted on the NCER website. Project officers monitor cooperative agreement performance through annual progress reports. Results of cooperative agreements are made available through publication in scientific journals. Contractors provide reports to the Agency as required by the Statement of Work.

Evidence: • NCER website for a link to grant reports, information about grantee reporting requirements, and STAR grant Progress-Review Workshops: http://es.epa.gov/ncer/guidance/tscs99.html • Sample statements of work for contracts, showing reporting requirements for partners and contractors.

YES 9%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: Approximately 40 percent of the program's non-support funding is allocated to EPA research laboratories and centers. Though this allocation occurs through ORD's planning process, it is not competitive, and the program has not provided a sufficient explanation of the unique capabilities of these labs and centers to justify the allocation of funds through this process.

Evidence: Competitive allocation of extramural funds for R&D programs other than competitive grants was discussed in response to question 3.CO1. In allocating in-house funds, the HH research program follows ORD's planning process. Annual planning for the fiscal year two years in advance begins in early spring. Planning summaries and other information for each program are provided by the National Program Directors (NPDs) to ORD's Executive Council (EC) to inform the EC's discussions of planning and budgeting for the year. These materials include: 1) a contingency pool representing a significant portion of ORD's research program, 2) planning summaries describing the base research program, programmatic redirections and realignments and explanations for such changes, and 3) a statement for each area in the contingency pool explaining performance impacts if the reduction was implemented. These materials continue to be used throughout the budget and operating plan development processes to ensure that funding is distributed to the highest quality projects.

NO 0%
Section 3 - Program Management Score 91%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The BOSC evaluated progress on Human Health (HH) research in the four strategic research areas reflected in the program's long-term goals: human health risk assessment with a focus on use of mechanistic information in risk assessment, aggregate and cumulative risk, susceptible subpopulations, and evaluation of public health outcomes. Though the BOSC did not assign a rating of excellent, moderately effective, or ineffective, the panel concluded that the HH Program has demonstrated progress toward its long-term performance goals and that research results are being used to reduce uncertainty in risk assessment. EPA is also using a set of quantitative long-term measures to show increased use of ORD HH research in risk assessments, but these are new measures and long-term performance data are not yet available. EPA is also using a bibliometric analysis to measure the impact of ORD HH research. The bibliometric analysis is based on citation counts and cites-per-paper scores of all papers published in the field and is an indicator of the overall influence and impact of the output. An analysis conducted in FY2005 found that nearly a quarter (24.3%) of the human health publications are highly cited papers. The next analysis will be conducted in FY2009.

Evidence: • Board of Scientific Counselors (BOSC) Human Health Subcommittee Report (pp. 3-4, 21-24). • 2005 PART measures (the Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report). • Bibliometric Analysis for Papers on Topics Related to Human Health (April 18, 2005).

SMALL EXTENT 7%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Though documentation of performance goals from before 2003 is limited (the program did not have a multi-year plan prior to 2003), the measure tracking annual outputs indicates that all output milestones were met for use of mechanistic data in risk assessment and evaluation of public health outcomes, and that a majority of the planned 2000-2004 milestones for the other two research areas (aggregate/cumulative risk and susceptible subpopulations) were met. The Multi-Year Plan includes outputs from the program's partners (i.e., grantees), indicating that these partners generally meet their annual performance goals as well. In the future, EPA will also evaluate annual progress using feedback obtained through a quantitative survey of Program/Regional Office stakeholders.

Evidence: • 2005 PART measures (the Agency has committed to including these measures in forthcoming GPRA documents, including the FY 2006 Performance and Accountability Report).

SMALL EXTENT 7%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: Performance data for the program's efficiency measure show improvement over time in grants processing efficiency. These improvements enable grantees to receive funding more quickly, allow research to begin sooner, and result in more expeditious delivery of research products and data. There is also evidence that other changes made through the annual planning process have likely improved efficiency. For example, in FY 2004, the HH RCT recommended that pharmacokinetic work in two labs be collapsed into a single project, resulting in more efficient use of resources and personnel. ORD has also implemented the Total Cost of Ownership Initiative, but quantitative results of efficiency improvements are not available.

Evidence: • ORD FY04 Final Report of the Human Health Working Group (pp. 4-5). • Total Cost of Ownership Initiative.

SMALL EXTENT 7%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: A recent independent bibliometric analysis of peer-reviewed papers from the HH program indicated that in 13 of the 17 disciplines in which human health research papers are published, papers from the HH program are more highly cited than the average papers in those fields.

Evidence: Bibliometric Analysis for Papers on Topics Related to Human Health

SMALL EXTENT 7%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The BOSC review evaluated the relevance, quality, performance (the R&D Investment Criteria), and scientific leadership of the HH research program. The BOSC found that the program was of high quality, appropriately focused, and multi-disciplinary, with good participation by stakeholders in planning and implementation. The BOSC also determined that the program informed risk assessments by reducing uncertainty, that its scientists were highly recognized by the larger scientific community, and that the research was being effectively transferred to the Program Offices and Regions and used to reduce uncertainty in human health risk assessment.

Evidence: Report from the Human Health Subcommittee of ORD's Board of Scientific Counselors Peer Review, February 28-March 2, 2005 (front page; executive summary; pp. 1, 3-4, 9-11).

LARGE EXTENT 13%
Section 4 - Program Results/Accountability Score 40%


Last updated: 09062008.2005SPR