ExpectMore.gov


Detailed Information on the
Pollution Prevention and New Technologies Research Assessment

Program Code 10001138
Program Title Pollution Prevention and New Technologies Research
Department Name Environmental Protection Agency
Agency/Bureau Name Environmental Protection Agency
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2003
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 10%
Program Management 70%
Program Results/Accountability 7%
Program Funding Level
(in millions)
FY2007 $24
FY2008 $22
FY2009 $20

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2008

Reassess meaningfulness of current efficiency measure in light of recent National Academy of Sciences (NAS) report on efficiency measurement.

Action taken, but not completed ORD sponsored a National Academy of Sciences (NAS) study on the measurement of research program efficiency, and has been a leader in promoting sound efficiency measurement approaches across the government. ORD will continue working with OMB to develop an approach that meets both PART guidance and NAS standards for efficiency measurement.
2007

Implement follow-up recommendations resulting from the Technology for Sustainability Subcommittee Board of Scientific Counselors (BOSC) review. Follow-up actions are those actions committed to in the Pollution Prevention and New Technologies Research program's formal response to the BOSC.

Action taken, but not completed The program will develop a formal response to the BOSC including planned actions to address recommendations.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2004

Shift funding from this research program to another Environmental Protection Agency pollution prevention program that has shown results (see New Chemicals PART).

Completed The President's Budget for FY 2005 proposed a $5M transfer from ORD's Pollution Prevention program to OPPTS' Pollution Prevention program (New Chemicals). Funding for the Pollution Prevention and New Technologies Research program was reduced in appropriations, but the Congress did not transfer these funds to OPPTS.
2004

Improve the program's strategic planning. These improvements should include a plan for independent evaluation of the program, responses to previous evaluations, and should clearly explain why the program should pursue projects instead of other capable parties.

Completed EPA's Science Advisory Board (SAB) reviewed the Sustainability Research Strategy and the Science and Technology for Sustainability Multi-Year Plan on June 13-15, 2006. Background information and supporting documents are available at SAB's website: "Environmental Engineering Committee Advisory on EPA's Sustainability Research Strategy and Multi-Year Plan" http://www.epa.gov/sab/panels/eec_adv_srs_myp.htm.
2005

Develop and publish a revised multi-year research plan with an improved strategic focus and clear goals and priorities. This plan must include explicit statements of: specific issues motivating the program; broad goals and more specific tasks meant to address the issue; priorities among goals and activities; human and capital resources anticipated; and intended program outcomes against which success may later be assessed.

Completed ORD has completed a final draft of the ORD Sustainability Research Strategy (http://www.epa.gov/sustainability/pdfs/EPA-12057_SRS_R4-1.pdf) as well as the Science and Technology for Sustainability MYP.
2006

Institute a plan for regular, external reviews of the quality of the program's research and research performers, including a plan to use the results from these reviews to guide future program decisions.

Completed The Board of Scientific Counselors (BOSC) reviewed the program for the first time in March 2007, and will subsequently review the program every 4-5 years, with mid-cycle reviews occurring between the comprehensive reviews. In support of its initial review, the program developed specific questions to be used to measure long-term progress, and identified specific data sources that could be provided to the BOSC to inform the review.
2004

Establish performance measures, including efficiency measures.

Completed The program formalized measures in conjunction with the 2008 PART Measure Review process.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Utility of ORD-identified and developed metrics for quantitatively assessing environmental systems for sustainability.


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research under Long-Term Goal 1. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating and BOSC recommendations as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2007 report can be found at: http://www.epa.gov/OSP/bosc/pdf/sust080321rpt.pdf. The program's formal action plan will be posted to http://www.epa.gov/OSP/bosc/reports.htm when available.

Year Target Actual
2007 N/A N/A
2011 Baseline Baseline
2015 TBD TBD
Long-term Outcome

Measure: Utility of ORD-developed decision support tools and methodologies for promoting environmental stewardship and sustainable environmental management practices.


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research under Long-Term Goal 2. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating and BOSC recommendations as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2007 report can be found at: http://www.epa.gov/OSP/bosc/pdf/sust080321rpt.pdf. The program's formal action plan will be posted to http://www.epa.gov/OSP/bosc/reports.htm when available.

Year Target Actual
2007 Baseline Exceeds Expectations
2011 Exceeds Expectations TBD
2015 Exceeds Expectations TBD
Long-term Outcome

Measure: Utility of innovative technologies developed or verified by ORD for solving environmental problems and contributing to sustainable outcomes.


Explanation:This measure captures the assessment by an independent expert review panel of the relevance, quality, and use of the program's research under Long-Term Goal 3. Using a well-defined, consistent methodology developed through an OMB/ORD/Board of Scientific Counselors (BOSC) workgroup, the BOSC provides a qualitative rating and summary narrative regarding the performance of each Long-Term Goal. Rating categories include: Exceptional, Exceeds Expectations, Meets Expectations, and Not Satisfactory. Full ratings are expected approximately every 4 years, though the BOSC will provide progress ratings at the mid-point between full program reviews. Targets for this measure are set using the previous BOSC rating and BOSC recommendations as a guide. The program outlines an action plan in response to BOSC recommendations; completion of the actions in this plan demonstrates progress from the baseline. The BOSC's 2007 report can be found at: http://www.epa.gov/OSP/bosc/pdf/sust080321rpt.pdf. The program's formal action plan will be posted to http://www.epa.gov/OSP/bosc/reports.htm when available.

Year Target Actual
2007 Baseline Meets Expectations
2011 Exceeds Expectations TBD
2015 Exceeds Expectations TBD
Annual Output

Measure: Percentage of planned outputs delivered in support of STS's goal that decision makers adopt ORD-identified and developed metrics to quantitatively assess environmental systems for sustainability.


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet the needs of EPA and other partners. To ensure that its annual output measures are ambitious, ORD has formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps ensure that outputs remain ambitious and useful to partners. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2003 100% 100% (4/4)
2004 N/A N/A
2005 N/A N/A
2006 N/A N/A
2007 N/A N/A
2008 100% TBD
2009 100% TBD
2010 100% TBD
Annual Output

Measure: Percentage of planned outputs delivered in support of STS's goal that decision makers adopt ORD-developed decision support tools and methodologies to promote environmental stewardship and sustainable environmental management practices.


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet the needs of EPA and other partners. To ensure that its annual output measures are ambitious, ORD has formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps ensure that outputs remain ambitious and useful to partners. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2003 100% 75% (6/8)
2004 100% 78% (7/9)
2005 100% 67% (4/6)
2006 100% 100% (2/2)
2007 100% 100% (3/3)
2008 100% TBD
2009 100% TBD
2010 100% TBD
Annual Output

Measure: Percentage of planned outputs delivered in support of STS's goal that decision makers adopt innovative technologies developed or verified by ORD to solve environmental problems, contributing to sustainable outcomes.


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet the needs of EPA and other partners. To ensure that its annual output measures are ambitious, ORD has formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps ensure that outputs remain ambitious and useful to partners. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2003 100% 75% (6/8)
2004 100% 29% (2/7)
2005 100% 90% (9/10)
2006 100% 94% (17/18)
2007 100% 100% (10/10)
2008 100% TBD
2009 100% TBD
2010 100% TBD
Annual Output

Measure: Percentage of planned outputs delivered in support of STS's goal that decision makers adopt ORD-developed decision support tools and methodologies to promote environmental stewardship and sustainable environmental management practices.


Explanation:At the end of the fiscal year, the program reports on its success in meeting its planned annual outputs (detailed in the program's Multi-Year Plan). The program strives to complete 100% of its planned outputs each year so that it can best meet the needs of EPA and other partners. To ensure that its annual output measures are ambitious, ORD has formalized the process for developing and modifying program outputs, including requiring that ORD programs engage partners when making modifications. Involving partners in this process helps ensure that outputs remain ambitious and useful to partners. In addition, EPA's Board of Scientific Counselors (BOSC) periodically reviews programs' goals and outputs and determines whether they are appropriate and ambitious.

Year Target Actual
2003 100% 75% (6/8)
2004 100% 78% (7/9)
2005 100% 29% (2/7)
2006 100% 90% (9/10)
2007 100% 94% (17/18)
2008 100% TBD
2009 100% TBD
2010 100% TBD
Annual Output

Measure: Percentage of Science and Technology for Sustainability (STS) publications rated as highly cited publications.


Explanation:This metric provides a systematic way of quantifying research performance and impact by counting the number of times an article is cited within other publications. The "highly cited" data are based on the percentage of all program publications that are cited in the top 10% of their field, as determined by Thomson's Essential Science Indicators (ESI). Each analysis evaluates the publications from the last ten-year period and is timed to match the cycle for independent expert program reviews by the Board of Scientific Counselors (BOSC). This "highly cited" metric provides information on the quality of the program's research, as well as the degree to which that research is impacting the science community. As such, it is an instructive tool both for the program and for independent panels, such as the BOSC, in their program reviews. To best establish ambitious and appropriate targets in the future, ORD will collect benchmarking information by conducting an analysis of bibliometric measures used in R&D programs outside of EPA.

Year Target Actual
2005 Baseline 34.2%
2007 N/A 28.2%
2009 29.2% TBD
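The "highly cited" percentage reduces to simple arithmetic: count the program publications whose citation totals reach the top-10% threshold for their field, and divide by the total publication count. The sketch below illustrates this with invented field names, thresholds, and citation counts; the actual analysis draws per-field baselines from Thomson's Essential Science Indicators rather than hand-supplied numbers.

```python
# Illustrative sketch of the "highly cited" metric. All figures below are
# hypothetical; the real analysis uses per-field top-10% citation
# thresholds from Thomson's Essential Science Indicators (ESI).

def highly_cited_share(publications, top10_threshold_by_field):
    """Percentage of publications at or above their field's top-10% threshold."""
    if not publications:
        return 0.0
    hits = sum(1 for field, citations in publications
               if citations >= top10_threshold_by_field[field])
    return 100.0 * hits / len(publications)

# Hypothetical ten-year publication set: (field, citation count)
pubs = [("Environment/Ecology", 45), ("Environment/Ecology", 12),
        ("Chemistry", 80), ("Chemistry", 9), ("Engineering", 30)]
thresholds = {"Environment/Ecology": 40, "Chemistry": 60, "Engineering": 25}

print(round(highly_cited_share(pubs, thresholds), 1))  # 60.0
```

In this invented example, three of the five papers clear their field thresholds, giving 60%; the 34.2% and 28.2% actuals in the table are the program's real values of this same ratio.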
Annual Output

Measure: Percentage of Science and Technology for Sustainability (STS) publications in "high impact" journals.


Explanation:This measure provides a systematic way of quantifying research quality and impact by counting those articles that are published in prestigious journals. The "high impact" data are based on the percentage of all program articles that are published in prestigious journals, as determined by Thomson's Journal Citation Reports (JCR). Each analysis evaluates the publications from the last ten-year period and is timed to match the cycle for independent expert program reviews by the Board of Scientific Counselors (BOSC). This "high impact" metric provides information on the quality of the program's research, as well as the degree to which that research is impacting the science community. As such, it is an instructive tool both for the program and for independent panels, such as the BOSC, in their program reviews. To best establish ambitious and appropriate targets in the future, ORD will collect benchmarking information by conducting an analysis of bibliometric measures used in R&D programs outside of EPA.

Year Target Actual
2005 Baseline 30.4%
2007 N/A 34.3%
2009 35.3% TBD
Annual Efficiency

Measure: Percent variance from planned cost and schedule.


Explanation:This measure captures the ability of the program to increase cost effectiveness based on the extent to which it delivers annual research outputs relative to the amount of funds spent. Using an approach similar to Earned Value Management, the data are calculated by: 1) determining the difference between planned and actual performance and cost for each Long-Term Goal, 2) adding these data together to generate program totals, and 3) dividing the Earned Value of all work completed by the Actual Cost of all program activities.

Year Target Actual
2004 Baseline -52.8%
2005 N/A -64.9%
2006 N/A -4.3%
2007 N/A Data lag (12/2008)
2008 N/A TBD
2009 N/A TBD
2010 N/A TBD
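The three-step calculation described in the explanation can be sketched as follows. The per-goal figures are invented for illustration: earned value (EV) stands in for the budgeted cost of work actually completed, actual cost (AC) for what was spent, and a negative variance (as in the actuals above) means the program delivered less earned value than it paid for.

```python
# Hypothetical sketch of the "percent variance from planned cost and
# schedule" measure, in the spirit of Earned Value Management (EVM).
# Per-Long-Term-Goal figures are invented for illustration only.

def percent_variance(goals):
    """goals: list of (earned_value, actual_cost) pairs, one per Long-Term Goal.

    Sums the per-goal figures into program totals, then expresses the gap
    between program-wide earned value and actual cost as a percentage of
    actual cost; negative means work cost more than planned.
    """
    total_ev = sum(ev for ev, _ in goals)  # program-wide earned value
    total_ac = sum(ac for _, ac in goals)  # program-wide actual cost
    return 100.0 * (total_ev - total_ac) / total_ac

# Invented per-LTG figures (in $ millions): (earned value, actual cost)
ltgs = [(6.0, 9.0), (5.0, 8.0), (4.0, 7.0)]
print(round(percent_variance(ltgs), 1))  # -37.5
```

Here $15M of earned value against $24M of actual cost yields -37.5%, comparable in form to the -52.8% baseline reported for 2004.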

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the pollution prevention and new technologies (P2NT) research program is to provide to state, local and federal governments, academia, industry (particularly small and medium-sized enterprises, known as SMEs), and citizen groups a suite of problem solving options (P2 Tools, CC&T, ETV, SBIR and SES) to more cost-effectively reduce high priority environmental risks. This program is administered by EPA's Office of Research and Development (ORD). (CC&T = clean chemistry and technology, ETV = environmental technology verification, SBIR = small business innovative research, SES = sustainable environmental systems).

Evidence: P2 Research Strategy (EPA/600/R-98/123, page 1; www.epa.gov/ORD/WebPubs/final/p2.pdf). P2NT Multi-Year Plan (MYP) page 2. Applicable authorizing legislation: 42 U.S.C.A. 13103 [PPA section 6604], 42 U.S.C.A. 7403 [CAA section 103]; 42 U.S.C.A. 1255 [FWPCA section 105]; 42 U.S.C.A. 300j-1 [PHSA section 1442]; 42 U.S.C.A. 6981 [SWDA section 8001], Small Business Reauthorization Act of 2000 (Pub. L. No. 106-554).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Many P2 opportunities exist for industry, as noted in a General Accounting Office (GAO) evaluation of EPA's P2 programs. The need for technology development is especially apparent for small and medium-sized firms.

Evidence: P2NT MYP (page 3). GAO. Environmental Protection: EPA Should Strengthen Its Efforts to Measure and Encourage Pollution Prevention. (GAO-01-283, February 2001, pgs 18-19, 21 & 23; www.gao.gov/new.items/d01283.pdf). Science Advisory Board (SAB). Toward Integrated Environmental Decision-Making. (EPA-SAB-EC-00-011, August 2000, pgs 10 & 13; www.epa.gov/sab/pdf/ecirp011.pdf). Toxics Release Inventory (www.epa.gov/tri/tridata/).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: While EPA is the only agency within the federal government with broad authority to protect all environmental media and regulate cross-media transfers, some aspects of its P2 research program are duplicative of the private sector. For example, the Science Advisory Board's (SAB's) Environmental Engineering Committee (EEC) noted that "some of the research projects and products walk a thin line between providing a useful product or service (one that would not otherwise be available) and infringing on the domain of commercially viable products and services. This is especially true in the area of software development." The EEC further encouraged written disclosure identifying the nature and types of technology products that ORD should or should not pursue. SAB's Research Strategies Advisory Committee (RSAC) stressed that EPA must convey how its program adds value to ongoing efforts in other agencies and the private sector, including addressing factors or providing information on why EPA believes it should pursue projects instead of other parties capable of conducting them. In addition, there may be overlap or duplication with another program within EPA: the Office of Pollution Prevention and Toxic Substances' (OPPT's) P2 program, which also aids industry by providing P2 tools to realize P2 opportunities. It is not clear whether there are reductions in chemical use or emissions resulting from ORD's program, nor is it apparent that this program results in more reductions than OPPT's P2 program.

Evidence: P2 Research Strategy (EPA/600/R-98/123, pages 5-15). Small Business Reauthorization Act of 2000 (Pub. L. No. 106-554). SAB (RSAC). Water Quality and Pollution Prevention Multiyear Plans: An SAB Review, EPA-SAB-RSAC-02-003, December 2001. SAB (EEC). An SAB Report: Review of ORD's Pollution Prevention Research Strategy, EPA-SAB-EEC-98-008 (www.epa.gov/science1/pdf/eec9808.pdf), July 1998.

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The ETV program has design flaws that limit the effectiveness of the program. The program has difficulty sustaining the involvement of smaller vendors due to long verification times and costs to vendors. In addition, in one instance a vendor claimed the verification as a certification in its advertisements, which the program had to legally remedy.

Evidence: Meeting with OMB on ETV, 2002, with Dr. Paul Gilman, EPA Assistant Administrator (AA) for the Office of Research and Development (ORD) and Science Advisor, and Mr. Tim Oppelt, then-Director of the ETV program.

NO 0%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: Among the primary barriers hindering the wider use of pollution prevention are the technical challenges associated with new and sometimes unproven techniques, such as technical uncertainties and considerable risk. Small firms face even more significant technical challenges in pursuing P2 options, which tend to require a great deal of technical sophistication and resources. Such firms, as well as medium-sized firms, typically do not have the time and resources to research and evaluate their options and therefore need assistance to help them identify and implement various options. SBIR and CC&T are aimed at small and medium-sized businesses.

Evidence: GAO suggested that such assistance be in the form of mentors, such as experts from larger firms. GAO. Environmental Protection: EPA Should Strengthen Its Efforts to Measure and Encourage Pollution Prevention. (GAO-01-283, February 2001; www.gao.gov/new.items/d01283.pdf)

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has not established any outcome-oriented long-term goals (LTGs), nor adequate outputs that might be combined to provide a picture of what the program aims to achieve. The LTGs that were presented as "outcomes" are outputs or process activities. It is important for EPA to develop some measurement that compares status before and after pollution prevention actions were put in place (i.e., the amount of chemical reduced or not used as a result of the tools and assistance the program provides industry). EPA should attempt to solicit from industry whether actual process changes or reductions in chemical use occurred due to the program's tools and assistance. While Cooperative Research and Development Agreements (CRADAs) are a good output indicator of the program's relevance to industry, they alone are not sufficient; a better indicator would include the level of cost-sharing by industry. In addition, the program has not listed an efficiency measure, although it states one in 3.4 and 4.3 (average time to fund competitive grants). The program should consider including this efficiency measure in the Measures tab of Section 2 until it develops a better one.

Evidence: The P2NT MYP.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: Received "No" for 2.1

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: While the program does not have LTGs, it has presented annual performance goals (APGs) that attempt to support the purpose of the program, which is to provide information to industry to assist in realizing pollution reduction opportunities. These APGs are too numerous and, more problematically, are the most basic of outputs, unlikely to result in outcomes when transferred to customers. The program must strive to show, beyond anecdotal evidence, that these tools will indeed be used by industry and contribute to outcomes (e.g., reductions in chemical use and/or emissions, or impacts on environmental quality or public health). The program must also develop an efficiency measure.

Evidence: The Measures Tab contains a subset of all the APGs presented by the program. These are the most preferred for the program from its initial list.

NO 0%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: Received "No" for 2.3.

Evidence:  

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Received "No" for 2.1 and 2.3.

Evidence:  

NO 0%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: In the last five years, the program has not had any evaluations that examine how well it is accomplishing its mission and meeting its long-term goals. Ideally, such evaluations would include recommendations on how to improve the program's performance. Almost all the studies cited by ORD focused on the program's planning process and did not evaluate performance; one review, which did not look at performance information, did note that the program must address strategic planning deficiencies regarding the lack of long-term outcome goals and of linkages from annual performance measures to LTGs. The joint ORD/Inspector General (IG) study, while not considered independent, observed that the program focuses primarily on outputs and that it should focus on outcomes. The most recent review of the ETV program addressed the incorporation of quality management, not performance. While ORD has plans for future reviews of its P2 Research Strategy and its MYPs, these, again, are process reviews.

Evidence: Reviews include: Joint ORD/OIG Program Evaluation Report. Design for Objective 8.4 Could Be Improved by Reorienting Focus on Outcomes, No. 2002-P-000002, November 2001. SAB (RSAC). Water Quality and Pollution Prevention Multiyear Plans: An SAB Review, EPA-SAB-RSAC-02-003, December 2001. SAB (EEC). An SAB Report: Review of ORD's Pollution Prevention Research Strategy, EPA-SAB-EEC-98-008 (www.epa.gov/science1/pdf/eec9808.pdf), July 1998. SAB (EEC). Review of EPA's Environmental Technology Verification Program, EPA-SAB-EEC-00-012 (www.epa.gov/sab/pdf/eec0012.pdf), August 2000.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Agency estimates and budgets for the full annual costs of operating its programs, taking into consideration any changes in funding, policy, and legislation. All spending categories, and the resource levels and activities associated with them, are included in the annual Congressional Justification. Performance data are considered at every step in EPA's planning and budgeting process (i.e., developing the OMB submission, the Congressional Justification, and the annual Operating Plan, and reporting results in the Annual Report). EPA managers use up-to-date financial, policy, and regulatory information to make decisions on program management and performance. The Agency's financial information is integrated with performance and other program data to support the day-to-day decision making of managers and executives.

Evidence: Annual Congressional Justification, Budget Automation System (BAS) reports. [EPA was selected as a government-wide finalist for the 2002 President's Quality Award in the area of budget and performance integration.]

YES 10%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The 2001 RSAC review specifically noted that it was not apparent that EPA "had engaged in the kind of strategic thinking required for this effort" to come up with outcome definitions. RSAC specifically noted that long-term goals were open-ended, and the Committee remained concerned that annual goals could not logically lead to meeting LTGs. RSAC also found that there was an apparent missing connection between the inventory of annual targets and LTGs. The Committee stated that it was difficult to understand how the collection of outputs in the MYP would eventually combine and contribute to achieving outcomes that further APGs or LTGs. RSAC advised the program that once outcomes are formulated, the strategic thinking process "moves backward" to the present in order to formulate the series of necessary steps to achieve the forward-looking outcomes. The joint ORD/OIG evaluation also noted that the program needed to focus on outcomes and that it should "improve program design to include performance measures related to short-term outcomes." The absence of adequate LTGs indicates that the program has not corrected its strategic planning deficiencies.

Evidence: SAB (RSAC). Water Quality and Pollution Prevention Multiyear Plans: An SAB Review, EPA-SAB-RSAC-02-003, December 2001. Joint ORD/OIG Program Evaluation Report. Design for Objective 8.4 Could Be Improved by Reorienting Focus on Outcomes, No. 2002-P-000002, November 2001.

NO 0%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: The program has not assessed or compared what its potential benefits might be in relation to other efforts that have similar goals either within the Agency, such as OPPT's P2 program within EPA, or in other agencies and the private sector.

Evidence:  

NO 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: The program cites annual reviews through ORD's research planning process to determine its relevance in addressing the needs of its customers. SAB reviews, however, found that the prioritization criteria were not distinct enough (1998) or were not addressed in the program's MYP (2001). The program finalized its MYP without remedying these findings; therefore, it is not clear how the program incorporates prioritization into its budget requests and funding decisions.

Evidence: SAB (RSAC). Water Quality and Pollution Prevention Multiyear Plans: An SAB Review, EPA-SAB-RSAC-02-003, December 2001. SAB (EEC). An SAB Report: Review of ORD's Pollution Prevention Research Strategy, EPA-SAB-EEC-98-008 (www.epa.gov/science1/pdf/eec9808.pdf), July 1998.

NO 0%
Section 2 - Strategic Planning Score 10%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The program has not collected information that would lead it to establish meaningful long-term goals or determine whether the program meets its purpose. For example, the program has not reviewed industry cost-sharing, which can serve as a proxy measure for R&D programs. Nor has the program attempted to obtain information on the use of its tools within industry. This lack of information limits the program's ability to evaluate itself. The exception is the competitive grants component of the program, Science to Achieve Results (STAR) grants: grantees are required to report annual progress and final results, including publications and significant accomplishments, which are posted on a public web site. They are also required to participate in periodic program review workshops with other grantees and EPA staff to review progress and findings. Contractors and holders of cooperative agreements are also monitored on a regular basis to ensure their progress is compatible with the overall aims of the MYP, but the MYP has shortcomings.

Evidence: STAR Web Site (http://es.epa.gov/ncer/).

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: The program incorporates program performance into personnel performance evaluation criteria. Management is accountable for specific performance standards relating to program goals. The program also monitors progress against GPRA targets through quarterly reporting and mid-year reviews with the Deputy Administrator. For contracts and grants, statements of work, deliverables, costs, and schedules are written into award terms. All ORD Project Officers (POs) are responsible for ensuring that each agreement is awarded and managed according to government regulations in a way that provides value to the government and the public.

Evidence: SES Performance Standards. Project Officer Training (www.epa.gov/oamintra/training/index.htm).

YES 10%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Prior to the beginning of the fiscal year, the program develops an operating plan that reflects how it plans to spend its budget (as requested in the President's Budget). Resources are allocated by goal, objective, subobjective, program, and object class. Programs then adjust the operating plan to reflect appropriated levels. EPA's budget and annual Operating Plan are aligned with the Agency's Strategic Plan and approved by OMB and Congressional Appropriations Committees. Obligations and expenditures are tracked in the Agency's Integrated Financial Management System (IFMS) against the Operating Plan. Fund transfers between program objectives in excess of Congressionally established limits require Congressional notification and/or approval. EPA works with grantees to ensure that their work plans reflect the Agency's Strategic Plan and Operating Plan and that recipient spending is consistent with the approved workplan. Each program office and grants management office conducts post-award monitoring of assistance agreements, including monitoring the draw-down of funds against grantee progress on workplan tasks and deliverables. This monitoring ensures that recipients are spending the funds designated to each program area for the intended purpose. All grantees are required to submit annual or more frequent financial status reports.

Evidence: End of year obligation reports. EPA's annual Operating Plan and Congressional Justification, EPA's Strategic Plan, Budget Automation System (BAS) data, EPA's Annual Report and Financial Statements. EPA's Policy on Compliance, Review, and Monitoring (EPA 5700.6). Advanced post-award monitoring (i.e., on- and off-site grantee review) reports, documentation of post-award monitoring in assistance agreement files, grantee financial status reports.

YES 10%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: While ORD is undertaking efforts related to this issue (the average length of time it takes to make grant awards, IT business cases that discuss efficiency improvements), nothing is currently available to measure efficiency of the program.

Evidence: National Academy of Sciences (NAS) Review of STAR grants program (page 103) (The Measure of STAR - www4.nationalacademies.org/news.nsf/isbn/0309089387?OpenDocument)

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: P2 research involves extensive collaboration with EPA program offices (OAR, OPPTS, OW), other agencies (DOE, DOD, NIST, USDA, NASA), and other non-Federal organizations (Electric Power Research Institute, Council for Chemical Research, NATO, P2 Roundtable, UN Environmental Program, American Chemical Society, and American Institute of Chemical Engineers). EPA professionals and their verification partners work with over 800 stakeholders in 21 separate stakeholder groups to facilitate performance evaluation of innovative environmental technologies. Several agencies have entered into cooperative agreements with ETV.

Evidence: P2 Research Strategy (EPA/600/R-98/123, pages 7-9). P2NT MYP (pages 5-6, 12-19). TSE Grants and SBIR: es.epa.gov/ncer. ETV Program MOUs with: DOD: www.epa.gov/etv/sitedocs/memo_agreement_estcp.html; Coast Guard: www.epa.gov/etv/sitedocs/memo_agreement_uscg.html; State of Massachusetts: www.epa.gov/etv/sitedocs/memo_agreement_ma.html.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: The program follows EPA's financial management guidelines for committing, obligating, reprogramming, and reconciling appropriated funds. Agency officials have a system of controls and accountability, based on GAO and other principles, to ensure that improper payments are not made. At each step in the process, the propriety of the payment is reviewed. EPA trains individuals to ensure that they understand their roles and responsibilities for invoice review and for carrying out the financial aspects of program objectives. EPA received an unqualified audit opinion on its FY02 financial statements and had no material weaknesses associated with the audit. EPA is taking steps to meet the new accelerated due dates for financial statements. The P2NT program has no material weaknesses as reported by the Office of the Inspector General (OIG) and has procedures in place to minimize erroneous payments.

Evidence: Annual Congressional Justification, Budget Automation System (BAS) reports, unqualified audit opinion on EPA FY02 financial statements, Fiscal Year 2002 Advice of Allowance Letter, 2002 Integrity Act Report, resource policies at: http://intrasearch.epa.gov/ocfo/policies.

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: In September 2000, GAO released a report recommending a number of program management improvements for the STAR grants program. Subsequently, EPA identified the STAR grants program as one of its Major Management Challenges in Fiscal Year 2001 for its lack of performance measures. It was recommended that the program assess its outcomes and evaluate whether the grants contribute value to EPA in meeting its priorities. The program addressed the GAO findings and pursued opportunities to resolve its status as a Major Management Challenge, including charging NAS to conduct a review and make recommendations. EPA is currently reviewing NAS's recommendations.

Evidence: EPA. Fiscal Year 2001 Annual Report, p. III-7. NAS. The Measure of STAR: Review of the U.S. Environmental Protection Agency's Science to Achieve Results (STAR) Research Grants Program (www.nap.edu/openbook/0309089387/html/R9.html), 2003. GAO. Environmental Research: STAR Grants Focus on Agency Priorities, but Management Enhancements Are Possible, RCED-00-170, September 2000.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: All P2 research grants are awarded through ORD's competitive STAR grants program, using external scientific peer reviewers to rate applications based on scientific merit. Only applications rated as 'excellent' or 'very good' (usually 10-20% of proposals) are considered for funding based on relevance to EPA programmatic priorities. To attract new investigators, research solicitations are announced in the Federal Register, posted on EPA's National Center for Environmental Research (NCER) website for at least 90 days, emailed to institutions and individuals that have indicated an interest in receiving them, distributed at scientific conferences, and disseminated to researchers by other federal agencies.

Evidence: STAR Website (RFA announcements: es.epa.gov/ncer). NRC review, 'The Measure of STAR,' April, 2003 (www.nap.edu/books/0309089387/html/)

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation:  

Evidence:  

NA  %
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: An annual progress report is submitted by each grantee and posted on the EPA National Center for Environmental Research website. Grantee reports are distributed to EPA staff to disseminate to interested parties. These reports include summaries of progress in relation to project objectives as well as publications of research results. Grantees also present results at numerous scientific conferences held annually. Summaries of P2 research accomplishments are posted on EPA's website.

Evidence: STAR Website (progress reports and publications lists: es.epa.gov/ncer). NRC review, 'The Measure of STAR,' April 2003 (www.nap.edu/books/0309089387/html/).

YES 10%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: The program allocates funding to outside sources (not competitive grants) through Cooperative Research and Development Agreements (CRADAs), in which industry commits to providing resources or in-kind assistance. It is not clear what management processes are in place to maintain program quality, particularly because the program lacks adequate data collection to set acceptable goals.

Evidence: P2NT MYP (page 8). ORD Planning Guidance. Overview of the EPA Quality System for Environmental Data and Technology, (EPA/240/R-02/003; www.epa.gov/quality/qs-docs/overview-final.pdf).

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 70%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: Received "No" for 2.1.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Received "No" for 2.3.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: ORD is undertaking efforts related to this issue but cannot demonstrate results at this time. Efforts include monitoring the average length of time it takes EPA to make grant awards. The National Academy of Sciences (NAS) recently reviewed the Science to Achieve Results (STAR) program, and ORD is in the process of responding to the NAS recommendations. In addition, ORD is developing IT business cases that document how particular projects improve efficiency.

Evidence: NAS. Review of STAR grants program (page 103) (The Measure of STAR - www4.nationalacademies.org/news.nsf/isbn/0309089387?OpenDocument)

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: The program claims that its P2 tools development is the most comprehensive among all known programs in the public and private sectors (DOE, NIST, NSF, private companies selling proprietary software, and academia); however, no evaluation or documentation of these comparisons exists. The private sector has developed tools, similar to EPA's, to help industry realize P2 opportunities. SAB's EEC review noted "that some of the research projects and products walk a thin line between providing useful product or service . . . and infringing on the domain of commercially viable products and service. This is especially true in the area of software development." Other aspects of the program, such as SES, SBIR, and CC&T, address areas not adequately addressed by other entities, resulting in the program receiving "Small Extent."

Evidence: CRADAs reflect the request for collaboration and cooperative research and development for small companies and academic partners. CRADAs #0157-98 (PARIS II); #0239-02 (ET&E Data Base). WAR Algorithm: www.epa.gov/ORD/NRMRL/std/sab/sim_war.htm. LCA Website: www.epa.gov/ORD/NRMRL/std/sab/LCA.htm. TRACI Website: www.epa.gov/ORD/NRMRL/std/sab/iam_traci.htm.

SMALL EXTENT 7%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Program received "No" in 2.6. A common finding from reviews is that the program does not focus on outcomes. The program has failed to develop any outcome goals to address these findings. The IG review (2001) observed that ORD focuses primarily on outputs, and that it should focus on outcomes. The report noted that "program managers agreed" to this observation. The RSAC review (2001) noted that the program must address strategic planning deficiencies regarding a lack of development of long-term outcome goals and linkages from annual performance measures to LTGs and demonstrate in its MYP how it has addressed three of its five priority setting criteria. While RSAC considers the MYPs to be an essential part of EPA's research and budget planning and strongly recommended that ORD continue its improvement efforts, it seems as if the program finalized and implemented its P2 MYP without addressing the RSAC's recommendations and/or findings. SAB's EEC (1998) noted that "some of the research projects and products walk a thin line between providing a useful product or service (one that otherwise would not be available) and infringing on the domain of commercially viable products and service. This is especially true in the area of software development."

Evidence: "Water Quality and Pollution Prevention Multi-Year Plans: An SAB Review" (EPA-SAB-RSAC-02-003 December 2001; www.epa.gov/sab/pdf/sabrsac02003.pdf).

NO 0%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 7%


Last updated: 09062008.2003SPR