ExpectMore.gov


Detailed Information on the Nuclear Physics Assessment

Program Code 10000114
Program Title Nuclear Physics
Department Name Department of Energy
Agency/Bureau Name Department of Energy
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 80%
Program Management 66%
Program Results/Accountability 87%
Program Funding Level
(in millions)
FY2007 $412
FY2008 $433
FY2009 $510

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Develop new annual measures to track how efficiently the program's facilities are operated and maintained on a unit-per-dollar basis by July 2007.

Action taken, but not completed The Office of Nuclear Physics is in the process of developing new annual performance measures for its four National User Facilities.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Responding to the recommendations of recent advisory committee reports, including implementing a budget-constrained and phased plan for the future of its research facilities.

Completed The DOE/NSF Nuclear Science Advisory Committee Long Range Plan Working Group meeting was held in May 2007. The report is expected by December 2007. The FY 2009 Budget submission will be set in the context of the new Long Range Plan and other recent Advisory Committee reports and will complete this action.
2006

Engaging the National Academies, including experts outside of nuclear physics, to study the scientific capabilities of a proposed rare isotope accelerator in an international context.

Completed The National Academy of Sciences report entitled "Scientific Opportunities with a Rare-Isotope Facility in the United States" was published with a copyright date of 2007. The report is available on the Office of Nuclear Physics website (http://www.sc.doe.gov/np/).
2006

Maximizing operational efficiency of major experimental facilities in response to increasing power costs.

Completed The Operational Efficiency Review was completed in August 2006. The collected data were updated in the context of the FY 2007 Appropriation, and the final report was completed on November 5, 2007.
2006

In cooperation with other Office of Science programs, develop and deploy a modern, streamlined, and cost-effective information management system for tracking the university grant proposal review and award process. This system should be in place by the end of FY 2007.

Completed DOE (including SC) uses GRANTS.GOV, as mandated, to receive all proposals; the award of financial assistance agreements is tracked through the DOE Procurement Acquisition Data System (PADS).
2007

Participate in the development of a plan, due to OMB by March 1, 2007, to address serious deficiencies in the program's Exhibit 300 business cases for capital projects.

Completed Methods to address OMB concerns were submitted to OMB on 2/20/2007.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Progress in realizing a quantitative understanding of the quark substructure of the proton, neutron, and simple nuclei by comparison of precision measurements of their fundamental properties with theoretical calculations. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a quinquennial basis.


Explanation: An external panel will conduct reviews of progress every 5 years. See www.sc.doe.gov/measures for more information.

Year Target Actual
2007 Excellent
2012 Excellent
2017 Successful
Long-term Outcome

Measure: Progress in searching for, and characterizing the properties of, the quark-gluon plasma by recreating brief, tiny samples of hot, dense nuclear matter. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a quinquennial basis.


Explanation: An external panel will conduct reviews of progress every 5 years. See www.sc.doe.gov/measures for more information.

Year Target Actual
2007 Excellent
2012 Excellent
2017 Successful
Long-term Outcome

Measure: Progress in investigating new regions of nuclear structure, studying interactions in nuclear matter like those occurring in neutron stars, and determining the reactions that created the nuclei of atomic elements inside stars and supernovae. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a quinquennial basis.


Explanation: An external panel will conduct reviews of progress every 5 years. See www.sc.doe.gov/measures for more information.

Year Target Actual
2007 Excellent
2012 Excellent
2017 Successful
Long-term Outcome

Measure: Progress in determining the fundamental properties of neutrinos and fundamental symmetries by using neutrinos from the sun and nuclear reactors and by using radioactive decay measurements. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a quinquennial basis.


Explanation: An external panel will conduct reviews of progress every 5 years. See www.sc.doe.gov/measures for more information.

Year Target Actual
2007 Excellent
2012 Excellent
2017 Successful
Annual Output

Measure: Weighted average number (within 20% of baseline estimate) of billions of events recorded by experiments in Hall A, Hall B, and Hall C at the Continuous Electron Beam Accelerator Facility.


Explanation: In Nuclear Physics, the average number of events recorded by the detectors is a good indicator of progress. The events that researchers are most interested in are rare, so the more events they record, the more likely they are to capture what they want to study. Increasing the number of events recorded can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget. Investments in the most basic areas of research, such as physics, not only spark our imagination and advance our curiosity about the universe in which we live; historically, they have also paid handsome dividends in new technologies that have raised our standard of living and even extended our life expectancy. Examples include Magnetic Resonance Imaging (MRI), the development of lasers, the testing of spacecraft and satellite materials, techniques for detecting nuclear materials to protect U.S. borders, and high-performance computers. An illustrative sketch of this weighted-average tolerance check follows the table below.

Year Target Actual
2001 Established in FY04 3.3, 9.9, 2.2
2002 Established in FY04 2.8, 9.9, 2.7
2003 Established in FY04 3.0, 9.0, 2.6
2004 2.4, 7.2, 2.1 2.3, 7.7, 2.2
2005 2.9, 9.6, 2.8 2.8, 8.1, 2.1
2006 1.45, 7.7, 1.7 1.77, 9.93, 1.9
2007 2.2, 11.6, 2.6 2.5, 12.4, 3.0
2008 4, 20, 5
2009 4, 20, 5
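
The following sketch is illustrative only and is not part of the assessment: it shows, in Python, how a weighted-average data-delivery target of this kind might be checked against its tolerance, using the FY 2007 CEBAF figures from the table above. The weighting scheme is not specified in this document, so equal weights are assumed here; the same calculation applies to the RHIC and ATLAS/HRIBF measures below, with their respective 30% and 20% tolerances.

    # Hypothetical sketch: weighted-average event count vs. baseline tolerance.
    def weighted_average(values, weights):
        # Weighted mean; with equal weights this reduces to a simple average.
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    def within_tolerance(actual, baseline, tolerance=0.20):
        # True if the actual value lies within +/- tolerance of the baseline.
        return abs(actual - baseline) <= tolerance * baseline

    # FY 2007 CEBAF figures (billions of events for Halls A, B, C).
    target = [2.2, 11.6, 2.6]
    actual = [2.5, 12.4, 3.0]
    weights = [1.0, 1.0, 1.0]  # assumed equal weighting; not given in the source

    print(within_tolerance(weighted_average(actual, weights),
                           weighted_average(target, weights)))
    # True: roughly 5.97 vs. 5.47, about 9% above the baseline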
Annual Output

Measure: Weighted average number (within 30% of baseline estimate) of millions of heavy-ion collision events sampled by the PHENIX detector and recorded by the STAR detector, respectively, at the Relativistic Heavy Ion Collider.


Explanation: In Nuclear Physics, the average number of events recorded by the detectors is a good indicator of progress. The events that researchers are most interested in are rare, so the more events they record, the more likely they are to capture what they want to study. Increasing the number of events recorded can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget. Investments in the most basic areas of research, such as physics, not only spark our imagination and advance our curiosity about the universe in which we live; historically, they have also paid handsome dividends in new technologies that have raised our standard of living and even extended our life expectancy. Examples include Magnetic Resonance Imaging (MRI), the development of lasers, the testing of spacecraft and satellite materials, techniques for detecting nuclear materials to protect U.S. borders, and high-performance computers.

Year Target Actual
2002 Established in FY04 170, 8.2
2003 Established in FY04 5500, 38
2004 900, 40 1300, 28
2005 1800, 40 8600, 117
2006 No heavy-ion running No heavy-ion running
2007 6,500; 60 5,100; 86.6
2008 7,500; 60
2009 2,300; 40
Annual Output

Measure: Weighted average number (within 20% of baseline estimate) of billions of events recorded by experiments at the Argonne Tandem Linac Accelerator System and Holifield Radioactive Ion Beam facilities, respectively.


Explanation: In Nuclear Physics, the average number of events recorded by the detectors is a good indicator of progress. The events that researchers are most interested in are rare, so the more events they record, the more likely they are to capture what they want to study. Increasing the number of events recorded can be technically challenging but is also critically important to the facility users. This measure tracks how we actually perform versus what we plan to deliver. The target numbers are heavily influenced by the schedule of experiments and the expected program budget. Investments in the most basic areas of research, such as physics, not only spark our imagination and advance our curiosity about the universe in which we live; historically, they have also paid handsome dividends in new technologies that have raised our standard of living and even extended our life expectancy. Examples include Magnetic Resonance Imaging (MRI), the development of lasers, the testing of spacecraft and satellite materials, techniques for detecting nuclear materials to protect U.S. borders, and high-performance computers.

Year Target Actual
2001 Established in FY04 7.7, 3.4
2002 Established in FY04 2.5, 5.4
2003 Established in FY04 39, 2.1
2004 25, 5.3 20, 4.2
2005 25, 3 28.1, 3.8
2006 17.5, 1.4 24.6, 7.1
2007 22, 1.8 26.7, 7.1
2008 22; 2.4
2009 23; 2.3
Annual Efficiency

Measure: Achieve an average operating time of the scientific user facilities of at least 80% of the total scheduled annual operating time.


Explanation: This annual measure assesses the reliability and dependability of the operation of the scientific user facilities. Many of the research projects undertaken at the Office of Science's scientific user facilities take a great deal of time, money, and effort to prepare, and regularly have a very short window of opportunity to run. If the facility is not operating as expected, the experiment could be ruined or critically set back. In addition, taxpayers have invested millions of dollars in these facilities; the longer the facilities operate reliably, the greater the return on that investment. A simple illustration of this calculation follows the table below.

Year Target Actual
2001 >80% 85%
2002 >80% 89%
2003 >80% 88%
2004 >80% 89%
2005 >80% 87%
2006 >80% 94%
2007 >80% 91%
2008 >80%
2009 >80%
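
As a simple illustration (the hours below are hypothetical, not drawn from the assessment), this measure reduces to the ratio of achieved operating time to scheduled operating time:

    # Hypothetical sketch: facility reliability as a percentage of schedule.
    def uptime_percent(achieved_hours, scheduled_hours):
        return 100.0 * achieved_hours / scheduled_hours

    # A facility scheduled for 4,000 hours that delivers 3,640 hours
    # achieves 91%, meeting the >80% target.
    print(uptime_percent(3640, 4000) > 80)  # True (91.0%)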
Annual Efficiency

Measure: Keep the cost-weighted mean percentage variances from established cost and schedule baselines each within 10% for major construction, upgrade, or equipment procurement projects.


Explanation: This annual measure assesses whether the major construction projects are adhering to their specified cost and schedule baselines. Adhering to the cost and schedule baselines for a complex, large-scale science project is critical to meeting the scientific requirements for the project and to being a good steward of the taxpayers' investment in the project. The Office of Science has a rigorous process in place for overseeing the management of these large-scale, complex scientific projects and has been recognized, both inside government and by private organizations, for the effectiveness of this process. A sketch of the cost-weighted calculation follows the table below.

Year Target Actual
2003 <10%, <10% 0%, 0%
2004 no projects no projects
2005 no projects no projects
2006 no projects no projects
2007 no projects no projects
2008 <10%, <10%
2009 <10%, <10%
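
As a sketch of the underlying arithmetic (the project costs and variances below are hypothetical), each project's percentage variance from its baseline is weighted by the project's total cost; the same calculation is performed separately for cost and for schedule variances, matching the paired "<10%, <10%" targets above.

    # Hypothetical sketch: cost-weighted mean percentage variance.
    def cost_weighted_variance(projects):
        # projects: list of (total_cost_in_millions, percent_variance) pairs
        total_cost = sum(cost for cost, _ in projects)
        return sum(cost * var for cost, var in projects) / total_cost

    # A $50M project at 4% variance and a $150M project at 8% variance
    # yield a cost-weighted mean of 7%, within the <10% target.
    print(cost_weighted_variance([(50.0, 4.0), (150.0, 8.0)]) < 10)  # True (7.0%)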

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the Nuclear Physics (NP) program is to foster fundamental research in nuclear physics that will provide new insights and advance our knowledge of the nature of matter and energy, and to develop the scientific knowledge, technologies, and trained manpower needed to underpin DOE missions.

Evidence: FY04 Budget Request (www.mbe.doe.gov/budget/04budget/index.htm). Public Law 95-91, which established the Department of Energy (DOE). The NP Mission has been validated by the Nuclear Science Advisory Committee (NSAC).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The NP program addresses five key questions: (1) What is the structure of the nucleon? (2) What is the structure of nucleonic matter? (3) What are the properties of hot nuclear matter? (4) What is the nuclear microphysics of the universe? (5) What is to be the new Standard Model?

Evidence: NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The Office of Science (SC) NP program is the principal source of federal funding for basic, long-term research in Nuclear Physics.

Evidence: More than 90% of U.S. Nuclear Physics research is supported by this program. The remaining 10% is supported by the National Science Foundation (NSF) and coordinated through NSAC - a joint advisory committee.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The NP program is based on competitive merit review, independent expert advice, and community planning. However, a Committee of Visitors (COV) has yet to validate the merit review system.

Evidence: NSAC reviews and reports (www.sc.doe.gov/production/henp/np/nsac/nsac.html). Program files.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: NSAC ensures that input from the nuclear physics research community is regularly gathered to assess new opportunities, priorities, and progress of the program. Peer review is used to assess the relevance and quality of each project.

Evidence: NSAC reviews and reports (www.sc.doe.gov/production/henp/np/nsac/nsac.html). Program files.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The four long-term measures reflect the key scientific drivers that the U.S. nuclear physics community has outlined for the field for roughly the next decade. The program has defined "successful" and "minimally effective" performance milestones for each measure, and an external panel will assess interim program performance, and update the measures as necessary, every five years. It is inappropriate for a basic research program such as this one to have a quantitative long-term efficiency measure.

Evidence: NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf). National Research Council report, "Nuclear Physics: The Core of Matter, the Fuel of Stars" (books.nap.edu/catalog/6288.html). A description of the "successful" and "minimally effective" milestones, and an explanation of the relevance of these measures to the field can be found on the SC Web site (www.sc.doe.gov/measures).

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: NSAC has reviewed the new long-term measures for this program and found them to be ambitious and meaningful indicators of progress in the field. The external reviews described in 2.1 will update the measures, targets, and timeframes on an interim basis.

Evidence: Letter from NSAC chair regarding review of long-term measures.

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: The quantitative annual output measures for facility construction and operations, and the data-delivery goals for the program's major facilities, serve as proxies for progress: efficient, on-cost, and on-schedule delivery of scientific data from these large facilities is a critical resource for the continuing scientific discoveries that are directly connected to the program's long-term goals.

Evidence: FY04 Budget Request. Website with further information, including explanation of data delivery measures (www.sc.doe.gov/measures).

YES 10%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: All of the annual measures have baseline data (FY01 and/or FY02) that demonstrate that the targets are ambitious, yet realistic. A 20-30 percent tolerance is used to guard against facilities unwisely stressing hardware near the end of the fiscal year.

Evidence: FY04 Budget Request. Website with further information (www.sc.doe.gov/measures). Construction variance target of <10% comes from OMB Circular A-11, especially Capital Programming Guide supplement.

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: A limited FY03 audit by the DOE Inspector General (IG) found that "performance expectations generally flowed down into the scope of work at the national laboratories." For individual grantees, NP uses general solicitations that do not explicitly include program goals.

Evidence: Memo from the DOE IG to the Director of the Office of Science. M&O contract performance evaluation provisions (e.g., Appendix B in contracts for Jefferson Lab, www.sura.org/DOE/m&o_contract.html; and, Brookhaven Lab, www.bnl.gov/prime/searchprime.asp). Example of recent general renewal solicitation (www.science.doe.gov/grants/Fr03-01.html).

NO 0%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: All research projects undergo merit review; ongoing grants are reviewed triennially; major facilities are reviewed annually; and, construction projects are reviewed quarterly. NSAC produces planning documents and assessments of various components of the NP program on a rotating basis. NP is working to begin a Committee of Visitors (COV) review process for the program on a triennial basis, and expects the first review in 2003.

Evidence: SC Merit Review guidelines (www.sc.doe.gov/production/grants/merit.html). Program files, including Lehman review reports and program advisory committee reports. NSAC reports, including Long-Range Plan, reviews of Low and Medium Energy subprograms, and recent charge letter to NSAC for review of education, theory, and neutron program elements (www.sc.doe.gov/production/henp/np/nsac/nsac.html). Letter from DOE to NSAC establishing a regular evaluation process utilizing a COV.

YES 10%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: DOE has not yet provided a budget request that adequately integrates performance information.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: New performance measures and targets have been developed in coordination with OMB. A new COV process is being organized, with the first program review in 2003. The U.S. nuclear physics community has recently completed a long-range strategic plan for the field. As part of the SC strategic planning process, NSAC recently issued a 20-year facilities priority plan for NP.

Evidence: Letter from DOE to NSAC establishing a regular evaluation process utilizing a COV. NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf).

YES 10%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: NSAC provides advice to the program on alternative approaches to addressing key physics questions. The program relies on the Lehman review process and program reviews to monitor construction projects. Facility scientific program advisory committees help prioritize facility research. The program does not currently support a capital project for which an Exhibit 300 is required, so no PART-level project-specific alternatives analyses have been necessary.

Evidence: NSAC reviews and reports (www.sc.doe.gov/production/henp/np/nsac/nsac.html). Program files, including Lehman reports and program advisory committee reports.

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: This is a basic R&D program, and the question is intended for industry-related R&D programs.

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Although not visible outside DOE, internal SC budget formulation practices include a priority-ranking process. The NSAC Long-Range Plan identified strategic priorities for the U.S. nuclear physics community. Regular NSAC reviews of subprograms have made recommendations, including constant-level-funding scenarios and facility shutdowns. Such reviews have proven useful for program planning and should serve as a model for responsible committee advice.

Evidence: NSAC Long-Range Plan, Low Energy, and Medium Energy reviews (www.sc.doe.gov/production/henp/np/nsac/nsac.html).

YES 10%
Section 2 - Strategic Planning Score 80%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: A great deal of project performance information is collected via Lehman facility operations reviews and annual facility reviews, and management changes are made in response to these reviews. The program collects performance data from individual grantees and national labs, and uses peer review as a form of standardized quality control at the individual-grant level. However, there is not yet a systematic process, such as regular COV evaluations, for validating research portfolio quality and processes. While the DOE IG contracts with an outside auditor to check internal controls for performance reporting, and the IG periodically conducts limited reviews of performance measurement in SC, it is not clear that these audits check the credibility of performance data reported by DOE contractors.

Evidence: Program files, including Lehman reviews and subprogram reviews. Reporting requirements for grants (www.science.doe.gov/production/grants/605-19.html).

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Senior Executive Service (SES) and Program Manager Performance Plans are directly linked to program goals. The Management and Operations contracts for the Labs and Facilities include performance measures linked to program goals. Research funding requirements ensure consideration of past performance.

Evidence: Program and personnel files, including grant renewal statistics. Performance-based contract fee evaluation provisions (e.g., Jefferson Lab, www.sura.org/DOE/m&o_contract.html; and, Brookhaven Lab, www.bnl.gov/prime/searchprime.asp). 10 CFR 605 (www.science.doe.gov/production/grants/605index.html).

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes.

Evidence: SC programs consistently obligate more than 99.5% of available funds. Program files. Audit reports.

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes.

Evidence: SC reengineering information (www.screstruct.doe.gov). Program files.

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program is well coordinated with a similar program at NSF through a joint Advisory Committee (NSAC) that has produced a recent coordinated strategic plan for nuclear physics. Several experiments at large facilities are jointly funded with NSF and/or international partners. The program has yet to demonstrate adequate coordination and collaboration with other countries (namely Germany and Japan) on future rare isotope accelerators.

Evidence: NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf), including chapter on international collaboration. List of joint projects with other offices/agencies/countries.

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: SC staff execute the NP program consistent with established DOE budget and accounting policies and practices. These policies have been reviewed by external groups and modified as required to reflect the latest government standards.

Evidence: Various Departmental manuals. Program files. Audit reports.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: SC is currently reengineering to improve program management efficiency. A Committee of Visitors (COV) process is being implemented. A layer of management above NP in the SC structure was recently removed.

Evidence: SC reengineering information (www.screstruct.doe.gov). Program files.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Community input, through NSAC, is gathered on what capabilities are needed to address scientific opportunities. The NP program documents the capabilities and characteristics of new facilities at critical decision points, which are examined through independent Lehman reviews. Progress is tracked quarterly through program reviews and annually through Lehman reviews.

Evidence: NSAC reviews, including the 1999 ISOL task force report (www.sc.doe.gov/production/henp/np/nsac/nsac.html). Program files, including Lehman operations review reports and the STAR Barrel Electromagnetic Calorimeter Enhancement project management plan.

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: First-time grant applications are encouraged in all Requests for Proposals. The NP program has a specific solicitation for the Outstanding Junior Investigator (OJI) program, in which awards are made to young, non-tenured faculty. Merit review guides all funding decisions. However, the award and merit review process has not yet been validated by a COV.

Evidence: In FY 2002 the NP program received 31 new research proposals, of which 8 (26%) were approved for funding; 5 OJI awards were made. "How to apply" (www.science.doe.gov/production/grants/guide.html).

NO 0%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: In addition to reviewing grantee progress reports, program managers stay in contact with grantees through e-mail and telephone, and conduct program reviews and site visits.

Evidence: Program files, including a list of multiple annual site visits to lab and university groups.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: In accordance with DOE Order 241.1A, the final and annual technical reports of program grantees are made publicly available on the web through the Office of Scientific and Technical Information's "Information Bridge". However, program-level aggregate data on the impact of the grants program are not adequately communicated in the annual DOE Performance and Accountability Report.

Evidence: DOE Order 241.1A. Information Bridge (www.osti.gov/bridge/). FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf).

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: Priorities are determined in accordance with guidance from NSAC plans and reviews. Unsolicited field work proposals from the federal labs are merit-reviewed but not competed. Funds for research programs and scientific user facilities at the federal labs are allocated through a limited-competition process analogous to the unlimited process outlined in 10 CFR 605. Lehman and other peer reviews of user facilities are conducted annually. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf). SC Merit Review procedures (www.sc.doe.gov/production/grants/merit.html). 10 CFR 605 (www.science.doe.gov/production/grants/605index.html). Separate university and lab solicitations for RIA R&D. Program files, including Lehman reviews of operations at major facilities, and a Jefferson Lab facility peer review.

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 66%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: NSAC will evaluate progress toward the long-term performance measures every five years. NSAC and National Research Council (NRC) reviews of progress in the program over the past decade have found good scientific progress.

Evidence: NSAC Long-Range Plan ("Recent accomplishments, p. 4, www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf). NRC Decade Survey report ("Schiffer Report," Introduction, www.nap.edu/catalog/6288.html)

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: NP met all but one of its annual performance goals in FY02. The one goal that was not met on time had no adverse effect on the facility.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/content/perfplan/perfplan.pdf).

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The recent history of tracking the two "efficiency" measures for facility construction and operation management shows that, on average, the program continues to meet expectations.

Evidence: FY04 Budget Request. Program files.

YES 20%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: DOE supports over 90% of the U.S. nuclear physics basic research program via this program; the balance is supported by the NSF. The two programs are highly coordinated, including through a common Advisory Committee (NSAC). A significant number of projects involve international collaborations. An international benchmarking study has not been done, due in part to its questionable value.

Evidence: Program files, including list of international projects. "International collaborations and cooperation" chapter in NSAC Long-Range Plan (www.sc.doe.gov/production/henp/np/nsac/docs/LRP_5547_FINAL.pdf)

NA  %
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: NSAC reviews of the major NP program elements have determined that the program is effective in achieving results. These reviews examine scientific progress against the long-range plan, assess scientific opportunities, and recommend priorities based upon realistic budget profiles. Program advisory committees and Lehman facility operations reviews are generally favorable.

Evidence:

YES 20%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: All NP construction/operation projects met cost and schedule performance goals during the first two quarters of FY03. No contingency remains in the FY04 data collection schedule for the new BLAST detector at MIT/Bates.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/content/perfplan/perfplan.pdf). List of FY03 quarterly milestones. Program files.

YES 20%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 87%


Last updated: 09062008.2003SPR