ExpectMore.gov


Detailed Information on the
Advanced Scientific Computing Research Assessment

Program Code 10000074
Program Title Advanced Scientific Computing Research
Department Name Department of Energy
Agency/Bureau Name Department of Energy
Program Type(s) Research and Development Program
Competitive Grant Program
Capital Assets and Service Acquisition Program
Assessment Year 2003
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 70%
Program Management 66%
Program Results/Accountability 87%
Program Funding Level
(in millions)
FY2007 $276
FY2008 $351
FY2009 $369

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Implementing action plans for improving program management in response to past expert reviews, especially the development of new annual facility measures for 2008 Budget execution.

Action taken, but not completed Delays in bringing on more staff have delayed training plans, but training has been developed and ASCR will implement it as soon as new staff are on board.
2006

Engaging advisory panel and other outside groups in regular, thorough scientific assessments of the quality, relevance, and performance of its research portfolio and computing/network facilities.

Action taken, but not completed COV for INCITE Program conducted April 2008. COV report expected August 2008. Program response expected Sept 2008. Next COV, for Computer Science program, will be conducted in 2009.
2007

Engage advisory panel in an assessment of the strategic priorities for the program, focusing on the balance between "core" research and supercomputing hardware investments.

Action taken, but not completed ASCAC was charged in August 2007. A draft report was presented at the February 2008 ASCAC meeting but was not accepted by the committee. A revised report is expected at the August 2008 ASCAC meeting.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Engaging advisory panel and other outside groups in assessments of the program's progress in achieving its long-term goals, and in studies that revisit the strategic priorities for the program.

Completed ASCAC reports on progress toward long-term goals were received 11-8-06. The committee found "Excellent" progress in the multiscale effort and "Good" progress in the GTL effort, with some concerns regarding "buy-in" from the biology community.
2006

In cooperation with other Office of Science programs, develop and deploy a modern, streamlined, and cost-effective information management system for tracking the university grant proposal review and award process. This system should be in place by the end of FY 2007.

Completed DOE (including SC) uses GRANTS.GOV, as mandated, to receive all proposals; tracking the award of financial assistance agreements is through the DOE Procurement Acquisition Data System (PADS).
2007

Improving the quality of the supporting materials for the Office of Science IT Exhibit 300 business cases submitted to OMB, especially the alternative analysis, acquisition strategy, and risk management sections.

Completed
2007

Participate in the development of a unified, action-based strategy for SC-wide collaboration in accelerator and detector R&D (including advanced accelerator concepts) by March 1, 2007.

Completed Strategy submitted to OMB 2/26/2007
2007

Develop new annual measures to track how efficiently the program's supercomputers are operated and maintained on a unit per dollar basis by July, 2007.

Completed ASCR is continuing to implement the Office of Science Operational Review Plan, which provides more robust information than a single operations metric.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Progress toward developing the mathematics, algorithms, and software that enable scientifically-critical models of complex systems, including highly nonlinear or uncertain phenomena, or processes that interact on vastly different scales, or contain both discrete and continuous elements. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Excellent
2009 Excellent
2012 Excellent
2015 Successful
Long-term Outcome

Measure: Progress toward developing the computational science capability to model a complete microbe and a simple microbial community. An independent expert panel will conduct a review and rate progress (excellent, good, fair, poor) on a triennial basis.


Explanation: See www.sc.doe.gov/measures for more information.

Year Target Actual
2006 Excellent Good
2009 Excellent
2012 Excellent
2015 Successful
Annual Output

Measure: Focus usage of the primary supercomputer at the National Energy Research Scientific Computing Center (NERSC) on capability computing. Thirty percent (30%) of the computing time will be used by computations that require at least 1/8 (2,040 processors) of the NERSC resource.


Explanation: There are two possible orientations of a large-scale computing center like the National Energy Research Scientific Computing Center: capacity computing and capability computing. Capacity computing refers to the goal of maximizing the overall throughput of the system, regardless of the size of the individual computational tasks. This tends to favor jobs that require less time and fewer computational nodes, since balancing the load across a large multiprocessor machine is easier with smaller jobs, much the same way that a bucket is filled more completely with sand (small grains) than with baseballs (large grains). In general, these jobs could be run on smaller computing systems. Capability computing refers to the goal of maximizing the scientific impact of the computing center by focusing on large-scale and long-term computing jobs that are impossible or impractical to execute on a smaller system. This can have the effect of lowering the overall throughput of the system, since there will be times when processing elements are idle because no suitable pending job matches the available resources. Both capability and capacity computing are critical for scientific progress. The goal of the Office of Science is to ensure that the NERSC center is oriented toward capability computing over capacity computing, so that the needs of large-scale challenges in computational science--the challenges that cannot be met on smaller computers--are adequately addressed. This measure ensures that the center remains oriented toward capability computing. (An illustrative calculation of this fraction follows the table below.)

Year Target Actual
2002 50% 75%, 22%
2003 50% 36%
2004 50% 49%
2005 40% 68%
2006 40% 51%
2007 40% 68%
2008 30%
2009 30%
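To illustrate how this output measure might be computed, the sketch below tallies the share of total computing time consumed by jobs at or above the 2,040-processor threshold. It is a minimal illustration only: the Job record fields and the sample numbers are hypothetical, and the assessment does not describe NERSC's actual accounting pipeline.

```python
# Hypothetical sketch: share of NERSC computing time used by "capability"
# jobs, i.e. jobs that require at least 2,040 processors (1/8 of the machine,
# per the measure). Job fields and sample data are invented for illustration.

from dataclasses import dataclass

CAPABILITY_THRESHOLD_PROCS = 2_040  # 1/8 of the NERSC resource, per the measure


@dataclass
class Job:
    processors: int    # processors allocated to the job
    cpu_hours: float   # processor-hours charged to the job


def capability_fraction(jobs: list[Job]) -> float:
    """Fraction of total charged time consumed by jobs at or above the threshold."""
    total = sum(j.cpu_hours for j in jobs)
    capability = sum(j.cpu_hours for j in jobs
                     if j.processors >= CAPABILITY_THRESHOLD_PROCS)
    return capability / total if total else 0.0


if __name__ == "__main__":
    sample = [Job(4096, 1.2e6), Job(512, 0.8e6), Job(2048, 0.5e6)]
    print(f"Capability share: {capability_fraction(sample):.0%}")  # prints 68%
```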
Annual Efficiency

Measure: Improve Computational Science Capabilities. Average annual percentage increase in the computational effectiveness (either by simulating the same problem in less time or simulating a larger problem in the same time) of a subset of the application codes.


Explanation: Progress in computational science is enabled by reducing the time required to simulate today's problem or by enabling the simulation of a more complex system in the same time. The first type of advance increases the breadth and comprehensiveness of our exploration of today's simulations because it enables simulation of more important physical cases. The second type of advance enables scientists to explore new frontiers by adding more comprehensive physics, finer spatial and time scales, and accurate coupling of multiple processes. For example, the first type of advance might permit exploring 10 chemical compounds rather than just one. The second might permit new simulations of combustion processes that include fine-scale fluid motions and detailed understanding of the 1,000 chemical species in a diesel engine. This measure evaluates the contribution of research in applied mathematics and computer science to scientific discovery in the other programs within the Office of Science. It is a key measure of ASCR's success in enhancing scientific discovery. It should be noted that in many cases of interest the improvement due to this type of advance is equal to the advance due to hardware speed. (An illustrative calculation of this measure follows the table below.)

Year Target Actual
2003 10% 3,181%
2004 50% 200%
2005 50% 65%
2006 50% 135%
2007 100% 2,365%
2008 100%
2009 100%
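The sketch below illustrates the arithmetic implied by this efficiency measure, treating computational effectiveness as problem size per unit of wall-clock time so that either a faster run of the same problem or a larger problem in the same time raises it. The function names, the tracked application codes, and the numbers are hypothetical; the program's actual benchmarking procedure is not described in this assessment.

```python
# Hypothetical sketch of the computational-effectiveness calculation: each
# tracked application code's effectiveness is taken as problem size divided by
# wall-clock time, and the annual measure is the average percentage increase
# across the tracked codes. All names and numbers below are illustrative.

def percent_increase(old: float, new: float) -> float:
    """Percentage increase of the new effectiveness over the old."""
    return (new / old - 1.0) * 100.0


def average_annual_increase(per_code: dict[str, tuple[float, float]]) -> float:
    """Average of the per-code percentage increases across the tracked subset."""
    increases = [percent_increase(old, new) for old, new in per_code.values()]
    return sum(increases) / len(increases)


if __name__ == "__main__":
    # (baseline effectiveness, current-year effectiveness) per application code
    tracked = {
        "combustion_code": (1.0, 2.5),  # 150% increase
        "climate_code":    (1.0, 1.6),  # 60% increase
    }
    print(f"Average annual increase: {average_annual_increase(tracked):.0f}%")  # 105%
```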

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The mission of the Advanced Scientific Computing Research (ASCR) program is to discover, develop, and deploy the computational and networking tools that enable researchers in the scientific disciplines to analyze, model, simulate, and predict complex phenomena important to the Department of Energy (DOE). To accomplish this mission the program fosters and supports fundamental research in advanced scientific computing (applied mathematics, computer science, and networking) and operates supercomputer, networking, and related facilities.

Evidence: FY 2004 Budget Request (www.mbe.doe.gov/budget/04budget/index.htm). Public Law 95-91 that established the Department of Energy (DOE). The ASCR Mission has been validated by the Advanced Scientific Computing Advisory Committee (ASCAC).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The ASCR program addresses the specific need for the Department of Energy's Office of Science (SC) to develop large-scale, complex, high-performance simulation capabilities to accelerate civilian scientific advancement focused on the mission needs of the DOE, and secondarily on the needs of the broader scientific community.

Evidence: This program was specifically authorized in the "High Performance Computing Act of 1991" (PL 102-194). The "Scientific Discovery through Advanced Computing (SciDAC)" plan describes the issues and the program's strategic vision circa 2000 (www.osti.gov/scidac/SciDAC.pdf).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The ASCR program is unique in addressing the specific computational needs and challenges of civilian R&D in the DOE. ASCR is coordinated with other Federal programs through the Interagency Working Group on IT R&D (IWG/IT R&D) to ensure that efforts are not needlessly redundant. The most recent strategic vision for the program (SciDAC) briefly describes relationships with the computing programs at DOE's National Nuclear Security Administration and other Federal agencies.

Evidence: IWG/IT R&D (www.itrd.gov/iwg/program.html). SciDAC plan (see above).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The ASCR program is based on competitive merit review, independent expert advice, and joint program planning, a design that has proven efficient and effective. However, a Committee of Visitors (COV) has yet to independently validate ASCR's merit review process.

Evidence: ASCAC reports (www.sc.doe.gov/ascr/adviscommittee.html). Joint planning efforts include SciDAC, Genomes to Life (doegenomestolife.org), and computational nanoscience (www.sc.doe.gov/production/bes/besac/Theory%20and%20Modeling%20in%20Nanoscience.pdf). Program reviews and files.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: ASCAC ensures that research community input is regularly gathered to assess the priorities and progress of the program. SciDAC efforts are tightly linked to the application programs (and associated advisory committees). Peer review is used to assess the relevance and quality of each project.

Evidence: ASCAC reviews and reports. SciDAC reports (www.osti.gov/scidac). Program files.

YES 20%
1.RD1

Does the program effectively articulate potential public benefits?

Explanation:  

Evidence:  

NA  %
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation:  

Evidence:  

NA  %
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: While not comprehensive, the two long-term measures reflect key goals for the underlying mathematics and computer science research sponsored by ASCR, and provide a test case for the computation component of the Genomes to Life SciDAC effort. The program has defined "successful" and "minimally effective" performance milestones for each measure, and an external panel will assess interim program performance on a triennial basis, and update the measures as necessary. It is inappropriate for a basic research program such as this one to have a quantitative long-term efficiency measure.

Evidence: SciDAC goals are outlined in program plan (www.osti.gov/scidac), and GTL-specific goals are online at doegenomestolife.org. A description of the "successful" and "minimally effective" milestones, and an explanation of the relevance of these measures to the field can be found on the SC Web site (www.sc.doe.gov/measures).

YES 10%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: ASCAC has reviewed the new long-term measures for this program and found them to be ambitious and meaningful indicators of progress toward computer science, applied mathematics, and SciDAC goals.

Evidence: Letter from ASCAC chair regarding review of long-term measures.

YES 10%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: ASCR has developed quantitative annual output measures that are indicators of progress toward the long-term measures, primarily because they focus on efficiently providing the computational capabilities (hardware and the underlying applied math and computer science) necessary for enabling improved scientific progress.

Evidence: FY04 Budget Request. Description of measures and their relationship to long-term goals (www.sc.doe.gov/measures). Brief description of the "best value" procurement process alluded to in the procurement measure (www.nersc.gov/research/annrep01/03systems.html#NERSC4).

YES 10%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: All of the annual measures include quantifiable annual targets. The new efficiency measure quantifies ambitious performance improvements over current rates. Baseline data (FY02 and FY03) for the procurement and NERSC usage measures demonstrate the targets to be ambitious, yet realistic.

Evidence: FY04 Budget Request. Description of measures and their relationship to long-term goals (www.sc.doe.gov/measures). NERSC FY02 Annual Report (www.nersc.gov/research/annrep02/html/).

YES 10%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: ASCR program solicitations for research grants do not yet explicitly include specific program goals, though Federal program managers attempt to fund a grant portfolio that is aimed at the long-term goals of the program. For contractors, a limited FY03 audit by the DOE Inspector General (IG) found that "performance expectations generally flowed down into the scope of work at the national laboratories." Management and Operations (M&O) contracts for the labs contain generic "scientific quality" performance-based evaluation provisions.

Evidence: Most recent general renewal solicitation (www.science.doe.gov/grants/Fr03-02.html). Memo from the DOE IG to the Director of the Office of Science. M&O contract performance evaluation provisions (Web-accessible examples include: Oak Ridge National Lab, www.ornl.gov/Contract/UT-BattelleContract.htm; and Lawrence Berkeley National Lab, www.lbl.gov/LBL-Documents/Contract-98/AppFTOC.html).

NO 0%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: ASCAC has conducted a fairly light review of the program's facilities to gauge relevance and quality, but there have not been similar portfolio-level peer reviews of the research program by an independent panel. The program does not yet have COV evaluations of any program elements, but expects to receive the first COV report by April 2004.

Evidence: ASCAC facilities review report (www.krellinst.org/esinfo/ASCAC-facilities-final.mhw.doc).

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: DOE has not yet provided a budget request that adequately integrates performance information.

Evidence:

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: In addition to active participation in a current interagency roadmapping task force on high-end computing, ASCR has held a series of strategic planning workshops, participated in the drafting of a new Office of Science strategic plan, and developed new performance goals and targets in coordination with OMB. A new COV process is being organized, with the first program element review expected back by April 2004. However, the activity level of ASCAC is below that of other Office of Science advisory committees.

Evidence: Interagency task force (www.itrd.gov/hecrtf-outreach/index.html). Networking workshop (www.hep.anl.gov/may/ScienceNetworkingWorkshop). Science applications workshop (www.pnl.gov/scales). Program files, including COV charge letter to ASCAC chair. ASCAC report activity (www.sc.doe.gov/ascr/ascac_reports.htm).

YES 10%
2.CA1

Has the agency/program conducted a recent, meaningful, credible analysis of alternatives that includes trade-offs between cost, schedule, risk, and performance goals and used the results to guide the resulting activity?

Explanation: One-of-a-kind research facilities are not amenable to the same type of alternatives analysis as other capital asset investments. Nevertheless, the Exhibit 300s provided to OMB contain roughly equivalent analyses, which typically compare the attributes of various computer vendors' systems--using appropriate "best value" metrics--before making a procurement decision.

Evidence: Brief description of "best value" procurement for program's production facility, National Energy Research Scientific Computing Center (NERSC, www.nersc.gov/research/annrep01/03systems.html#NERSC4).

YES 10%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program to other efforts that have similar goals?

Explanation: This is a basic R&D program, and the question is intended for industry-related R&D programs.

Evidence:  

NA 0%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: Although not visible outside DOE, internal SC budget formulation practices include a priority ranking process. ASCR is currently drafting a strategic plan--with the input of external community workshops--as a part of the overall SC planning process. ASCR has engaged the advisory process for the computing components of other SC programs. However, the program has not yet fully engaged ASCAC in its prioritization process, and it is not always obvious that program level budget execution decisions are made within a prioritization framework.

Evidence: ASCAC reports (www.sc.doe.gov/ascr/adviscommittee.html; topical computing centers report not on Web site). Engagement with other SC programs' advisory processes include: Genomes to Life (doegenomestolife.org) and computational nanoscience (www.sc.doe.gov/production/bes/besac/Theory%20and%20Modeling%20in%20Nanoscience.pdf).

YES 10%
Section 2 - Strategic Planning Score 70%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: Facility user surveys and benchmarking provide operational performance information. The program collects performance data from individual grantees and national labs, and uses peer review as a form of standardized quality control at the individual grant level. However, there is not yet a systematic process, such as regular COV evaluations, for validating research portfolio quality and review processes. While the DOE IG contracts with an outside auditor to check internal controls for performance reporting, and the IG periodically conducts limited reviews of performance measurement in SC, it is not clear that these audits check the credibility of performance data reported by DOE contractors.

Evidence: Facility user surveys and user groups/committees (hpcf.nersc.gov/about, www.es.net, www.ccs.ornl.gov/CHUG.html). Program files, including peer review of the facilities. Reporting requirements for grants (www.science.doe.gov/production/grants/605-19.html).

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: Senior Executive Service (SES) and Program Manager Performance Plans are directly linked to program goals. The Management and Operations (M&O) contracts for the Labs and User Facilities include performance measures linked to program goals. Research funding requirements ensure consideration of past performance.

Evidence: Program and personnel files. For performance-based fee adjustments on M&O contracts, see evidence for question 2.5. Grant rules for renewals (www.science.doe.gov/grants/#GrantRules).

YES 8%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Using DOE's monthly accounting reports, SC personnel monitor progress toward obligating funds consistent with an annual plan that is prepared at the beginning of the fiscal year to ensure alignment with appropriated purposes. SC programs consistently obligate more than 99.5% of available funds.

Evidence: Program files. DOE-wide audit reports.

YES 8%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: SC is currently undergoing a reengineering exercise aimed at flattening its organizational structure and improving program effectiveness. The program will collect the data necessary to track its "efficiency" measure. The system performance measures used by NERSC ensure maximum return on procurement investments.

Evidence: SC reengineering information (www.screstruct.doe.gov). See "Measures" tab for the programmatic efficiency measure. NERSC system performance measures (www.nersc.gov/aboutnersc/presentations/Sc99/SC99Kramer6/SC99Kramer6.PPT, and hpcf.nersc.gov/about/ERSUG/meeting_info/May03/May03_Presentations/Wong/NERSC_Perf_Eval_Activities.ppt).

YES 8%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The ASCR program is involved in numerous formal and informal collaborations with other programs in advanced scientific computing research, though primarily with national security agencies as opposed to other civilian science agencies. ASCR is a leading agency in the Interagency Working Group on IT R&D of the National Science and Technology Council, including co-chairing a current task force on high-end computing.

Evidence: Summary of joint activities with other agencies (www.sc.doe.gov/ascr/hitchcock.ppt). Interagency Working Group on IT R&D (www.itrd.gov/iwg).

YES 8%
3.6

Does the program use strong financial management practices?

Explanation: SC staff execute the ASCR program consistent with established DOE budget and accounting policies and practices. These policies have been reviewed by external groups and modified as required to reflect the latest government standards.

Evidence: Various Departmental manuals. Program files. Audit reports.

YES 8%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: SC is currently reengineering to improve program management efficiency. A new COV process is being organized by ASCR, with the first program element review expected back by April 2004.

Evidence: SC reengineering information (www.screstruct.doe.gov). COV charge letter to ASCAC chair, including scope, conflict of interest issues, and future schedule.

YES 8%
3.CA1

Is the program managed by maintaining clearly defined deliverables, capability/performance characteristics, and appropriate, credible cost and schedule goals?

Explanation: Procurement contracts with computer vendors tie payments to specific deliverables, including the sustained system performance measured over the lifetime of the contract.

Evidence: Exhibit 300s submitted to OMB. Program files, including competitive performance proposals from vendors.

YES 8%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: First-time grant applications are encouraged in all Requests for Proposals. ASCR has a specific solicitation for a new Early Career Principal Investigator (ECPI) program and makes investments in minority institutions under the HBCU/MI program. However, the award and merit review process has not yet been validated by a COV.

Evidence: There were 26 new and 9 renewed ASCR grantees in FY2002. In addition, there were 70 new and 9 renewed grantees in FY2001 (includes new programs for SciDAC & Microbial Cell). ECPI website (www.sc.doe.gov/production/grants/Fr02-16.html).

NO 0%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: In addition to grantee progress reports, program managers stay in contact with grantees through email and telephone, and conduct program reviews and site visits.

Evidence: Reporting requirements for grants (www.science.doe.gov/production/grants/605-19.html). Program files, including documentation of program manager site visits, etc.

YES 8%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: In accordance with DOE Order 241.1A, the final and annual technical reports of program grantees are made publicly available on the web through the Office of Scientific and Technical Information's "Information Bridge". However, program-level aggregate data on the impact of the grants program is not adequately communicated in the annual DOE Performance and Accountability report.

Evidence: DOE Order 241.1A. Information Bridge (www.osti.gov/bridge/). FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf).

NO 0%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: ASCAC facility reviews, facility steering committees, and user surveys validate the quality of the scientific user facilities. Unsolicited field work proposals from the Federal Labs are merit reviewed, but not competed. The funds for research programs and scientific user facilities at the Federal Labs are allocated through a limited-competition process analogous to the unrestricted process outlined in 10 CFR 605. However, the quality of the research funded via this process has not yet been validated by a COV.

Evidence: ASCAC facility report (www.krellinst.org/esinfo/ASCAC-facilities-final.mhw.doc). Unsolicited proposals (see 10 CFR 600.6, professionals.pr.doe.gov/ma5/MA-5Web.nsf/FinancialAssistance/Part+600). Example of lab solicitation, with field work proposal reference (www.science.doe.gov/grants/LAB03_17.html). Merit review procedures (www.sc.doe.gov/production/grants/merit.html). 10 CFR 605 (www.science.doe.gov/production/grants/605index.html). Facility user surveys and user groups/committees (hpcf.nersc.gov/about, www.es.net, www.ccs.ornl.gov/CHUG.html). Program files, including peer review of the facilities.

NO 0%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation:  

Evidence:  

NA  %
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation:  

Evidence:  

NA  %
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation:  

Evidence:  

NA  %
Section 3 - Program Management Score 66%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: ASCAC will evaluate progress toward the new long-term performance measures every three years, but no external portfolio-level reviews are available other than the generally positive facilities report by ASCAC. Early results indicate that the SciDAC effort appears to be successful, which is important for achieving the future goals of the program.

Evidence: ASCAC facilities review report (www.krellinst.org/esinfo/ASCAC-facilities-final.mhw.doc). SciDAC update at latest ASCAC meeting (www.sc.doe.gov/ascr/Laub031403.ppt).

LARGE EXTENT 13%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Although the three annual performance goals for FY05 are new, ASCR has met the targets for most of its former annual measures.

Evidence: FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). FY04 Annual Performance Plan (www.mbe.doe.gov/budget/04budget/content/perfplan/perfplan.pdf).

YES 20%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The sustained system performance metric used by NERSC for procurements has resulted in machines with more compute nodes delivered by the vendor than originally planned, which in turn allows more scientific simulations to be carried out.

Evidence: Program files, including procurement contracts.

YES 20%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: While user surveys regularly show a fairly high level of satisfaction with ASCR facilities, expert comparative analyses of the program as a whole have not been done. The program has a unique role in serving the needs of the other five SC research programs, and the DOE mission more broadly, so the value of such analyses is questionable at best given the interconnectedness of the U.S. computing community.

Evidence: NERSC Annual User Survey (hpcf.nersc.gov/about/survey/).

NA  %
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: The ASCR facilities are effective in achieving desired results, based on assessment by the ASCAC in their facilities report, and based on external peer review of both NERSC and ESnet. However, no independent review process has been carried out to assess the program's research portfolio.

Evidence: ASCAC facilities review report (www.krellinst.org/esinfo/ASCAC-facilities-final.mhw.doc). Program files, including ESnet and NERSC peer review results.

LARGE EXTENT 13%
4.CA1

Were program goals achieved within budgeted costs and established schedules?

Explanation: Performance data for FY02 and FY03 demonstrate that the capital asset procurements, primarily for NERSC acquisitions, were almost exactly on schedule and on budget. This excellent performance can be primarily attributed to the sustained system performance metric used for these procurements, which focuses on the actual performance of the resource available to the end users rather than on the theoretical performance of a proposed system. (An illustrative sketch of such a metric follows this entry.)

Evidence: Exhibit 300s submitted to OMB. FY02 Performance and Accountability Report (www.mbe.doe.gov/stratmgt/doe02rpt.pdf). Brief description of "best value" procurement for NERSC (www.nersc.gov/research/annrep01/03systems.html#NERSC4).

YES 20%
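The assessment does not define the sustained system performance metric itself; the sketch below shows one plausible formulation, in which per-processor rates measured on a set of application benchmarks are aggregated with a geometric mean and scaled by the number of processors delivered. The benchmark rates, processor count, and aggregation choice are illustrative assumptions, not NERSC's actual procurement formula.

```python
# Illustrative (not NERSC's actual) sustained-system-performance style metric:
# per-processor rates achieved on a set of application benchmarks are combined
# with a geometric mean, so no single code dominates, then scaled by the
# number of processors delivered. All numbers below are invented.

from math import prod


def sustained_system_performance(per_proc_rates_gflops: list[float],
                                 processors: int) -> float:
    """Geometric mean of per-processor benchmark rates times processor count."""
    n = len(per_proc_rates_gflops)
    geo_mean = prod(per_proc_rates_gflops) ** (1.0 / n)
    return geo_mean * processors


if __name__ == "__main__":
    benchmark_rates = [0.35, 0.50, 0.28, 0.42]  # GFLOP/s per processor, per benchmark
    print(f"SSP ~ {sustained_system_performance(benchmark_rates, 6080):,.0f} GFLOP/s")
```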
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation:  

Evidence:  

NA  %
Section 4 - Program Results/Accountability Score 87%


Last updated: 09062008.2003SPR