ExpectMore.gov


Detailed Information on the
Investment in Research Infrastructure and Instrumentation Assessment

Program Code 10004405
Program Title Investment in Research Infrastructure and Instrumentation
Department Name National Science Foundation
Agency/Bureau Name National Science Foundation
Program Type(s) Research and Development Program
Competitive Grant Program
Assessment Year 2006
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 100%
Program Management 100%
Program Results/Accountability 80%
Program Funding Level
(in millions)
FY2007 $472
FY2008 $611
FY2009 $633

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Monitoring performance against recently established annual targets and making appropriate management or other adjustments to improve its performance.

Completed Because the NSF Research Infrastructure and Instrumentation budget investment category no longer exists under the new Strategic Plan, beginning in FY07, the broadening participation goal will be reported for a significant component of that former category: the Major Research Instrumentation Program. NSF achieved the goal in that program in FY07.
2006

Ensuring increased timeliness of yearly project reports from award recipients.

Completed On Nov. 18, 2006, changes were implemented in the Project Reports System to enable NSF to monitor and enforce timely submission of annual and final project reports by PIs. Annual reports are due 90 days prior to the report period end date and are required for all standard and continuing grants and cooperative agreements. Final reports are due within 90 days after expiration of the award. Policy documents have been updated to reflect the changes.
2006

Ensuring that a greater share of award decisions are made available to applicants within six months of proposal receipt or deadline date.

Completed Because the NSF Research Infrastructure and Instrumentation budget investment category no longer exists under the new Strategic Plan, beginning in FY 2007 time-to-decision results will be reported for a significant component of that former category: the Major Research Instrumentation program. NSF achieved the 6-month time to decision goal for this program in FY07.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that NSF has demonstrated "significant achievement" in expanding opportunities for U.S. researchers, educators, and students at all levels to access state-of-the-art S&E facilities, tools, databases, and other infrastructure.


Explanation:Assessment by the external Advisory Committee for GPRA Performance Assessment of "significant achievement" in expanding opportunities for U.S. researchers, educators, and students at all levels to access state-of-the-art S&E facilities, tools, databases, and other infrastructure.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that NSF has demonstrated "significant achievement" in developing and deploying an advanced cyberinfrastructure to enable all fields of science and engineering to fully utilize state-of-the-art computation.


Explanation:Assessment by the external Advisory Committee for GPRA Performance Assessment of "significant achievement" in developing and deploying an advanced cyberinfrastructure to enable all fields of science and engineering to fully utilize state-of-the-art computation.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that NSF has demonstrated "significant achievement" in providing for the collection and analysis of data on the scientific and technical resources of the U.S. and other nations to inform policy formulation and resource allocation.


Explanation:Assessment by the external Advisory Committee for GPRA Performance Assessment of "significant achievement" in providing for the collection and analysis of data on the scientific and technical resources of the U.S. and other nations to inform policy formulation and resource allocation.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Long-term Outcome

Measure: External validation by the Advisory Committee for GPRA Performance Assessment that NSF has demonstrated "significant achievement" in supporting research that advances instrument technology and leads to the development of next-generation research and education tools.


Explanation:Assessment by the external Advisory Committee for GPRA Performance Assessment of "significant achievement" in supporting research that advances instrument technology and leads to the development of next-generation research and education tools.

Year Target Actual
2003 Success Success
2004 Success Success
2005 Success Success
2006 Success Success
2007 Success Success
2008 Success
2009 Success
2010 Success
2011 Success
2012 Success
Annual Efficiency

Measure: For 70 percent of the proposals submitted to the Major Research Instrumentation (MRI) Program, be able to inform applicants about funding decisions within six months of proposal receipt or deadline, or target date, whichever is later, while maintaining a credible and efficient merit review system.


Explanation:Because the program category "Research Infrastructure and Instrumentation" no longer exists under NSF's new Strategic Plan, data on the measures associated with the PART Program can no longer be tracked. Therefore, the Foundation decided to highlight one major component of that PART Program, the MRI Program, and report results for the time to decision measure for FY 2007 and beyond.

Year Target Actual
2003 N/A 71%
2004 N/A 69%
2005 N/A 73%
2006 70% 67%
2007 70% 87%
2008 70%
2009 70%
2010 70%
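The time-to-decision calculation behind this measure can be sketched as follows. This is an illustrative example only: the function name, the 183-day approximation of six months, and the sample dates are assumptions for the sketch, not NSF's actual reporting code.

```python
from datetime import date, timedelta

def share_within_six_months(proposals):
    """Share of proposals whose funding decision came within roughly six
    months (183 days here) of proposal receipt or deadline/target date,
    whichever is later."""
    within = 0
    for received, deadline, decided in proposals:
        # The six-month clock starts at the later of receipt and deadline.
        clock_start = max(received, deadline) if deadline else received
        if decided - clock_start <= timedelta(days=183):
            within += 1
    return within / len(proposals)

# Hypothetical sample: (receipt date, deadline/target date, decision date).
sample = [
    (date(2007, 1, 10), date(2007, 1, 15), date(2007, 6, 1)),   # in time
    (date(2007, 1, 10), date(2007, 1, 15), date(2007, 9, 30)),  # too late
]
print(f"{share_within_six_months(sample):.0%}")
```

Under this sketch, the FY 2007 actual of 87% would mean that 87 of every 100 MRI proposals cleared the six-month threshold so computed.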
Annual Output

Measure: Number of distinct science/engineering/education users who make use of the TeraGrid.


Explanation:TeraGrid is one of the components of NSF's shared cyberinfrastructure. It consists of a network of eight high-speed computers located throughout the country and linked to each other via a dedicated national network. Each of these computers can process trillions of operations per second--that is, many thousands of times faster than a modern desktop computer. Access to these machines is open to the science/engineering/education community on the basis of peer-reviewed proposals and provides the entire science/engineering/education community with the high-performance computer power needed to address major problems in science and engineering. Current capabilities of the system provide for over 40 teraflops of computing power; nearly 2 petabytes of rotating storage; and specialized data analysis and visualization resources, interconnected at 10-30 gigabits/second. The measure will report on the number of distinct science/engineering/education users who access these computers directly through authorized user accounts or through internet science gateways. While determining the number of unique users with authorized accounts is straightforward, there is a possibility of double-counting science gateway users. Every effort will be made to ensure that users entering through science gateways are counted only once in a reporting period.

Year Target Actual
2004 N/A 600
2005 N/A 1800
2006 2500 3200
2007 3500 4195
2008 4500
2009 6000
2010 8000
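The deduplication described in the explanation above (counting each user once per reporting period, whether they arrive via a direct account or a science gateway) amounts to taking a set union of user identifiers. This is a hypothetical sketch that assumes stable identifiers shared across both access paths; the function and the sample names are illustrative, not part of NSF's actual accounting.

```python
def count_distinct_users(account_logins, gateway_sessions):
    """Count distinct users across direct accounts and science gateways,
    counting each user at most once per reporting period."""
    return len(set(account_logins) | set(gateway_sessions))

direct = ["alice", "bob", "carol"]
gateway = ["bob", "dave", "dave"]  # "bob" also holds a direct account
print(count_distinct_users(direct, gateway))
```

Here "bob" and the repeated "dave" sessions collapse to single users, giving 4 distinct users rather than the 6 raw records.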
Annual Output

Measure: Maintain a high percentage of proposals submitted to the Major Research Instrumentation (MRI) Program from academic institutions not in the top 100 of NSF funding recipients.


Explanation:Because the program category "Research Infrastructure and Instrumentation" no longer exists under NSF's new Strategic Plan, data on the measures associated with the PART Program can no longer be tracked. Therefore, the Foundation decided to highlight one major component of that PART Program, the MRI Program, and report results for the measure for academic institutions not in the top 100 of NSF funding recipients for FY 2007 and beyond.

Year Target Actual
2003 0% 42%
2004 0% 42%
2005 0% 43%
2006 44% 44%
2007 45% 63%
2008 45%
2009 45%
2010 45%

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Infrastructure and Instrumentation (I&I) program is clearly stated: The NSF Act of 1950, as amended, established NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense." The I&I program reflects those parts of NSF's mission directed at programs to "strengthen scientific and engineering research potential" and to "support the development and use of computers and other scientific methods and technologies, primarily for research and education in the sciences and engineering." These mission directives lead to the five program elements that comprise I&I: Digital Library, Major Research Instrumentation, Research Resources, Science Resources Statistics, and Shared Cyberinfrastructure. The Digital Library element builds on work supported under the earlier multi-agency Digital Libraries Initiative. From these origins a decade ago this element has evolved into the National Science, Technology, Engineering, and Mathematics Education Digital Library program which seeks to create, develop, and sustain a national digital library supporting science, technology, engineering, and mathematics education (including pre-K to 12, undergraduate, graduate, and lifelong learning). The Major Research Instrumentation (MRI) element provides opportunities for researchers in organizations of higher education, research museums, and non-profit research organizations to obtain critical instrumentation to enable research and research training. The MRI element also supports research that advances development of new instrumentation leading to next-generation research and education tools. The Research Resources element consists of specialized instrumentation and infrastructure that are not included in any other NSF program. This element includes specialized instruments developed to address a specific research problem as well as small-sized national facilities. 
Science Resources Statistics, as stated in the legislative mandate of the NSF Act, provides "a central clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources and to provide a source of information for policy formulation by other agencies of the Federal Government." Shared Cyberinfrastructure provides researchers and educators with access to the world's highest-performance digital information technology and knowledge management resources, thus catalyzing discovery at the frontiers of science and engineering and addressing questions previously considered unapproachable because of their complexity or scope.

Evidence: Relevant information concerning the Infrastructure and Instrumentation (I&I) program purpose may be found in the NSF Strategic Plan FY 2003-2008 (pages 18-19, http://www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf). Additional information may be found in the National Science Foundation Act of 1950, 42 USC 1861 et seq., especially at "Functions of the Foundation" (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=browse_usc&docid=Cite:+42USC1862); also see the National Science Foundation Authorization Act, Public Law 107-368, Section 2 (http://www.nsf.gov/mps/ast/aaac/p_l_107-368_nsf_authorization_act_of_2002.pdf); the Science and Engineering Equal Opportunities Act, 42 USC 1885; and S&E Infrastructure for the 21st Century: The Role of the National Science Foundation, NSB 02-190 (http://www.nsf.gov/nsb/documents/2002/nsb02190/nsb02190.pdf).

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: NSF investments in I&I are driven by new and constantly emerging science and engineering needs. The I&I program addresses specific research, education, and workforce requirements at universities and colleges to advance U.S. capabilities required for world-class research and to prepare students for the future U.S. workforce. The ability of NSF researchers and educators to explore the frontiers of science, engineering, and education, to transform these fields, and to enable revolutionary technologies is directly coupled to the availability of necessary I&I. To determine and assess needs, various communities are included in workshops that help identify unmet needs; in the merit review process in making funding decisions; and by Advisory Committees and Committees of Visitors in assessing the strengths and weaknesses of existing activities. In all of the elements comprising I&I, concepts are evolving to encompass distributed systems including software, databases, telescience capabilities, and expert systems. Rapid advances in computing power, communications bandwidth, data storage, and distributed systems allow innovative collaborative and data intensive research styles and enable the broad community of individuals, groups, and organizations to advance science and engineering in ways that revolutionize what they can do, how they can do it, and who can participate.

Evidence: The need for investments in Infrastructure and Instrumentation (I&I) is addressed in Paragraphs 4, 5, and 6 of Sec. 2 of the NSF Authorization Act of 2002, P.L. 107-368 (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_bills&docid=f:h4664enr.txt.pdf): "(4) The research and education activities of the National Science Foundation promote the discovery, integration, dissemination, and application of new knowledge in service to society and prepare future generations of scientists, mathematicians, and engineers who will be necessary to ensure America's leadership in the global marketplace. (5) The National Science Foundation must be provided with sufficient resources to enable it to carry out its responsibilities to develop intellectual capital, strengthen the scientific infrastructure, integrate research and education, enhance the delivery of mathematics and science education in the United States, and improve the technological literacy of all people in the United States. (6) The emerging global economic, scientific, and technical environment challenges long-standing assumptions about domestic and international policy, requiring the National Science Foundation to play a more proactive role in sustaining the competitive advantage of the United States through superior research capabilities." Other information is found in the NSF FY 2007 Budget Request to Congress (http://www.nsf.gov/about/budget/fy2007) and the President's Information Technology Advisory Committee (PITAC) report, pp. 35-46, Chap. 4: Sustained Infrastructure for Discovery and Competitiveness (http://www.nitrd.gov/pitac/reports/20050609_computational/computational.pdf). Additional evidence may be found in the NSF FY 2003-2008 Strategic Plan, pages 18-19 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201), and the National Science Foundation Act of 1950, Functions of the Foundation (http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=browse_usc&docid=Cite:+42USC1862).
Also, recent reports such as that prepared by the National Science Board's (NSB) Taskforce on Science & Engineering Infrastructure (http://www.nsf.gov/nsb/documents/2003/start.htm), as well as Committee of Visitor (COV) reports, support NSF's role in I&I. COVs assess about one-third of NSF programs each year and review performance over the previous three years (http://www.nsf.gov/pubs/2006/nsf0601/pdf/nsf0601.pdf, Appendix 4a, Page IV-9). Evidence may also be found in S&E Infrastructure for the 21st Century: The Role of the National Science Foundation, NSB 02-190 (http://www.nsf.gov/nsb/documents/2002/nsb02190/nsb02190.pdf). Also see Long-Lived Digital Data Collections Enabling Research and Education in the 21st Century, NSB 05-40 (http://www.nsf.gov/pubs/2005/nsb0540/); Digital Libraries: Universal Access to Human Knowledge (http://www.nitrd.gov/pubs/pitac/pitac-dl-9feb01.pdf); and Revolutionizing Science and Engineering Through Cyber-Infrastructure (http://www.communitytechnology.org/nsf_ci_report/).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The I&I program is designed so that it is not duplicative of other Federal, state, local or private efforts. NSF is the only independent agency specifically charged with promoting science and engineering research and education in all science and engineering fields and disciplines. The NSF I&I program is designed in concert with other agencies in order to take advantage of complementary programs and to leverage the investments of all agencies. NSF's investments in infrastructure and instrumentation are coordinated with those of agencies with mutual interests and needs for research infrastructure, such as the Departments of Energy, Commerce, Education, and Interior, and NASA. Periodically, SRS conducts surveys of state-supported research and development activities. Within shared cyberinfrastructure assessments national committees such as the Presidential Information Technology Advisory Committee (PITAC) and NSF's Cyberinfrastructure Advisory Committee assess all available resources (public and private).

Evidence: Proposals to NSF programs must identify other agency funding/requests to ensure no unnecessary duplication. The agency has specific, statutory authority to evaluate the status and needs of the various sciences and engineering fields and to consider the results of this evaluation in correlating its research and educational programs with other Federal and non-Federal programs (http://www.nsf.gov/home/about/creation.htm and the National Science Foundation Act of 1950 under the "Functions of the Foundation (42 U.S.C. 1862, Sec 3.a)(5)" at http://assembler.law.cornell.edu/uscode/html/uscode42/usc_sec_42_00001862----000-.html or http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=browse_usc&docid=Cite:+42USC1862). NSF's proposed data collections on science and engineering are cleared through OMB's Office of Information and Regulatory Affairs to ensure that the collections are not redundant or duplicative (http://www.whitehouse.gov/omb/library/OMBINV.EXECUTIVE.html#NSF). See also http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf and http://www.nsf.gov/cise/sci/reports/atkins.pdf.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: No major design flaws have been identified. The extensive oversight and reviews that I&I receives ensure that its design is free of major flaws that would limit its effectiveness. NSF relies upon the competitive merit review process, NSF program officers, and Committees of Visitors (COVs) to ensure that I&I is effectively serving its intended communities, and to recommend changes to improve program effectiveness and efficiency. Merit review by peers has been recognized as a best practice for administering R&D programs. Oversight by COVs and other independent groups (e.g., Advisory Committees, National Science Board, National Academy of Sciences/National Research Council) provide additional scrutiny of the portfolio, program goals, and results.

Evidence: Evidence showing that I&I is free of major flaws that would limit effectiveness or efficiency may be found in the FY 2005 Performance and Accountability Report, pages II-75-81 (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); the FY 2005 Report of the National Science Board on the National Science Foundation's Merit Review System (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf); and Committee of Visitor Reports (list of all reports - http://www.nsf.gov/about/performance/advisory.jsp). Examples are the 2005 COV report for the Division of Astronomical Sciences - (http://www.nsf.gov/od/oia/activities/cov/mps/2005/ASTcov.pdf); the 2003 COV report to the Advisory Committee of the Geosciences Directorate (http://www.nsf.gov/geo/adgeo/advcomm/fy2003_cov/ATM_ULAFOS_2003_COV_report.doc); and NRC reports on improving SRS data collections (http://fermat.nap.edu/catalog/9775.html, http://fermat.nap.edu/catalog/11111.html).

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The I&I program is targeted to meet the needs of its intended beneficiaries. NSF's investments in I&I rely upon two mechanisms to ensure that the program is effectively targeted and that funding addresses the program's purpose directly. First, the solicitations for each activity contain a clear statement of the purpose in the context of the particular activity. Second, the merit review process ensures that funding is awarded to proposals that best address the activity's purpose. Activities of the SRS program are part of the agency's mission as prescribed in the NSF Act as Amended. The science, engineering, and education communities identify, through workshops and Advisory Committee (AC) reports, the specific research, education, and workforce needs. To ensure program focus, all announcements and solicitations contain clear and explicit statements of a program's purpose and context. All program announcements and solicitations are available online to ensure open access and to inform individuals who might be interested in responding. Targeted outreach is accomplished through MyNSF, an electronic communications system that alerts self-identified individuals of specific opportunities. NSF Program Officers also conduct outreach activities at professional conferences and during visits to academic institutions. NSF staff also participates in outreach at NSF's bi-annual Regional Grants Conferences. Other mechanisms used to ensure effective targeting include use of the World Wide Web and targeted e-mails. The merit review process ensures that funding is awarded to projects that best address the program's purpose. NSF's merit review processes for access to specific resources (e.g., requests for supercomputer time) also target resources so that results of investments will reach the intended beneficiaries. 
Due to the interdisciplinary nature of many I&I proposals, teams of program officers representing multiple directorates work together to create multidisciplinary panels with collective expertise to ensure scrutiny of all aspects of the proposed activity. In some cases, the National Science Board reviews recommended I&I award portfolios to ensure that they appropriately support NSF's mission. Once awards are made, Advisory Committees and Committees of Visitors review the impacts and effectiveness of I&I investments.

Evidence: Evidence demonstrating that I&I is effectively targeted can be found here: MyNSF, formerly the Custom News Service, allows viewers to receive notifications about new content posted on the NSF website. Notification can be received via email or RSS. This service alerts potential proposers of relevant solicitations and is available at http://www.nsf.gov/mynsf/. Examples of how an I&I solicitation targets a specific audience and addresses a specific purpose may be seen at http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=6186&org=EAR&from=home, a solicitation that deals with earth sciences instrumentation and facilities, and at http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5293&org=AST&from=fund, where the possibility of user access to observing time on telescopes of the U.S. Air Force is provided. COV reports include a specific section asking COV panelists to discuss a program's portfolio of awards. A list of all COV Reports is available at http://www.nsf.gov/about/performance/advisory.jsp; for examples, see the 2005 COV report for Major Research Instrumentation at http://www.nsf.gov/od/oia/activities/cov/cross/2005/MRIcov.pdf (pages 21-25) and the 2003 COV report for Shared Cyberinfrastructure at http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf (pages 8-11).

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The I&I program has specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program; these are listed in the 'Measures' tab. These are drawn from the objectives set forth in the NSF FY 2003-2008 Strategic Plan, and they enable the best new ideas generated by scientists and engineers working at the forefront of discovery. The I&I Program is a subset of the Tools Strategic Goal - providing "broadly accessible, state-of-the-art S&E facilities, tools, and other infrastructure that enable discovery, learning and innovation." This reflects the parts of NSF's mission directed at programs to strengthen scientific and engineering research potential by enabling access to, and development of, new or improved instrumentation and other shared resources, including scientific software and scientific databases, digital libraries of research and educational information, organisms used in research, and specialized facilities. These I&I elements serve well-defined communities of scientists and policy makers and provide capabilities, resources and capacity that serve the science, engineering, and education communities.

Evidence: Evidence of specific long-term performance measures that reflect the purpose of the program can be found in the Measures Tab; and in the "Tools" section of the FY 2007 NSF Budget Request (http://www.nsf.gov/about/budget/fy2007/pdf/fy2007.pdf; page 18). Additional information about the assessment of performance may be found in the NSF Strategic Plan FY 2003-2008, pages 27-29 (http://www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf). Some activities within the I&I program require tracking of specific performance measures unique to the activity. Examples include projects supported through the BIO Living Stocks Program (see http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=9189&org=DBI).

YES 9%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program has ambitious targets and timeframes for its long-term measures. These include verifiable qualitative assessments by external experts assessing NSF's performance with respect to its long-term goals for Tools. Some projects within the I&I portfolio enter into multiple partnerships to support and enable development of infrastructure and instrumentation. The assessment by external experts ensures that the goals and timeframes for these activities are appropriately ambitious and that they promote continuous improvement. The primary mechanisms for external evaluation are the annual Advisory Committee for GPRA Performance Assessment (AC/GPA) and the Committee of Visitors (COV) process. Other external guidance comes from workshops, Directorate Advisory Committees, and Principal Investigator (PI) meetings.

Evidence: Ambitious targets and timeframes for long-term measures can be found in the Measures Tab; FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210; page 36); and the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0601; page II-53).

YES 9%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The I&I program has specific annual performance measures, shown in the Measures Tab, which can demonstrate progress toward achieving the program's long-term goals and the agency's strategic goals. These are efficiency and broadening participation measures that are directly related to I&I long-term goals of assisting a broad science, engineering and education community to gain access to state-of-the-art technology and communication capabilities.

Evidence: Specific annual performance measures demonstrating progress toward achieving long-term goals may be found in the Measures Tab. In FY 2005, committees of external experts determined that NSF had demonstrated significant achievement for all of the annual performance indicators for the TOOLS goal, which includes infrastructure and instrumentation; FY 2005 AC/GPA Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210; page 36); Division of Biological Infrastructure COV Report (http://www.nsf.gov/od/oia/activities/cov/bio/2004/Response_DBI2004COVRpt.pdf); and Major Research Instrumentation COV Report FY 2005 (http://www.nsf.gov/od/oia/activities/cov/covs.jsp).

YES 9%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The I&I Program has baselines and ambitious targets for its annual measures. Baselines and targets are set with the prospect of significant achievement in the discovery across the frontier of S&E, connected to learning, innovation and service to society. Baselines for the annual performance measures are obtained from internal NSF sources and I&I Programs annual reporting data sources. Ambitious targets for its annual measures are shown in the Measures Tab.

Evidence: Performance measures can be found in the Measures Tab. Additional information may be accessed on the Performance Assessment Information website, (http://www.nsf.gov/about/performance/). Annual AC/GPA reports (http://www.nsf.gov/about/performance/acgpa/index.jsp); and the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par) provide specific information. The COV reports provide information about the performance of individual programs (http://www.nsf.gov/od/oia/activities/cov/covs.jsp).

YES 9%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The I&I program obtains ongoing commitment toward its goals from all its partners by aligning program descriptions and solicitations with those goals, by using the merit review process to select proposals that demonstrate commitment to these goals, and by requiring grantees to submit satisfactory annual and final progress reports. NSF program officer approval of these reports is a prerequisite for continuation and/or renewal of support for both current and future projects. To receive further support (subsequent awards), all applicants must include in their new proposals a report on the results of previous NSF support, which is then considered by reviewers during the merit review of the proposal. No subsequent awards can be made to an applicant unless a Program Officer has approved the final progress reports for all previous awards.

Evidence: Evidence of commitment to annual and long-term goals may be found in annual and final project reports and in the Grant General Conditions (http://www.nsf.gov/pubs/gc1/gc1_605.pdf). Numerous program announcements are issued annually by various science and engineering disciplines throughout the Foundation (http://www.nsf.gov/funding/). All program announcements reference NSF's two merit review criteria (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm), which are aligned with the I&I goals. For example, see the MRI solicitation (http://www.nsf.gov/pubs/2005/nsf05515/nsf05515.pdf): "NSF particularly encourages collaborations between disciplinary scientists who are knowledgeable in unique instrumentation areas and private sector experts in the area of instrument manufacture. Working together within a framework of concurrent engineering, such partnerships have the potential to create new products with wide scientific and commercial impact." Some specific activities and projects are governed by Interagency MOUs and in some cases, cooperative agreements between agencies and performers that specify annual and long-term goals. Typically agency representatives meet formally or informally annually to assess progress described in required reports. Examples include the interagency MOU governing support for a macromolecular structure database and the cooperative agreement for the Protein Data Bank. (internal documents). Other evidence may be found in the Report to the NSB on the NSF Merit Review Process - FY 2005 (http://www.nsf.gov/nsb/documents/2006/0306/merit_review.pdf).

YES 9%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: Independent evaluations of I&I program component activities are conducted regularly by Committees of Visitors (COVs), the Advisory Committee for GPRA Performance Assessment (AC/GPA), Directorate and other Advisory Committees (ACs), and, on an as-needed basis, by other independent entities to inform and support program improvements, evaluate effectiveness and relevance, and influence program planning. Directorate ACs generally meet twice a year to review Directorate performance and COV reports. The AC/GPA meets annually to determine whether the breakthroughs, advances and other results reported from I&I demonstrate significant achievement towards long-term outcome goals. Each activity at NSF, including the I&I component activities, is reviewed on a triennial cycle by a COV. NSF's approach to evaluation was recently highlighted by GAO as an "evaluation culture: a commitment to self-examination, data quality, analytic expertise, and collaborative partnerships." I&I sponsors workshops and Principal Investigator meetings that help define needs and opportunities, in order to be responsive to the NSF mission. NSF staff and external experts conduct site visits for major activities. NSF also supports focused assessments by the National Research Council and other organizations, as appropriate. The results of all these activities inform NSF senior management and contribute to development of plans for the agency.

Evidence: Independent evaluations are critical to NSF performance assessment, as evidenced by the GAO report Program Evaluation: An Evaluation Culture and Collaborative Partnerships Help Build Agency Capacity, GAO-03-454, May 2, 2003 (http://www.gao.gov/new.items/d03454.pdf). Examples of independent evaluations include Advisory Committee reports (for example, http://www.nsf.gov/geo/advisory.jsp and http://www.nsf.gov/geo/acgeo_cov.jsp); the Advisory Committee for GPRA Performance Assessment (AC/GPA) Report; Committee of Visitors reports (list of all reports and responses - http://www.nsf.gov/about/performance/advisory.jsp; 2005 COV report for the Division of Astronomical Sciences - http://www.nsf.gov/od/oia/activities/cov/mps/2005/ASTcov.pdf); and external reviews, community workshops, and annual site visits. Specific activities require supported projects to have external advisory committees that meet on an annual basis to assess progress and plans, and to recommend changes where necessary. Summaries of reports and of project directors' responses are provided to NSF. More specific examples of independent evaluations include projects in the BIO Living Stocks Program, each of which is required to have an external advisory committee (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=9189&org=DBI); and NRC reports on improving SRS data collections (http://fermat.nap.edu/catalog/9775.html and http://fermat.nap.edu/catalog/11111.html).

YES 20%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: NSF's performance structure provides the underlying framework for NSF's annual budget request. Each major NSF organization - each directorate - ties its budget directly to NSF's performance framework. This is further documented in the performance chapter of the NSF Budget Request to Congress. To present resource needs completely and transparently, the NSF budget displays resource requests by structural component and by performance goal. This presentation is based on consultations over the past year with key Congressional and OMB staff, and it also incorporates recommendations from the 2004 report on NSF by the National Academy of Public Administration. The purpose of this presentation is to highlight the matrix structure that NSF employs, with the major organizational units each contributing to the goals and investment categories established in the NSF Strategic Plan. This revised presentation contains additional information on the portfolio of investments maintained across NSF.

Evidence: The FY 2007 NSF Budget Request to Congress, as well as previous budget requests, indicates the long-term goals of the components of the I&I program and presents the resources needed in a complete and transparent manner (http://www.nsf.gov/about/budget/fy2007/). Full budgetary costing for Tools is included in the FY 2007 Budget Request to Congress. Additional information may be found in the NSF Strategic Plan FY 2003-2008 (http://www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf) and the NSF Budget Cost Performance Integration Plan (internal document).

YES 9%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: No major strategic planning deficiencies have been identified. Although strategic planning deficiencies have not been noted in the components of the I&I Program, the COV and AC processes provide valuable constructive feedback on an ongoing basis concerning areas where strategic planning can be strengthened, and in response the Foundation takes steps to address those issues. Also, NSF solicits public feedback on the agency's goals and strategic planning processes as part of each independent (external) assessment of agency activities. Steps to address weaknesses are identified and implemented. Reports on updates to implementation of COV recommendations are required on an annual basis and these are posted on the NSF web site.

Evidence: The program implements recommendations from outside evaluations as evidenced in the Committee of Visitors reports and NSF responses. These include: COV Reports (list of all reports and responses - http://www.nsf.gov/about/performance/advisory.jsp); for example, the 2005 COV report for the Division of Astronomical Sciences (http://www.nsf.gov/od/oia/activities/cov/mps/2005/ASTcov.pdf); the 2005 COV report on Division of Shared Cyberinfrastructure (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf and the response http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIresponse.pdf). In addition, there are Directorate Advisory Committee minutes (for example: http://www.nsf.gov/bio/advisory.jsp; http://www.nsf.gov/geo/advisory.jsp, http://www.nsf.gov/cise/advisory.jsp); and the NSF Strategic Plan FY 2003 - FY 2008 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201) and NSF's FY 2005 Performance and Accountability Report, pages I9-I10 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf0601).

YES 9%
2.RD1

If applicable, does the program assess and compare the potential benefits of efforts within the program and (if relevant) to other efforts in other programs that have similar goals?

Explanation: NSF's investments in Infrastructure and Instrumentation address many national STEM research needs that are not under the purview of the more mission-specific federal, state or local agencies. No other program of this scope exists in the Federal Government. In areas where research activities may overlap, NSF coordinates its activities with those of other agencies to avoid duplicating efforts and to ensure that each agency supports those efforts most appropriate to its mission. I&I comprises the broad, core set of activities that enable progress in the array of fields needed for the U.S. to maintain leadership in science, engineering and education. The Office of Science and Technology Policy, the National Science and Technology Council, the National Science Board, and other policy-making bodies regularly review NSF's investments in I&I in the context of the overall Federal investment in science and engineering. Prior to initiating support for new activities, workshops and external reviews are typically conducted to ensure that scientific opportunities justify the expenditure, and that supporting the acquisition and operation of infrastructure is the most efficient method of facilitating the science in question. NSF senior management reviews and compares the opportunities presented by competing projects, selects among them, and forwards selections to the NSB for subsequent review and approval. Interagency and international agreements and understandings are active and on file for most projects, demonstrating NSF's commitment to non-duplication and to efficient and effective coordination of efforts.

Evidence: Evidence of coordination includes the High Energy Physics Advisory Panel (which coordinates physics activities between NSF and DOE); the Astronomy and Astrophysics Advisory Committee (which coordinates NASA, NSF, and DOE activities in the space sciences); and Ocean.US (http://www.ocean.us/). Also relevant, for example, is the Astronomy and Astrophysics Decadal Survey, "Astronomy and Astrophysics in the New Millennium" (http://www.nap.edu/books/0309070317/html/).

YES 9%
2.RD2

Does the program use a prioritization process to guide budget requests and funding decisions?

Explanation: In formulating budget requests for the various activities within I&I, NSF employs a prioritization process that develops both NSF's overall highest priorities and priorities for the various component activities within I&I. As an early step in the process, each of the activities within the I&I program provides input to senior management with respect to past performance and future needs. Senior management integrates that information, prioritizes budget requests within and among programs, and determines funding levels, all of which are reviewed by the National Science Board. Factors considered in developing priorities for component activities include: NSF's highest funding priorities (listed in the FY 2007 Budget Request - especially strengthening the core and addressing major national challenges identified by the Administration); needs and opportunities identified by Committees of Visitors and the Advisory Committees; new frontiers and topics of major impact that are identified by the scientific community, e.g., through workshops; and important emerging areas in which NSF receives significant numbers of highly ranked proposals. In reaching funding decisions, the program relies on the external merit review system to aid in the prioritization of proposals. The external merit review system includes review of proposals by expert panelists with backgrounds that match the core constituencies of the program. These reviewers assess proposals according to the Merit Review Criteria: intellectual merit and broader impacts. Program directors consider the advice of reviewers, NSF's core strategies, and the desire to maintain a diverse portfolio of awards in making funding decisions.

Evidence: Relevant information regarding the prioritization process may be found in the NSF Budget Requests to Congress and the NSF Strategic Plan FY 2003-2008 (http://www.nsf.gov/od/gpra/Strategic_Plan/FY2003-2008.pdf). Additional information regarding funding decisions may be found in the Grant Proposal Guide (www.nsf.gov/pubsys/ods/getpub.cfm?gpg). Examples of documentation include National Academy studies, the High Energy Physics Advisory Panel reports, and the NAS Decadal Review of Astronomy, Astronomy and Astrophysics in the New Millennium (2001) (http://www.nap.edu/books/0309070317/html/). For the atmospheric sciences, evidence includes the National Academy of Sciences study, "Strategic Guidance for NSF's Support of the Atmospheric Sciences" (http://www4.nas.edu/webcr.nsf/ProjectScopeDisplay/BASC-U-03-03-A?OpenDocument). Additional evidence demonstrating a prioritization process to guide budget requests and funding decisions may be found in the NSF Strategic Plan FY 2003 - FY 2008 (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); the Congressional Budget Justification (http://www.nsf.gov/about/budget/fy2007/; Overview, p. 1); COV reports (http://www.nsf.gov/od/oia/activities/cov/); National Science Board reports, minutes and agendas (http://www.nsf.gov/nsb/); and, for funding decisions, the Grant Proposal Guide (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg).

YES 9%
Section 2 - Strategic Planning Score 100%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: NSF programs collect timely high-quality performance data relating to key program goals, and they use this information to adjust program priorities, make decisions on resource allocations, and make other adjustments in management actions. Performance information for all programs, including those in I&I, is collected from NSF grant recipients via interim, annual, and final project reports. Site visits to larger projects are also used to collect performance information. Committee of Visitors (COV) reviews and recommendations are utilized to improve program performance. All of these assessments impact management practices. The information is shared with and reviewed by key program partners within NSF as well as with programs at other agencies, when relevant and appropriate.

Evidence: Evidence relating to the use of credible performance information may be found in Committee of Visitors reports for all NSF Directorates and responses at http://www.nsf.gov/od/oia/activities/cov/, in the 2006 Committee of Visitors report for the Division of Science Resources Statistics (available on the NSF website in April 2006), in the minutes of Directorate Advisory Committees (e.g., see http://www.nsf.gov/bio/advisory.jsp and http://www.nsf.gov/attachments/102806/public/MPSAC-Minutes-April-2005-Approved.pdf), and in annual reports of the Advisory Committee for GPRA Performance Assessment (http://www.nsf.gov/about/performance/reports.jsp). Data are collected through annual, interim, and final project reports (internal documents) and through an internal database. For example, with respect to Shared Cyberinfrastructure, evidence relating to the use of credible performance information can be found in the 2005 Committee of Visitors Report for the Division of Shared Cyberinfrastructure (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf). There was also an extensive review, held in January 2006, of the Extensible Terascale Facility (TeraGrid) and of core funding of the National Center for Supercomputing Applications and the San Diego Supercomputer Center; the review documents are also available.

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Federal managers and program partners are held accountable for cost, schedule and performance results. NSF employees/program officers are held accountable through annual performance evaluations and COV reviews. NSF grantees must meet annual and final reporting requirements as well as financial record-keeping requirements. Other program partners are held accountable through cooperative agreements or contracts. NSF staff monitor grantee performance (including cost, schedule and technical performance) and take corrective action as necessary. In requesting further support, applicants must discuss the results of previous NSF support in their new proposals; such past performance is considered in the merit review process. Subgrantees are similarly held accountable by grantees and contractors. Contracting Officer's Technical Representatives (COTRs) account for the cost, schedule and performance of NSF contractors that perform several large surveys for NSF.

Evidence: Evidence demonstrating that federal managers and program partners are accountable for cost, schedule and performance results may be found in COV reports (http://www.nsf.gov/od/oia/activities/cov/) such as the 2005 Shared Cyberinfrastructure COV report and in the 2006 review report of the TeraGrid and Core cyberinfrastructure centers; Awardee project reports; annual evaluations by external advisory committees of large or long-term projects; interim evaluations of long term projects by external site visit teams; NSF Grant General Conditions (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf98gc1a); Federal Cash Transaction Reports; and annual performance evaluations of NSF staff/program officers.

YES 10%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: NSF routinely obligates its funds in a timely manner, obligating over 99 percent of this support within the first year appropriated. Funds for incremental funding of continuing grants are made available at the beginning of each fiscal year; however, each funding increment requires the approval of an annual progress report by the program officer. NSF's grant and contract monitoring activities assure that the funds are used for their intended purpose. NSF also has pre- and post-award internal controls to reduce the risk of improper payments. Beginning in FY 2004, NSF incorporated erroneous-payments testing of awardees into the on-site monitoring program.

Evidence: Data and information demonstrating that funds are obligated in a timely manner and spent for the intended purpose are included in the PricewaterhouseCoopers NSF FY 2001 Risk Assessment for Erroneous Payments; data on NSF carryover, presented in the NSF Budget Request to Congress (http://www.nsf.gov/about/budget/); the Risk Assessment and Award Monitoring Guide; clean opinions on financial statements for the past 7 years; and Federal Cash Transaction Reports.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: NSF has procedures to measure and achieve efficiencies and cost effectiveness in program execution. NSF continues to be a leader in the use of information technology to advance the agency mission. NSF's use of IT results in more timely and efficient processing of proposals. NSF has established a time-to-decision goal to ensure that Principal Investigators who apply for grants are given a decision on funding within a reasonable amount of time. Monthly reports on progress are sent to managers, and results are available to all staff through the agency's Enterprise Information System. NSF uses letters of intent and coordination of program deadlines to improve the efficiency of proposal receipt and processing. Several I&I activities limit the number of proposals that can be submitted by a single institution. This tends to strengthen submitted proposals while relieving administrative burden on NSF. In addition, NSF competitively sources awards for large data collections via regular open competitions, thus leading to cost effectiveness in these programs.

Evidence: Evidence of procedures to measure and achieve cost effectiveness may be found in the Measures Tab. Data and information on time-to-decision are located on pages I-20 and II-89 in the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). Data and information on organizational excellence is located on page II-75 in FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par). Other evidence may be found in COV reports (http://www.nsf.gov/od/oia/activities/cov/); the NSF FY 2003-2008 Strategic Plan (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf04201); NSF Grant Proposal Guide (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg); and program solicitations (http://www.nsf.gov/funding/). Evidence relating to the procedures used to ensure efficiencies and cost effectiveness can be found in the cited 2005 Shared Cyberinfrastructure COV report.

YES 10%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The elements within I&I promote partnerships, including collaboration with other agencies, industry, and national laboratories for projects of mutual interest and international collaboration. NSF regularly shares information with other agencies and participates in coordination activities through OSTP, the National Science and Technology Council, and other interagency organizations. In addition to these external activities, mechanisms are established for joint funding between NSF Directorates. Policy guidance provided by the National Science Board incorporates perspectives from related programs and investments.

Evidence: Data and information demonstrating that the program collaborates effectively with related programs are included in management plans, internal administrative manuals, program solicitations (http://www.nsf.gov/funding/), and award documents. Information on the STEM Digital Libraries is available at http://nsdl.org/. Information on the Major Research Instrumentation program is available at http://www.nsf.gov/od/oia/programs/mri/. Information on Shared Cyberinfrastructure is available at http://www.nsf.gov/dir/index.jsp?org=OCI. Information on the Division of Science Resources Statistics is available at http://www.nsf.gov/statistics/. Examples of interagency joint grant announcements include: the NSF/DOE Partnership in Basic Plasma Science and Engineering (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=5602&org=ATM&from=home); the Interagency Coordinating Committee for Airborne Geosciences Research and Applications (http://www.nsf.gov/geo/atm/ulafos/laof/charter.jsp); Ocean Technology and Interdisciplinary Coordination (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=12724&org=OCE); Global Learning and Observations to Benefit the Environment (GLOBE) - the Integrated Earth Systems Science Program (IESSP) (http://www.nsf.gov/pubs/2006/nsf06515/nsf06515.htm); and Facilities to Empower Geosciences Discovery 2004-2008 (NSF 03-053) (http://www.nsf.gov/geo/facilities/). NSF is a party to an interagency MOU signed by NSF, NIH, and DOE for the Protein Data Bank (http://www.pdb.org/). The Division of Science Resources Statistics (SRS) is one of 14 principal federal statistical agencies making up the Interagency Council on Statistical Policy, which coordinates federal statistical activities. NSF supports the Council's FedStats portal for Federal statistics from many agencies (http://www.fedstats.gov/agencies). Many of NSF's data collection activities are coordinated with and co-funded by other agencies, e.g., http://www.norc.uchicago.edu/issues/sed-2004.pdf.
NSF data collection activities follow policy guidance developed for related programs, http://www.whitehouse.gov/omb/inforeg/pmc_survey_guidance_2006.pdf.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: NSF uses strong financial management practices at the agency level that are reflected at the program level. NSF was the first federal agency to receive a 'green light' for financial management on the President's Management Agenda scorecard. NSF prepares annual financial statements (including Balance Sheet, Statement of Net Cost, Statement of Changes in Net Position, Statement of Budgetary Resources, and Statement of Financing). Supplementary statements are also prepared (including Budgetary Resources by Major Accounts, Intragovernmental Balances, Deferred Maintenance, and Stewardship Investments) in conformity with generally accepted accounting principles in the U.S. These are subjected to independent audits. NSF has received a clean opinion on its financial audits for the last eight years.

Evidence: Evidence of NSF's strong financial management practices may be found in the President's Management Agenda, in the results of NSF financial audits, and in the annual Performance and Accountability Reports (http://www.nsf.gov/pubsys/ods/getpub.cfm?par). Also see the Executive Branch Management Scorecard (http://www.whitehouse.gov/results/agenda/scorecard.html) and performance and management assessments (http://www.whitehouse.gov/omb/).

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: No management deficiencies have been identified. The program has processes in place such as Committee of Visitors reports and NSF responses to these reports. External Committees of Visitors (COVs) regularly provide feedback on programmatic and management-related concerns. COVs assess the integrity and efficiency of the processes for proposal review and address management issues in response to a series of questions asked of all COVs. NSF staff, in turn, respond to any management deficiencies identified by the COV through an agency response that outlines the steps the agency will take to address the issues. The cognizant directorate Advisory Committee reviews the NSF response, as well as the initial COV report.

Evidence: For evidence, see COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/covs.jsp) and NSF Inspector General reports (http://www.nsf.gov/oig/pubs.jsp). See also the COV reports and responses for Major Research Instrumentation and Shared Cyberinfrastructure: http://www.nsf.gov/od/oia/activities/cov/cross/2005/MRIcov.pdf, page 25; http://www.nsf.gov/od/oia/activities/cov/cross/2005/MRIresponse.pdf, page 1; http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf, page 11; and http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIresponse.pdf, page 4.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: All program activities rely upon NSF's competitive, merit review process that includes external peer evaluation. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF program officer, and usually by 3-10 other persons outside NSF who are experts in the particular field represented by the proposal. All such activities are reviewed by NSF's Committees of Visitors. Competitive merit review, with peer evaluation, is NSF's accepted method for informing its proposal decision process. The NSB-approved criteria address the "Intellectual Merit" and the "Broader Impacts" of the proposed effort. Some solicitations contain additional criteria that address specific programmatic objectives. NOTE: The weight of this question has been increased to reflect the relative importance of merit review in verifying the relevance, quality, and performance of NSF's investments.

Evidence: Evidence demonstrating that grants are awarded through a clear competitive process is included in the NSF FY 2004 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); the NSB Policy on Recompetition (http://www.nsf.gov/nsb/documents/1997/nsb97224/nsb97224.txt); the Report to the NSB on the NSF Merit Review Process - FY 2004 (http://www.nsf.gov/nsb/documents/2005/MRreport_2004.pdf); the NSF Merit Review Criteria (http://www.nsf.gov/pubs/1999/nsf99172/nsf99172.htm); COV reports and NSF responses (http://www.nsf.gov/od/oia/activities/cov/); and the annual Performance and Accountability Reports (http://www.nsf.gov/pubsys/ods/getpub.cfm?par).

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: NSF uses an array of oversight practices including appropriate grant mechanisms, site visits, project reports, an IT-enabled grants management system, and a risk-based monitoring program that provide sufficient knowledge of grantee activities. The NSF merit review process provides a high degree of assurance that awardees are technically qualified and have resources available to successfully undertake the work they have proposed. At the award phase, the most appropriate funding instrument is determined, as are terms and conditions that ensure appropriate reporting on progress and expenditure of funds. NSF's award oversight is tailored for each award, and could consist of any combination of the following: regular reports from grantees on progress, desk-reviews, site visits, meetings with project staff, and interim reviews by special panels. Continuing support (i.e., continuing grant increments) is based upon required template-based annual progress reports submitted by grantees that are subject to review and approval by NSF Program Officers before additional funds are released. NSF has a formal Award Monitoring and Business Assistance Program based on a financial and administrative risk assessment of NSF awardee institutions.

Evidence: Data and information demonstrating sufficient oversight practices are included in COV reports (http://www.nsf.gov/od/oia/activities/cov/); awardee project reports; Report to the NSB on the NSF Merit Review Process - FY 2004 (http://www.nsf.gov/nsb/documents/2005/MRreport_2004.pdf); Risk Assessment and Award Monitoring Guide; President's Management Agenda (PMA) Scorecard for Financial Management (http://www.whitehouse.gov/results/agenda/scorecard.html); NSF's clean audit opinions; site visit reports; trip reports from attendance at professional meetings; and workshops and grantee meetings as presented in the annual Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par).

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The program collects grantee performance data on an annual basis and makes it available to the public in a transparent and meaningful manner. NSF Grant General Conditions require that results of NSF-supported projects be published in the open literature and that NSF support be appropriately referenced/cited. NSF collects significant results from NSF-supported projects and makes them available on the web in terms that the public can understand through the Discoveries area of the NSF web site; press releases; annual budget requests to Congress; the Performance and Accountability Report; an annual brochure on Performance Highlights; and the Report of the Advisory Committee for GPRA Performance Assessment. Information is also made available to the public on the numbers of proposals and numbers of awards as well as, for each award, the name of the principal investigator, the awardee institution, the amount of the award, and an abstract of the project. The Budget Internet Information System (BIIS) contains extensive information on awards and funding trends.

Evidence: Evidence demonstrating that performance data are collected from grantees includes the NSF Discoveries web site (http://www.nsf.gov/discoveries/); news releases (http://www.nsf.gov/news/news_list.cfm?nt=2); the FY 2005 Performance and Accountability Report (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=par); AC/GPA annual reports (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05210); FY 2005 Performance Reports and Highlights (http://www.nsf.gov/about/performance/reports.jsp); the FY 2007 Budget Request (http://www.nsf.gov/about/budget/); NSF Grant General Conditions (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf98gc1a); highlights of annual meetings/grantees meetings and workshops; and the awards database (http://www.nsf.gov/awardsearch/).

YES 10%
3.RD1

For R&D programs other than competitive grants programs, does the program allocate funds and use management processes that maintain program quality?

Explanation: NSF programs are administered as competitive grant programs, with the exception of SRS, which uses other mechanisms. Most SRS surveys are competitively sourced, but SRS also conducts several surveys and analytical projects through interagency agreements. SRS staff on these projects have special status to conduct extensive quality reviews of these highly confidential data collections and analytical procedures.

Evidence:

NA 0%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The program has demonstrated adequate progress in achieving its long-term performance goals, as shown on the measures tab. NSF relies on external evaluation to validate its progress. Input is derived from Committees of Visitors, Principal Investigator annual and final project reports, and summaries of substantial outcomes ("nuggets") from funded research. In addition, since FY 2002 the NSF Advisory Committee for GPRA Performance Assessment (AC/GPA) has been aggregating this input for its assessment. The AC/GPA has determined that the accomplishments in all the indicators associated with the Tools strategic goal (of which I&I is a major subset) demonstrate "significant achievement." The I&I program expands opportunities for U.S. researchers, educators, and students at all levels to access state-of-the-art S&E tools, databases, and other infrastructure. The program supports research that advances instrument technology and leads to the development of next-generation research and education tools. The program also provides for the collection and analysis of data on the scientific and technical resources of the U.S. and other nations to inform policy formulation and resource allocation.

Evidence: Evidence demonstrating progress in meeting the Infrastructure and Instrumentation program's long-term goals may be found in: the Advisory Committee for GPRA Performance Assessment Report (http://www.nsf.gov/pubs/2005/nsf05210/nsf05210.pdf); the June 2005 Committee of Visitors Report for the Shared Cyberinfrastructure Division (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf); the annual Performance and Accountability Reports, pages II-51ff (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); and annual and final project reports.

YES 20%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program achieves its annual performance goals. NSF is increasing the number and diversity of organizations with access to advanced instrumentation and infrastructure, as well as the number of proposals from new investigators. NSF has established a time-to-decision (dwell time) goal to ensure that grant applicants receive a funding decision within a reasonable amount of time. Since 2002, NSF has exceeded its goal of informing applicants of funding decisions for at least 70 percent of proposals "within six months of deadline or target date, or receipt date, whichever is later."

Evidence: Evidence demonstrating progress in meeting the Infrastructure and Instrumentation program's annual goals may be found in the Measures Tab.

SMALL EXTENT 7%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: NSF continues to improve the efficiency and cost effectiveness of agency operations through the increasing use of electronic business systems. It is a leader in the vigorous and dynamic use of information technology to advance the agency mission. NSF continues to improve operational efficiency through electronic systems, including interactive electronic panels, online reviews, and online award processing. NSF builds on the past successes of FastLane, which accepts proposals electronically from the community it serves and supports NSF and awardee practices such as university/NSF business processes. NSF continues these improvements, notably with the Electronic Jacket (eJ), which supports merit review, provides real-time access to proposal and award information, and enables "shared-work" processes for interdisciplinary proposals involving more than one organization. For example, proposal declination is fully electronic. eJ has been an essential enabler for NSF in maintaining dwell time results in the face of increased numbers of proposals. NSF's Business and Operations Advisory Committee has rated NSF as successful in developing and using "new and emerging technologies for business application." NSF received green ratings in e-Government in 2002, 2003, 2004, and 2005; these ratings indicate current success and positioning for future success in government-wide electronic business methods. Independent reviews by COVs and other external groups provide additional scrutiny of portfolio and program goals, ensuring effectiveness and operational efficiency. Where appropriate, the number of proposals accepted from a single institution is limited, leading to stronger proposals coming to the Foundation, fewer proposals to process, higher success rates, and greater interdisciplinary collaboration within submitting universities.
Information technology improvements have eliminated grantee mailing costs, significantly reduced printing costs, and permitted more timely and efficient processing of proposals.

Evidence: The I&I Program processed over 6,700 proposals in FY 2003-2005 while meeting or exceeding NSF efficiency goals in areas such as proposal dwell time in two of those three years (71 percent, 69 percent, and 73 percent, respectively; see the Annual Efficiency Measure). Although the number of I&I proposals increased by approximately 10 percent over that period, these results were achieved without significant staff increases. Evidence demonstrating progress in meeting the Infrastructure and Instrumentation program's long-term goals may be found in: the Advisory Committee for GPRA Performance Assessment Report (http://www.nsf.gov/pubs/2005/nsf05210/nsf05210.pdf); the June 2005 Committee of Visitors Report for the Shared Cyberinfrastructure Division (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf); the annual Performance and Accountability Reports, pages II-51ff (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); and annual and final project reports. The President's Management Agenda Scorecard is available at http://www.whitehouse.gov/results/agenda/scorecard.html.

LARGE EXTENT 13%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The I&I program compares favorably to other programs in government and in the private and public sectors. Because of the quality, relevance, and performance of NSF programs, other Federal agencies and foreign governments often emulate aspects of NSF investments and processes. NSF uses external expert review, through Committees of Visitors (COVs) and Advisory Committees (ACs), to ensure the continued high quality of the distinct disciplinary components that make up I&I, and to assess program performance in light of reviewers' knowledge of programs throughout the government and the private and public sectors. These COVs and ACs include leaders from the private sector, from other government agencies, from the university community, and from other countries. In addition, NSF is the only Federal agency charged with promoting the progress of science and engineering research and education across all fields and disciplines. NSF's investments in I&I thus provide a major source of Federal support enabling research at universities and other institutions in many disciplines. No other entity, government or private, addresses this far-reaching purpose. NSF's investment in I&I also addresses unique national interdisciplinary research and education needs that are not addressed by the mission agencies. NSF maintains standing committees and other mechanisms, with representation from other organizations, to ensure coordination, communication, and the effective implementation of best practices across programs. NSF activities often generate a nationwide and even global response in support of program goals.

Evidence: Evidence demonstrating progress in meeting the Infrastructure and Instrumentation program's long-term goals may be found in: the Advisory Committee for GPRA Performance Assessment Report (http://www.nsf.gov/pubs/2005/nsf05210/nsf05210.pdf); the June 2005 Committee of Visitors Report for the Shared Cyberinfrastructure Division (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf); the annual Performance and Accountability Reports, pages II-51ff (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); and annual and final project reports. Science and Engineering Indicators (http://www.nsf.gov/statistics/seind06/), the statutory, biennial series of statistics produced by NSF/NSB, is the Nation's premier source of national and international data on science and engineering resources and has served as the model for similar indicator series around the world. NSF's support of ground-based astronomy was discussed in the National Academy of Sciences report entitled "U.S. Astronomy and Astrophysics: Managing an Integrated Program" (http://fermat.nap.edu/catalog/10190.html).

YES 20%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Independent reviews by Committees of Visitors (COVs), by other external groups (e.g., Advisory Committees (ACs), the National Academies (NAS/NAE/NRC), and PCAST), and by the National Science Board provide additional scrutiny of portfolio and program goals and results. The Advisory Committee for GPRA Performance Assessment (AC/GPA) report noted that NSF demonstrated "significant achievement" with respect to its FY 2005 GPRA Strategic Outcome Goals for Tools (which includes I&I). Assessment by COVs of the "overall quality of the research and education projects supported" by the programs in I&I is a continuing measure. These independent reviews find that programs are effective and achieve results. In particular, the most recent evaluation that covered the entire Tools goal, incorporating the I&I portfolio, was the 2005 meeting of the AC/GPA. The AC/GPA wrote that NSF "... demonstrated significant achievement for all indicators ..." for the Tools goal, and added, "NSF supports and provides a wide variety of accessible, state-of-the-art science and education facilities, tools and infrastructure, and in most cases is the only support for such instrumentation in academia." In reaching this determination, the AC specifically considered indicators that matched the objectives used here for I&I.

Evidence: Evidence demonstrating progress in meeting the Infrastructure and Instrumentation program's long-term goals may be found in: the Advisory Committee for GPRA Performance Assessment Report (http://www.nsf.gov/pubs/2005/nsf05210/nsf05210.pdf); the June 2005 Committee of Visitors Report for the Shared Cyberinfrastructure Division (http://www.nsf.gov/od/oia/activities/cov/cise/2005/SCIcov.pdf); the annual Performance and Accountability Reports, pages II-51ff (http://www.nsf.gov/pubsys/ods/getpub.cfm?par); and "Improving the Design of the Scientists and Engineers Statistical Data System (SESTAT)," Committee to Review the 2000 Decade Design of the Scientists and Engineers Statistical Data System (SESTAT), National Research Council.

YES 20%
Section 4 - Program Results/Accountability Score 80%


Last updated: 09062008.2006SPR