ExpectMore.gov


Detailed Information on the
National Center for Education Statistics Assessment

Program Code 10000196
Program Title National Center for Education Statistics
Department Name Department of Education
Agency/Bureau Name Institute of Education Sciences
Program Type(s) Research and Development Program
Assessment Year 2003
Assessment Rating Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 89%
Program Management 60%
Program Results/Accountability 100%
Program Funding Level
(in millions)
FY2007 $90
FY2008 $88
FY2009 $105

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Reporting data on the progress of improving the timeliness of the release of survey data.

Action taken, but not completed NCES has established an efficiency measure that will track the time to release information from surveys and will report performance against the measure annually. NCES also collects and reports data on customer satisfaction with the timeliness of data files and publications.
2008

Organizing an independent review through the National Institute of Statistical Sciences of the proposal to align the Beginning Postsecondary Student (BPS) Longitudinal Study with the High School Longitudinal Study of 2009.

Action taken, but not completed
2007

Reviewing two surveys to ensure that they are addressing current policy concerns, eliminating items that are no longer of relevance and adding new items that are.

Action taken, but not completed The two surveys being reviewed are the Schools and Staffing Survey (SASS) and the National Household Education Survey (NHES). NCES held a technical review panel meeting for SASS and will award a contract to redesign NHES by the end of September 2008.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2003

The Department of Education will focus on improving the timeliness of NCES products and services.

Completed The Institute of Education Sciences Director made improving the timeliness of release of information from surveys a priority and established a performance measure to track time-to-release of survey results.
2006

Examine the efficacy of alternative or additional performance and efficiency measures to assess program performance.

Completed NCES reviewed measures used by other statistical agencies and provided OMB with an analysis of the suitability of these measures, as well as other measures, for NCES. Based upon feedback from OMB, it will propose a set of additional measures for use in assessing program performance.
2006

Examine performance measures of other statistical agencies and review their appropriateness for NCES.

Completed NCES reviewed measures used by other statistical agencies and provided OMB with an analysis of the suitability of these measures, as well as other measures, for NCES.
2007

Provide for OMB review and approval the program measures to be used to assess program performance for the Statistics programs, with an estimate of when data will be available for each of the measures.

Completed NCES reviewed measures used by other statistical agencies and provided OMB with an analysis of the suitability of these measures, as well as other measures, for NCES. Based upon feedback from OMB, it will propose a set of additional measures for use in assessing program performance.
2007

Reviewing postsecondary longitudinal student based data collections to ensure alignment with the new High School Longitudinal Study of 2009.

Completed NCES conducted an internal review of the feasibility of aligning the beginning postsecondary student longitudinal study (BPS) with the high school longitudinal study (HSLS).

Program Performance Measures

Term Type  
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the ease of understanding NCES data files.


Explanation: Data are collected through an online random sample survey of NCES customers who visited the NCES website.

Year Target Actual
2006 Baseline 89
2007 90 89
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the timeliness of NCES data files.


Explanation: Data were collected through a random sample of over 3,900 academic researchers; education associations; education journalists; users of NCES's National Education Data Resource Center; and Federal, State, and local policymakers. In 2006, NCES changed its customer service survey data collection to an online random sample survey of NCES customers who visited the NCES website. When changing to the online survey, NCES also modified the questions asked of respondents. The Department no longer collects information specifically on the comprehensiveness of NCES data files.

Year Target Actual
1997 Baseline 52
1999 85 67
2001 90 66
2004 90 78
2006 90 86
2007 90 84
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the timeliness of NCES publications.


Explanation: Data for 1997 through 2004 were collected through a random sample of over 3,900 academic researchers; education associations; education journalists; users of NCES's National Education Data Resource Center; and Federal, State, and local policymakers. In 2006, NCES changed its customer service survey data collection to an online random sample survey of NCES customers who visited the NCES website. Given the changes, data for earlier years are not comparable to 2006.

Year Target Actual
1997 Baseline 72
1999 85 77
2001 90 74
2004 90 78
2006 90 85
2007 90 86
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the ease of understanding NCES publications.


Explanation:

Year Target Actual
2006 New measure 93
2007 90 90
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the timeliness of NCES services.


Explanation: Data for 1997 through 2004 were collected through a random sample of over 3,900 academic researchers; education associations; education journalists; users of NCES's National Education Data Resource Center; and Federal, State, and local policymakers. In 2006, NCES changed its customer service survey data collection to an online random sample survey of NCES customers who visited the NCES website. Given the changes, data for earlier years are not comparable to 2006.

Year Target Actual
1997 NA 89
1999 85 93
2001 90 88
2004 90 84
2006 90 92
2007 90 94
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the relevance of NCES data files.


Explanation: Data are collected through an online random sample survey of NCES customers who visited the NCES website.

Year Target Actual
2006 Baseline 94
2007 90 94
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the courtesy of NCES staff providing services.


Explanation: Data are collected through an online random sample survey of NCES customers who visited the NCES website.

Year Target Actual
2006 Baseline 95
2007 90 96
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Efficiency

Measure: The percentage of NCES Statistics program initial releases that either meet the target number of months, or show at least a 2-month improvement over the prior release, with the starting point of 18 months in 2006, then declining to 16 months in 2007, 14 months in 2008, and 12 months in 2009 and beyond.


Explanation:

Year Target Actual
2006 Set a baseline 90
2007 90 100
2008 90
2009 90
2010 92
2011 94
2012 95
2013 95
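The efficiency measure above applies a two-part rule to each initial release: it counts if it meets the year's target time-to-release, or if it improves on the prior release by at least 2 months. A minimal sketch of that rule, with illustrative function names (not NCES code):

```python
# Sketch of the timeliness efficiency measure described above.
# Targets: 18 months in 2006, 16 in 2007, 14 in 2008, 12 in 2009+.
# All names and data shapes are illustrative assumptions.

def target_months(year: int) -> int:
    """Target time-to-release (in months) for a given fiscal year."""
    schedule = {2006: 18, 2007: 16, 2008: 14}
    return schedule.get(year, 12)  # 12 months in 2009 and beyond

def release_counts(year: int, months: int, prior_months: int) -> bool:
    """A release counts if it meets the year's target or shows at
    least a 2-month improvement over the prior release."""
    return months <= target_months(year) or prior_months - months >= 2

def pct_meeting(year: int, releases) -> float:
    """releases: list of (months_to_release, prior_months_to_release)."""
    met = sum(release_counts(year, m, p) for m, p in releases)
    return 100 * met / len(releases)
```

Under this rule a survey released in 11 months in 2009 counts (meets the 12-month target), while one released in 13 months that was previously released in 14 counts only if the improvement reaches 2 months.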
Long-term Outcome

Measure: The percentage of respondents who would recommend NCES to others and who would rely on NCES in the future, as measured by the American Customer Satisfaction Index (ACSI).


Explanation: NCES survey participants are sampled from a combined list of NCES report requesters, data users, participants in meetings of constituent data providers for postsecondary institutions and for elementary and secondary state-level data, participants in NCES training sessions that are offered to interested users and providers of NCES data, and volunteers responding to an open solicitation of NCES web users. The ACSI is an indicator established in 1994 by the American Society for Quality and the University of Michigan Business School. The ACSI measures customer satisfaction in each of seven sectors: manufacturing-nondurables, manufacturing-durables, retail, services, finance/insurance, transportation/communications/utilities, and public administration/government. In each of the sectors, customer satisfaction is studied for its relationship to quality, profitability, and customer retention. ACSI measures satisfaction with more than 100 different federal government programs and/or websites, including 28 programs of other information providers. In addition, 5 other federal statistical agencies (BLS, BEA, NASS, ERS, IRS) have one or more components of their program evaluated through the ACSI. By participating in the ACSI, NCES will be able to use this common metric to compare customer satisfaction with its products and services with customer satisfaction with other ACSI participants.

Year Target Actual
2008 Set a Baseline
2010
2012
2014
Long-term/Annual Output

Measure: The percentage of survey data collections with either a response rate of 85 percent or higher or a non-response bias analysis and weight adjustments to adjust for bias identified in the nonresponse bias analysis.


Explanation:

Year Target Actual
2007 Baseline 100
2008 100
2009 100
2010 100
2011 100
2012 100
2013 100
Long-term Outcome

Measure: Number of web visits to the NCES website (monthly average).


Explanation: Number of web visits is a program performance measure of dissemination. Specifically, it is a metric of user traffic that is employed by several statistical agencies (BTS, BJS, NCHS) to monitor the level of interest in the information provided on the agency website over time. The collection of these data will allow NCES to monitor changes in web usage associated with major data releases or new user tools, and will provide a basis for comparison with other statistical agencies. This measure is based on the number of unique visits. A unique visit is a series of actions that begins when a visitor views their first page from the server, and ends when the visitor leaves the site or remains idle beyond the idle-time limit. The default idle-time limit is 30 minutes.

Year Target Actual
2008 Set a Baseline
2009
2010
2011
2012
2013
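The unique-visit definition above (a visit begins with a visitor's first page view and ends when the visitor leaves or remains idle beyond the 30-minute default limit) can be sketched as a simple sessionization pass. This is an illustrative assumption about the counting logic, not NCES's actual analytics code:

```python
# Illustrative sketch of counting "unique visits" as defined above:
# a new visit begins whenever the gap between a visitor's consecutive
# page views exceeds the idle-time limit (default 30 minutes).
from collections import defaultdict

IDLE_LIMIT_MIN = 30  # default idle-time limit, per the definition above

def count_visits(page_views):
    """page_views: iterable of (visitor_id, timestamp_in_minutes).
    Returns the total number of unique visits across all visitors."""
    by_visitor = defaultdict(list)
    for visitor, t in page_views:
        by_visitor[visitor].append(t)
    visits = 0
    for times in by_visitor.values():
        times.sort()
        visits += 1  # the first page view starts a visit
        for prev, cur in zip(times, times[1:]):
            if cur - prev > IDLE_LIMIT_MIN:
                visits += 1  # idle gap ends one visit, starts another
    return visits
```

For example, a visitor with page views at minutes 0, 10, and 50 generates two visits (the 40-minute gap exceeds the limit), while a second visitor with a single view adds one more.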
Long-term Outcome

Measure: Number of users of the NCES Data Analysis System (monthly average).


Explanation:

Year Target Actual
2008 Set a baseline
2009
2010
2011
2012
2013
Long-term/Annual Output

Measure: Number of downloads of electronic versions of reports (monthly average).


Explanation:

Year Target Actual
2008 Set a Baseline
2009
2010
2011
2012
2013
Long-term Outcome

Measure: Number of times NCES Statistics program data are cited on the web sites of 90 education associations and organizations.


Explanation: The 90 websites cover the range of elementary/secondary and postsecondary associations that represent data providers, education practitioners, education information dissemination experts, researchers, and education policy makers. These groups serve as an information source for their constituents, frequently repackaging relevant information to increase accessibility for their members. The list started from a list of potential recipients for education products and was supplemented based on the experience and knowledge of the team members who developed the monitoring program.

Year Target Actual
2008 Set a Baseline
2009
2010
2011
2012
2013
Long-term/Annual Efficiency

Measure: The average cost per completed case for the Fast Response Survey System, adjusted for inflation (in 2006 dollars).


Explanation: Cost per case for a survey is an efficiency measure. The measure is the cost per case for the data collection component of the survey in 2006 dollars, and it refers to the calculated cost for each completed individual response in a survey. The measure is calculated by adding the total costs for a survey for data collection (specifically costs for mailing, interviewers, web data collection, and, where applicable, incentives for participation) and total costs for processing of the data (including computer input and editing). These total costs are divided by the number of completed cases in the survey to create a "cost per case" for the survey. An interagency committee collaborated on the development of a framework for performance measures for federal statistical agencies. That committee identified costs to produce a product as an efficiency measure. In particular, five other statistical agencies (Census, BLS, NASS, BTS, EIA) are using measures of unit cost to monitor efficiency of performance. NCES anticipates that in light of increased resistance to voluntary participation in federal surveys, and increased costs associated with transportation of field staff and materials, it will be difficult to maintain a level cost per case collected over time.

Year Target Actual
2007 Baseline $159.09
2008 $159.09
2009 $159.09
2010 $159.09
2011 $159.09
2012 $159.09
2013 $159.09
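The cost-per-case arithmetic described in the explanation above reduces to a single division: total collection costs plus total processing costs, divided by completed cases. A minimal sketch, with illustrative names and sample figures (the dollar amounts are hypothetical, not NCES data):

```python
# Sketch of the "cost per case" computation described above.
# Cost categories follow the explanation (mailing, interviewers, web
# collection, incentives; plus processing: computer input and editing).

def cost_per_case(collection_costs, processing_costs, completed_cases):
    """Total data-collection costs plus data-processing costs,
    divided by the number of completed cases in the survey."""
    return (sum(collection_costs) + sum(processing_costs)) / completed_cases
```

For instance, hypothetical collection costs of $150,000 and processing costs of $9,090 across 1,000 completed cases yield $159.09 per case, matching the form of the baseline figures in the table above.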
Long-term/Annual Efficiency

Measure: The average cost per completed case for the Trends in Mathematics and Science Study, adjusted for inflation (in 2006 dollars).


Explanation: Cost per case for a survey is an efficiency measure. The measure is the cost per case for the data collection component of the survey in 2006 dollars, and it refers to the calculated cost for each completed individual response in a survey. The measure is calculated by adding the total costs for a survey for data collection (specifically costs for mailing, interviewers, web data collection, and, where applicable, incentives for participation) and total costs for processing of the data (including computer input and editing). These total costs are divided by the number of completed cases in the survey to create a "cost per case" for the survey. An interagency committee collaborated on the development of a framework for performance measures for federal statistical agencies. That committee identified costs to produce a product as an efficiency measure. In particular, five other statistical agencies (Census, BLS, NASS, BTS, EIA) are using measures of unit cost to monitor efficiency of performance. NCES anticipates that in light of increased resistance to voluntary participation in federal surveys, and increased costs associated with transportation of field staff and materials, it will be difficult to maintain a level cost per case collected over time.

Year Target Actual
2003 Baseline $177.77
2007 $177.77 na
2011 $177.77
2015 $177.77
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the ease of finding information on nces.ed.gov.


Explanation: Data are collected through an online random sample survey of NCES customers who visited the NCES website.

Year Target Actual
2006 Baseline 82
2007 90 81
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Outcome

Measure: Percentage of customer respondents satisfied or very satisfied with the relevance of NCES publications that they used in the last year.


Explanation: Data are collected through an online random sample survey of NCES customers who visited the NCES website.

Year Target Actual
2006 Baseline 95
2007 90 94
2008 90
2009 90
2010 90
2011 90
2012 90
2013 90
Long-term/Annual Efficiency

Measure: The average cost per completed case for the National Postsecondary Student Aid Study, adjusted for inflation (in 2006 dollars).


Explanation: Cost per case for a survey is an efficiency measure. The measure is the cost per case for the data collection component of the survey in 2006 dollars, and it refers to the calculated cost for each completed individual response in a survey. The measure is calculated by adding the total costs for a survey for data collection (specifically costs for mailing, interviewers, web data collection, and, where applicable, incentives for participation) and total costs for processing of the data (including computer input and editing). These total costs are divided by the number of completed cases in the survey to create a "cost per case" for the survey. An interagency committee collaborated on the development of a framework for performance measures for federal statistical agencies. That committee identified costs to produce a product as an efficiency measure. In particular, five other statistical agencies (Census, BLS, NASS, BTS, EIA) are using measures of unit cost to monitor efficiency of performance. NCES anticipates that in light of increased resistance to voluntary participation in federal surveys, and increased costs associated with transportation of field staff and materials, it will be difficult to maintain a level cost per case collected over time.

Year Target Actual
2004 Baseline $174.12
2008 $174.12
2012 $174.12
2016 $174.12

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: NCES follows a Congressional mandate to collect, analyze, and report education information and statistics.

Evidence: Sec. 151, P.L. 107-279

YES 20%
1.2

Does the program address a specific interest, problem or need?

Explanation: NCES is the lead Federal agency for collecting, reporting, analyzing, and disseminating statistical data related to education in the United States and in other nations.

Evidence: Publications and products.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: See above.

Evidence: See above.

YES 20%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: NCES is organized according to policy area and core activity. The current administrative structure is successful in supporting NCES products and activities; however, the successful administration of the Center does not mean that program improvements are not needed.

Evidence: Successful release of core NCES products.

YES 20%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation:  

Evidence:  

NA  %
1.RD1

Does the program effectively articulate potential public benefits?

Explanation: The Office attempts to measure benefit through customer satisfaction surveys. In addition, NCES is developing a monitoring system to measure external uses of NCES products. However, NCES should also consider conducting evaluations to determine the effectiveness of NCES data in informing educational decisions.

Evidence: Results of biennial customer satisfaction surveys.

YES 20%
1.RD2

If an industry-related problem, can the program explain how the market fails to motivate private investment?

Explanation: N/A

Evidence: N/A

NA 0%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department of Education's GPRA Plan contains an NCES long-term goal to "Provide timely, useful, and comprehensive data that are relevant to policy and educational improvement." Performance targets are established through 2007.

Evidence: NCES GPRA goals.

YES 11%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: Measurement of customer satisfaction is consistent with continuous improvement of NCES products and services. Although this survey is only administered every two years, the Department of Education has demonstrated that biennial administration provides high quality data for decision-making while reducing respondent burden and survey costs. However, ED should consider supplementing this survey with an external evaluation of the entire Statistics portfolio to determine whether resources are optimally allocated across project areas and with an annual review of a subset of products from the Statistics program to ensure technical rigor. NCES also should consider developing additional performance measures to supplement the customer service data, and should examine whether it is possible to disaggregate data in the customer survey to provide information on aspects of the Statistics program alone. (The current survey provides information for Statistics and NAEP combined.)

Evidence: Customer satisfaction surveys.

YES 11%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: NCES conducts meetings with key constituents. Contractors, grantees, and the NCES Advisory Council were involved in the development and/or review of the NCES Information Quality Guidelines and Statistical Standards. In addition, each contractor and subcontractor is contractually committed to adhering to the NCES Information Quality Guidelines and Statistical Standards.

Evidence: Elementary and Secondary and Postsecondary data forums, technical review panels, contractor meetings, and the NCES Advisory Council for Education Statistics. NCES held separate review meetings with a cross-section of NCES contractors and Grantees to receive input to the development of the Information Quality Guidelines and Statistical Standards.

YES 11%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: NCES collaborates with other agencies (e.g., HHS, USDA) on data collection activities and participates in the Federal Committee for Statistical Methodology and the Interagency Council for Statistical Policy. However, a more systematic approach to working with other ED offices and ensuring their information needs are met might be warranted.

Evidence: Joint funding of activities with other agencies (e.g., the Early Childhood Longitudinal Study, TIMSS, CPS, Household Crime Victimization Study)

YES 11%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: The last National Academy of Science review was completed in 1986, and there are no plans at present for another independent study of NCES. However, the revised statistical standards were reviewed by an external expert panel convened (at NCES request) by the National Institute of Statistical Sciences.

Evidence:  

NO 0%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: To the extent that the NCES budget is aligned with discrete statistical projects, the impact of funding decisions can be understood.

Evidence: Budget requests and project contracts.

YES 11%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: NCES has revised its statistical standards and has products peer reviewed prior to release. Customers have, in general, been satisfied with the quality of NCES products. However, NCES has not demonstrated that it has a plan for a systematic review of its entire portfolio to determine appropriate allocation of resources across program areas, overall program effectiveness, and strategies for improving the efficiency of the organization.

Evidence: Publication of the draft revised Statistical Standards in 2002 (http://nces.ed.gov/statprog/stat_standards.asp); adjudication procedures; customer surveys.

YES 11%
2.RD1

Is evaluation of the program's continuing relevance to mission, fields of science, and other "customer" needs conducted on a regular basis?

Explanation: NCES solicits opinions from customers via a biennial survey. In addition, NCES is developing a monitoring system to measure uses of NCES products by various user groups. However, NCES is in need of a systematic evaluation by an independent organization.

Evidence: Participation of advisory board. Customer satisfaction surveys.

YES 11%
2.RD2

Has the program identified clear priorities?

Explanation: NCES conducts large, ongoing surveys, holds ad hoc meetings with individual program office staff to discuss data needs, and, in addition, receives recommendations from advisory groups for its major data collections.

Evidence: Current portfolio of work.

YES 11%
Section 2 - Strategic Planning Score 89%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: NCES uses customer satisfaction information to inform bureau products and services. NCES claims that biennial surveys are sufficient to measure satisfaction of customers and structure the creation and delivery of products. NCES should consider providing Statistics-specific customer service data and also should consider developing additional performance measures to supplement the customer service data. (See II.2.)

Evidence: Customer satisfaction surveys.

YES 10%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to the new EDPAS system, which links employee performance to success in meeting the goals of the Department's Strategic Plan. In general, managers are provided individual performance agreements in which they are given responsibility for achieving relevant action steps outlined in the Strategic Plan. These action steps and other items included in managers' performance agreements are designed to measure the degree to which a manager contributes to improving program performance. Contractor and grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits. Contractors and grantees that do not meet Federal requirements are required to submit improvement plans and can have awards reduced or discontinued for serious or persistent failures to comply.

Evidence:  

YES 10%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: NCES successfully obligates funds by the end of each fiscal year, but should work on the timeliness of interagency agreements and needs to reduce the number of penalty interest charges. Funds are spent for the intended purposes; this is assessed through contract monitoring.

Evidence: Contract files, Inspector General audit reports.

YES 10%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: Although NCES has been working on technological improvements that will improve data accuracy and timeliness, the Office does not have formal incentives and procedures for realizing efficiencies and cost effectiveness. Moreover, NCES should work to synthesize project web architecture in order to promote interoperability and lower costs.

Evidence:  

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: Education's 2004 Budget satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute 29.6 percent of the program's full costs. However, Education has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels.

Evidence:  

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: An Inspector General audit report released September 20, 2002 found that the Office of Education Research and Improvement (now the Institute of Education Sciences) "did not always ensure compliance with contract terms or follow established regulations, policies, and procedures." In response to the IG audit, ED Contracts Office staff arranged training, which all NCES contracting officer's representatives and program managers attended.

Evidence: Audit #ED-OIG/A19-B0009

NO 0%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: NCES identified deficiencies in the contract oversight process and is working to ensure that all contract management staff receive appropriate training. NCES requires all staff responsible for monitoring contracts to maintain up-to-date certification.

Evidence:  

YES 10%
3.RD1

Does the program allocate funds through a competitive, merit-based process, or, if not, does it justify funding methods and document how quality is maintained?

Explanation: Most NCES activities are conducted through competitively awarded contracts.

Evidence: Contract files.

YES 10%
3.RD2

Does competition encourage the participation of new/first-time performers through a fair and open application process?

Explanation: NCES holds bidders conferences, places Statements of Work (SOWs) on the web, and conducts outreach at meetings and conferences.

Evidence: Contract files and outreach conferences.

YES 10%
3.RD3

Does the program adequately define appropriate termination points and other decision points?

Explanation: NCES is beginning to use performance-based contracts that have adequate opportunity for termination and amendment. However, NCES did not demonstrate that it has in place a plan for systematically reviewing its portfolio to determine when resources should be allocated to higher priority activities or when specific data collections, data elements, or reports should be terminated or overhauled. In addition, NCES has not designed a process wherein decisionmakers, including OMB and senior Departmental management, are aware of significant contractual activity. In response to these concerns, NCES has initiated an ongoing internal program review that will result in the evaluation of all major NCES data collections (see Section II, Question 1). This will provide the information base for NCES to set priorities and to make programmatic adjustments as necessary. This will also provide an information base to share with OMB and senior Departmental management.

Evidence:  

NO 0%
3.RD4

If the program includes technology development or construction or operation of a facility, does the program clearly define deliverables and required capability/performance characteristics and appropriate, credible cost and schedule goals?

Explanation: N/A

Evidence: N/A

NA 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: The Department of Education's GPRA Plan contains an NCES long-term goal to "Provide timely, useful, and comprehensive data that are relevant to policy and educational improvement." Measurement of this indicator shows that NCES is showing progress in achieving long-term goals, but needs to work on improving the timeliness of products. Performance targets are established through 2007.

Evidence: GPRA Performance Plan.

YES 25%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: NCES continues to measure high levels of customer satisfaction but needs to improve timeliness.

Evidence:  

YES 25%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: NCES staff work to improve data collection and reporting strategies, such as through the enhanced use of technology, in order to conduct work in a more cost-effective manner.

Evidence: NCES continues to modify product delivery so that publications and data are available electronically and on the web. Technological improvements have increased the timeliness of NCES products and services.

YES 25%
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation:  

Evidence:  

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: NCES conducts reviews of individual projects to ensure high quality, and customer survey data show that customers are, overall, satisfied with the comprehensiveness, timeliness, and utility of publications, data files, and services. NCES has not, however, demonstrated that the Statistics program as a whole is effective, and ED should consider conducting an external review, by an independent organization, of the Statistics program to assess overall quality, allocation of resources, and the extent to which NCES data meet the nation's need for educational information.

Evidence: Customer satisfaction surveys.

YES 25%
4.RD1

If the program includes construction of a facility, were program goals achieved within budgeted costs and established schedules?

Explanation: N/A

Evidence: N/A

NA 0%
Section 4 - Program Results/Accountability Score 100%


Last updated: 09062008.2003SPR