ExpectMore.gov


Detailed Information on the
Title I Grants to Local Educational Agencies Assessment

Program Code 10003320
Program Title Title I Grants to Local Educational Agencies
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Block/Formula Grant
Assessment Year 2006
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 100%
Program Results/Accountability 50%
Program Funding Level
(in millions)
FY2007 $12,838
FY2008 $13,899
FY2009 $14,305

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2007

Work with Congress to support the reauthorization of the Title I Grants to Local Educational Agencies consistent with the Administration's reauthorization proposal.

Action taken, but not completed The Administration's legislative proposals for reauthorizing NCLB, including Title I Grants to LEAs, were informally transmitted to Congress in August 2007. Many of these proposals have been included in S. 1775, the No Child Left Behind Act of 2007, which has been introduced in the Senate. Pending Congressional action, the Administration also is pursuing reauthorization goals through a combination of additional pilot demonstration projects and proposed regulatory changes.
2007

Improve timeliness and transparency related to the collection and analysis of performance and monitoring data to promote more effective use of these data to strengthen program management.

Action taken, but not completed ED has posted SY 2005-06 student achievement data on its web site, and is currently updating the annual Consolidated State Performance Report to improve data quality and obtain information about the new School Improvement Grants program. ED also is preparing a comprehensive report on the monitoring cycle ending in FY 2006 for publication in 2008, and has posted State reports for the first year (FY 2007) of the FY 2007-2009 monitoring cycle.
2007

Strengthen support for the school district and school improvement process, including increased participation in Title I public school choice and supplemental educational service options.

Action taken, but not completed In addition to targeted monitoring of SES and Public School Choice in 7 States during FY 2007, the regular FY 2007-2009 monitoring cycle includes expanded monitoring of SES and choice. ED published proposed regulations in April 2008 that include new notification and outreach requirements intended to increase participation in Title I school choice and SES options. The proposed regulations also would strengthen the restructuring process.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Annual Outcome

Measure: The percentage of economically disadvantaged students in grades 3-8 scoring at the proficient or advanced levels on State reading/language arts assessments will increase to 77.7 percent.


Explanation: This measure focuses on progress toward the statutory goal of 100-percent proficiency in reading/language arts by SY 2013-14. * The baseline was recalculated once 2005-06 assessment data became available; that was the first year States were required to assess all students in grades 3-8, so it supports more accurate comparisons in subsequent years.

Year Target Actual
2004 undefined 49.7
2005 undefined 52.6
2006 * 55.3
2007 60.9 57.4
2008 66.5
2009 72.1
2010 77.7
Annual Outcome

Measure: The percentage of economically disadvantaged students in grades 3-8 scoring at the proficient or advanced levels on State mathematics assessments will increase to 76.2 percent.


Explanation: This measure focuses on progress toward the statutory goal of 100-percent proficiency in mathematics by SY 2013-14. * The baseline was recalculated once 2005-06 assessment data became available; that was the first year States were required to assess all students in grades 3-8, so it supports more accurate comparisons in subsequent years.

Year Target Actual
2004 undefined 47.6
2005 undefined 50.6
2006 * 52.3
2007 58.3 55.9
2008 64.2
2009 70.2
2010 76.2
Long-term/Annual Outcome

Measure: The difference between the percentage of economically disadvantaged students in grades 3-8 scoring at the proficient or advanced levels on State reading/language arts assessments and the percentage of all students in grades 3-8 scoring at the proficient or advanced levels on State reading/language arts assessments will decrease to 6.5 percent.


Explanation: This measure focuses on the program goal of closing achievement gaps between poor students and other students. * The baseline was recalculated once 2005-06 assessment data became available; that was the first year States were required to assess all students in grades 3-8, so it supports more accurate comparisons in subsequent years.

Year Target Actual
2004 undefined 13.9
2005 undefined 13.3
2006 * 13.0
2007 11.4 12.8
2008 9.8
2009 8.1
2010 6.5
Long-term/Annual Outcome

Measure: The difference between the percentage of economically disadvantaged students in grades 3-8 scoring at the proficient or advanced levels on State math assessments and the percentage of all students in grades 3-8 scoring at the proficient or advanced levels on State math assessments will decrease to 6.4 percent.


Explanation: This measure focuses on closing the achievement gap in mathematics between poor students and all students. * The baseline was recalculated once 2005-06 assessment data became available; that was the first year States were required to assess all students in grades 3-8, so it supports more accurate comparisons in subsequent years.

Year Target Actual
2004 undefined 13.3
2005 undefined 12.8
2006 * 12.7
2007 11.1 12.2
2008 9.5
2009 7.9
2010 6.4
Annual Efficiency

Measure: During the SY 2005-06 through SY 2009-10 monitoring cycles, ED will reduce the average number of business days required to complete State monitoring reports from the SY 2004-05 baseline average of 46 days.


Explanation:

Year Target Actual
2004 undefined NA
2005 undefined 46.3
2006 undefined 43.3
2007 40.3
2008 40.0
2009 40.0
2010 40.0

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The statutory goal is to help ensure that all students reach proficiency on State assessments in reading and mathematics by the 2013-14 school year, in particular by closing longstanding achievement gaps. Title I provides supplemental funds to low-income communities to help students at risk of academic underachievement catch up with their peers. The program supports accountability for improving student achievement, assistance for low-performing schools, and increased options for parents.

Evidence: ESEA 1111(b)(2)(F)

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Significant numbers and percentages of students, both on average and among key subgroups, currently score below the proficient level on State academic assessments. While narrowing in some areas in recent years, large gaps in achievement remain between poor and minority students and their more advantaged peers.

Evidence: State assessment data.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Title I, Part A is the centerpiece of No Child Left Behind and the Federal effort to close achievement gaps between poor and disadvantaged students and their more advantaged peers, while ensuring that all students are proficient in reading and math by 2013-14. The program's accountability requirements serve as the framework for evaluating the effectiveness of Federal, State, and local education reforms and programs. While other Federal, State, and local programs address similar issues and academic subjects, Title I, Part A defines the general policies to which State accountability systems, academic standards and assessments, and compensatory education programs conform.

Evidence: All States have implemented NCLB accountability plans and assessment systems. Funding redundancy is reduced through the Title I, Part A supplement-not-supplant requirements, which ensure that Federal funds add to, rather than replace, State and local funding.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: Formula-driven allocations are widely deemed fair, and key statutory requirements (accountability plans, annual assessment, identification of LEAs and schools for improvement, provision of choice and supplemental educational services) are clearly linked to program goals and objectives. State and local concerns about the "prescriptiveness" of the program have been addressed through the provision of flexibility in certain areas, such as the assessment of students with disabilities and limited English proficient students, the calculation of adequate yearly progress, implementation of choice options, and ensuring that all teachers are highly qualified. In addition to considerable local flexibility in the use of Federal funds, the program has incorporated, over time, key reform elements such as standards-based accountability, use of research-based practices, and greater parental choice. As a result, there are few, if any, evidence-based alternatives for Federal efforts to leverage State and local educational improvement.

Evidence: GAO reports and evaluation data. Amendments to State accountability plans.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: Program dollars are allocated on the basis of annually updated Census estimates of the numbers and percentages of students from low-income families in each school district. Funds appropriated in excess of the FY 2001 level have been allocated through the two program formulas that target funds on LEAs with the highest concentrations of low-income students. Districts generally allocate dollars to schools according to the percentages of students receiving free- and reduced-price lunch in each school. LEAs also are required to give priority to low-income, low-achieving students in providing public school choice and supplemental educational services to students enrolled in schools identified for improvement. In addition, resources are allocated to States on condition of compliance with program requirements, which include the inclusion of all students in State assessment systems; the reporting of assessment results disaggregated by poverty, race and ethnicity, disability status, and LEP status; and adequate yearly progress determinations based on the academic achievement of these subgroups, which collectively comprise the major intended beneficiaries of the program. State administrative costs are limited to 1 percent of combined LEA allocations. The statute includes supplement-not-supplant and maintenance-of-effort requirements that work to prevent the use of Federal funds to replace State and local education funding. States reserve 4 percent of LEA allocations to support State and local improvement efforts, which generally target LEAs with the greatest need for such funds because they have the most schools identified for improvement.

Evidence: Sections 1122, 1124A, 1125, 1125A, 1125AA, 1113, and 1003. Upcoming evaluation data on resource allocation at the local level.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has four long-term/annual performance measures focused on increasing the academic achievement of students from low-income families in reading and math and on reducing the achievement gap between these students and other students. These measures reflect the long-term program goal of 100 percent proficiency in reading and math by 2014.

Evidence: FY 2008 Program Performance Plan

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program has established annual targets for increasing proficiency in reading and math and reducing the achievement gap in both subjects that are consistent with the statutory program goal of 100 percent proficiency in reading and math for all students, and in particular the low-income and minority students that are the focus of program activities, by 2014.

Evidence: FY 2008 Program Performance Plan.

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The program's long-term measures, which are focused on increasing proficiency in reading and math to 100 percent and reducing achievement gaps in those subjects to zero, are reflected in four annual indicators described in detail in the "Performance Measures" section.

Evidence: FY 2008 Program Performance Plan.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Baseline data and annual targets are presented in the "Performance Measures" section. Baseline data were derived from State assessment results reported to ED for the 2003-04 and 2004-05 school years. Baselines will be recalculated once 2005-06 assessment data are available, since that is the first year States are required to assess all students in grades 3-8, which will support more accurate comparisons in subsequent years. Actual data from the first two years suggest that annual targets are feasible, if ambitious.

Evidence: FY 2008 Program Performance Plan.

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: All States have approved accountability plans reflecting short- and long-term program goals and objectives. These plans include annual adequate yearly progress (AYP) determinations for LEAs and schools based on the 100 percent proficiency goal by 2014, as well as rewards and sanctions for meeting or missing AYP.

Evidence: State accountability plans and consolidated applications.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The authorizing statute has long required separately funded, extensive evaluation of the implementation and impact of the Title I Grants to Local Educational Agencies program, guided by an Independent Review Panel (IRP) appointed by the Secretary, supervised by the Department's evaluation office (which is independent of the office administering the program), and conducted by third-party contractors. The purpose of the IRP is to provide expert advice to the Secretary, particularly in the area of using the most rigorous research methodologies available to determine the program's effectiveness. The Department currently is conducting a National Longitudinal Study of No Child Left Behind, based on a nationally representative sample of 1,500 schools and 300 districts, as well as a Study of State Implementation of Accountability and Teacher Quality Under NCLB, based on two sets of surveys and interviews of State-level staff from all 50 States, DC, and Puerto Rico. These studies will provide a comprehensive examination of the Title I program, including the implementation of State standards, assessments, and accountability systems, including student subgroup achievement and accountability; State definitions of adequate yearly progress; schoolwide and targeted assistance programs; targeting and State and local use of program funds; the identification of schools and districts for improvement and actions taken to improve their performance; the implementation of public school choice and supplemental educational services; and professional development activities. Studies and evaluation reports generally are intended to be completed on a schedule to help inform Congressional reauthorization of the program, which occurs roughly every five years. However, interim reports and sub-study results typically are released often enough to support program improvements and policy and regulatory changes in between reauthorizations. 
ED released the National Assessment of Title I Interim Report in February 2006; an additional NATI report on targeting and resource allocation is expected in early 2007, and the final NATI report is scheduled for publication in late spring 2007.

Evidence: ESEA Section 1501; semi-annual meetings of the Independent Review Panel; the Title I Evaluation Plan; the National Assessment of Title I Interim Report, Volume I: Implementation, published in February 2006

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: Budget requests are not tied specifically to achieving a desired level of program outcome. This is largely a function of the nature and structure of the program: it is a highly flexible formula grant that supplements State and local education dollars that provide over 90% of the resources devoted to achieving this program's student achievement outcomes. However, at the Federal level, budget submissions include the full costs of the program, including Federal S&E dollars.

Evidence: Annual budget requests and Congressional justifications.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The program has developed meaningful long-term performance measures to guide strategic planning efforts.

Evidence: FY 2008 Program Performance Plan

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: ED collects annual performance data (primarily State assessment results) through the Consolidated State Performance Report (CSPR) process. ED has an extensive, comprehensive, and consistent desk review and on-site monitoring system that assesses program operations at the State and local levels and provides recommendations for improvement. Monitoring utilizes data from on-site visits, State performance reports, audits, and IG reports. Following on-site visits, ED provides an after-action report to the State, including corrective actions. States must provide evidence that they are implementing the corrective actions. This monitoring system, which includes clear protocols and regularly scheduled visits (covering approximately one-third of States each year), was developed based on a thorough analysis of this program's previous monitoring system and has been responsible for a significant number of program improvements.

Evidence: Consolidated State Performance Report (CSPR) data, annual NCLB report to Congress, site-visit monitoring reports.

YES 11%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: The monitoring process and the Consolidated State Performance Reports (CSPR) are the primary accountability tools for ensuring compliance with Title I, Part A statutory provisions. The monitoring process, which uses a series of consistent indicators across States, is designed to focus on the results of States' efforts to implement critical Title I, Part A requirements, how Title I resources are being used, and the extent to which States effectively monitor their sub-grantees, the local educational agencies. The information generated through this monitoring also informs Federal technical assistance initiatives and national leadership activities. The ED program director is held accountable through the EDPAS process for managing the overall quality of the monitoring process with the goal of improving program compliance. ED program managers also are held accountable through EDPAS for ensuring the accuracy of monitoring reports, the timely submission of monitoring and after-action reports (efficiency measure), and completing monitoring closeout activities. CSPR data are used to measure progress toward meeting the Title I, Part A performance indicators developed for reporting under the Government Performance and Results Act. These data include information such as the number of schools and districts in improvement and student performance data that program staff analyze to identify potential issues that may require corrective action. ED staff also use the CSPR data to implement the President's Management Agenda and the Department's Strategic Plan, both of which stress the linking of timely and accurate data to meaningful outcome measures. Program staff also use the results from monitoring to take systematic follow-up steps to improve accountability for program partners. First, States must provide evidence to ED that they have implemented corrective actions to address findings outlined in monitoring reports.
For the past five years, ED has placed conditions on Title I, Part A grant awards for several States with unresolved monitoring findings. In addition, ED has withheld funds from SEAs based on data gathered during the monitoring process that revealed significant compliance issues that were not resolved. For example, in fiscal year (FY) 2004, ED withheld $103,981 in Title I, Part A funds from Idaho, or 25 percent of the amount the SEA is permitted to reserve for State administration, because that year Idaho did not have a standards and assessment system in place that met statutory requirements. Also in FY 2004, ED withheld $444,282 in Title I, Part A funds from Texas, or 4 percent of the State's Title I administrative funding, because that year Texas was late in providing AYP notifications to many of its Title I schools. Second, in response to its monitoring findings, ED has revised its non-regulatory guidance on schoolwide programs and fiscal issues to provide more clarity and is developing new guidance on targeted assistance programs. By routinely updating and improving program guidance based on what is learned by monitoring the States, ED helps to strengthen its program partners' performance. Third, program monitoring has revealed a need for an increased focus on math. As a result, ED is preparing a series of national leadership activities on math designed to improve student achievement in this critical academic area. Finally, program monitoring found that many States are not able to reserve 4 percent of their Title I, Part A allocation for school improvement activities because of the section 1003(e) "hold-harmless" provision that protects LEA allocations at the prior-year level. In response, the President's FY 2007 budget proposed the elimination of the section 1003(e) "hold-harmless" provision.

Evidence: State accountability plans, annual assessment reporting and AYP determinations, CSPR data, and EDPAS agreements.

YES 11%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: The Department obligates Title I, Part A funds within the timeframes set out by ED schedules. Program staff typically calculate Title I, Part A allocations in the spring, enter the amounts into ED's Grants and Payments System (GAPS), and coordinate their release with the ED offices responsible for transmitting the funds to the grantees' accounts. Reports are generated from ED's Grants and Payments System to monitor grantee spending, and program staff review financial records and audit reports as a standard practice during compliance reviews. ED ensures that funds are spent for the intended purpose through its program monitoring efforts, which include a fiduciary component that involves a partnership between the Title I program office and ED's Office of the Chief Financial Officer (OCFO). For example, ED monitors examine whether SEAs ensure that LEAs set aside the correct amounts of their Title I, Part A allocations for school improvement, parental involvement, and services to private school students, as well as whether the LEAs spend the appropriate amounts on supplemental educational services and school choice. OCFO, which has the primary responsibility within ED for ensuring that grantees comply with the Improper Payments Act, participates in monitoring trips to review whether expenditures are accurately reported through examination of randomly selected records and equipment inventories and by interviewing grantee and sub-grantee staff about their accounting procedures. When ED finds problems, it requires SEAs to make corrections and provide evidence that they implemented the corrections. ED also continuously analyzes monitoring findings to see whether there are particular areas in which SEAs are having problems in order to provide appropriate technical assistance. 
For example, this year ED issued new fiscal guidance that addresses problem areas such as Comparability, Supplement not Supplant, and Carryover to help SEAs ensure that they and their LEAs spend funds on their intended purposes. SEAs are not required to report their specific expenditures or their LEAs' expenditures to ED, but the monitoring process does include a review of accounting documents showing Title I, Part A expenses for specific personnel and other activities, and the OCFO monitoring component includes a check on the accuracy of reported expenditures. For example, OCFO has uncovered cases of split credit card purchases in which SEA or LEA staff members used two or more transactions to exceed spending limits. Furthermore, ED's Office of the Inspector General (OIG) regularly conducts audits that include determining the extent to which expenditures for Title I, Part A and other Department programs are accurately reported.

Evidence: ED monitoring reports, IG audit reports.

YES 11%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: ED has proposed an efficiency measure related to its on-site monitoring and reporting process for Title I, Part A that focuses on reducing the average number of business days required to complete State monitoring reports from 46 days to 40 days by the 2007-08 monitoring cycle. ED also is working to provide more accurate and timely reporting of performance and other data through the EDFacts system. As part of this effort, ED has developed and disseminated a document entitled "Improving Data Quality for Title I Standards, Assessments, and Accountability Reporting: Guidelines for States, LEAs, and Schools." While not directly related to procedures, the ongoing National Assessment of Title I (NATI) includes evaluation studies that are producing rigorous scientific evidence on the effectiveness of specific education programs and practices funded by Title I, Part A and other ED programs. Completion of this research will permit more efficient use of Federal education funds by encouraging grantees to employ reading programs of proven effectiveness.

Evidence: ED program monitoring procedures; the SY 2004-05 Consolidated State Performance Reports; National Assessment of Title I Interim Report, ED Evaluation Plan.

YES 11%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program has coordinated closely with the Office of Special Education and Rehabilitative Services and the Office of English Language Acquisition to ensure that students with disabilities and limited English proficient students are appropriately included in State assessment and AYP systems. The program also has worked closely with the Charter Schools program to ensure that charter schools receive their fair share of program funds and effectively implement NCLB accountability requirements.

Evidence:

YES 11%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified major deficiencies in the financial management of this program, though there have been problems with a limited number of State and local recipients uncovered through IG audits and/or monitoring visits. Federal program officials have taken appropriate actions to ensure correction of these problems.

Evidence: Monitoring reports posted on the ED web site.

YES 11%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: ED's program monitoring process is the central component of its management of the Title I, Part A program. ED assigns a staff person to each State to provide for continuous monitoring. The State contact person conducts an ongoing desk review, routinely gathering and analyzing data and information relevant to each of the three monitoring areas for their assigned State(s). These areas include standards, assessment, and accountability; instructional support; and fiduciary responsibility. On-site monitoring is carried out on a 3-year cycle. Monitoring outside of this cycle is scheduled as needed if there are indications that a State has serious or chronic compliance problems. For example, during this period, ED conducted follow-up visits to three States (California, Maine, and Michigan) that were monitored in the previous year. The monitoring team typically includes a team leader and at least five other staff members or expert consultants who participate in the monitoring site visit. An on-site monitoring visit typically lasts 4 to 5 days. During a site visit, program staff review documentation that was not available prior to the trip and interview SEA and LEA staff, principals, parents, and other stakeholders. After the on-site monitoring is completed, the team conducts conference calls with additional LEAs across the State to complete its information gathering. The team then prepares a comprehensive monitoring report for the SEA containing recommendations, findings, and corrective actions that together provide an analysis of the status of Title I, Part A implementation throughout the State. The SEA has 30 business days to respond to all of the compliance issues identified in that report. Evidence of implementation of actions designed to correct all compliance issues identified in the monitoring report must be submitted to and approved by ED before the condition on the State's grant award(s) is removed.
ED updates its monitoring process as needed to maintain strong program management. For example, in response to concerns about SEA monitoring of their LEAs, ED added an indicator in the middle of the latest 3-year cycle designed to measure the extent to which SEAs effectively monitor LEA implementation of program requirements. Monitoring teams also review the effectiveness of the school improvement and instructional support measures established by the State to see whether they benefit LEAs and schools within the State, and determine how well the SEA has complied with funding and resource use requirements. This includes assessing the degree to which the SEA has exercised its general oversight responsibilities to ensure LEA compliance through State monitoring activities, review and approval of local applications, audits, and implementation of complaint procedures. While not yet finalized, for the new monitoring cycle beginning in 2006-07, ED will implement a revised monitoring process involving: (1) a more extensive pre-site review; and (2) a modified on-site review that will focus on high-priority issues identified through the pre-site review and other factors, such as an examination of prior monitoring findings, Inspector General reports, Office for Civil Rights findings, and, where appropriate, single State audit findings.

Evidence: Title I, Part A monitoring reports posted on the ED web site.

YES 11%
3.BF1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: ED gains sufficient knowledge of grantee activities through its extensive program monitoring cycle, Consolidated State Performance Reports (CSPRs), technical assistance activities, consideration of amendments to State accountability plans, annual meetings with State Title I directors, and ongoing communications with States via telephone calls, conference calls, and electronic mail.

Evidence: Program monitoring reports posted on the ED web site, Dear Colleague Letters posted on the ED web site, CSPRs.

YES 11%
3.BF2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: ED collects a significant amount of performance information, such as State assessment data, through the Consolidated State Performance Reports (CSPRs). These data are the primary source used to inform the program's GPRA indicators, progress on which is published in the Department's Performance and Accountability Report (PAR). Increasing use of the new EDFacts system should permit more timely collection, processing, and dissemination of the CSPR data. ED expects CSPR data, particularly student achievement data, to be more readily available to the public as part of the implementation of Goal 1, Strategy 5 of ED's new strategic plan, which requires ED to "collect, analyze and publicly disseminate disaggregated student information on a timely basis." In addition, the Department plans to publish the State-by-State results of the performance data used for the PART, along with other related information, to better inform the public. Furthermore, ED regularly conducts monitoring visits at the State and local levels to assess program operations and to require corrective actions that ensure proper implementation. The Department currently posts the resulting monitoring reports on its website and will soon add an analysis of findings over the three-year monitoring cycle just completed. Also, as part of the new monitoring cycle beginning in the winter of 2006-07, the Department will begin to document publicly State responses to the corrective actions required by ED.

Evidence: The public can currently access the program's reporting of its GPRA performance indicators, monitoring reports, and other performance-related information, such as program performance plans, on the Department's website. Also, in support of the PART process, individual State data that mirror the national student-level performance measures will be available on ED's website. Finally, to better illustrate the changes that States make in response to the Department's rigorous monitoring, tables showing specific actions by States to correct adverse findings will soon be displayed on ED's website.

YES 11%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: OMB and ED have agreed on new long-term and annual performance measures and targets for the program, and ED has calculated baseline data for SY 2003-04 and first-year change data for SY 2004-05. From SY 2003-04 to SY 2004-05, the percentage of economically disadvantaged students who scored at least proficient in math rose from 47.6% to 50.6%, while the achievement gap between poor students and all students narrowed from 13.3 to 12.8 percentage points. In reading, the percentage proficient rose from 49.7% to 52.5%, with the achievement gap declining from 13.9 to 13.3 percentage points. The addition of new assessments in SY 2005-06 (results for this year will be available in fall 2007) and changing State Adequate Yearly Progress definitions limit the comparability of performance data from year to year, but the early data clearly show that student achievement is moving in the right direction. Improved achievement occurred even as States were identifying increasing numbers of schools and school districts for improvement, suggesting that future improvement efforts could maintain or even accelerate progress.

Evidence: Consolidated State Performance Report data for SY 2003-04 and SY 2004-05 and SY 2003-04 data submitted via the Education Data Exchange Network (EDEN).

SMALL EXTENT 8%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: OMB and ED have agreed on new annual performance measures and targets for the program, and ED has calculated baseline data for SY 2003-04 and first-year change data for SY 2004-05. From SY 2003-04 to SY 2004-05, the percentage of economically disadvantaged students who scored at least proficient in math rose from 47.6% to 50.6%, while the achievement gap between poor students and all students narrowed from 13.3 to 12.8 percentage points. In reading, the percentage proficient rose from 49.7% to 52.5%, with the achievement gap declining from 13.9 to 13.3 percentage points. Further, ED notes that in 25 States the math achievement of economically disadvantaged students increased by more than three percent, and in 30 States the reading achievement of economically disadvantaged students increased by more than three percent. The larger growth in these States exemplifies the potential for the larger overall increases that will be needed in upcoming years to reach the 100% proficiency target, and it is generally consistent with the rate of progress required to meet the program's annual performance targets. In addition, these performance data are available due to the considerable progress ED and its program partners have made in putting in place, for the first time, reliable assessment and reporting systems that provide the nationally disaggregated results needed to measure program performance.

Evidence: State assessment results from the CSPR. Program data on monitoring reports.

LARGE EXTENT 17%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: ED significantly improved the efficiency and cost-effectiveness of its data collection by refining Title I, Part A questions on the Consolidated State Performance Report forms to improve clarity and eliminate duplication, and by implementing the EDEN/EDFacts system to streamline the CSPR reporting process. Critical Title I, Part A data are requested as part of a single cross-program data request to States and can be submitted electronically as they become available. The State EDFacts coordinator manages data submissions across programs and serves as a single point of contact for resolving any questions related to accuracy. To further facilitate the data collection process, ED pre-populates the reporting document to the extent it already has the required data, which spares States duplicative data entry unless errors are identified. These steps will result in more accurate data at both the State and Federal levels while reducing the State reporting burden, so that more attention can be focused on program implementation. These improvements also help provide more timely data to ED program managers to inform policy and management decisions supporting program implementation. In addition, ED has adopted an efficiency measure tracking the number of business days needed to complete and submit Title I, Part A monitoring reports to the SEAs, calculated baseline data, and set targets for reducing the number of days from 46 to 40. Program data show that ED reduced the time-to-completion for monitoring reports from 46.3 days in 2004-05 to 43.3 days in 2005-06. The efficiency measure directly relates to program performance because the faster an SEA receives the report identifying monitoring findings, required corrective actions, and required documentation, the sooner it can make program improvements.
To ensure that reports are completed on schedule, monitoring team leaders report weekly to the program director on their progress toward completing the reports. Once an SEA receives the report, program staff track the SEA's response to ensure that it is returned to ED on time.

Evidence: EDFacts Clearance package #03017, 1875-0240-v.1, Appendix D, which contains a report showing burden reduction for the Consolidated State Performance Report due to the implementation of EDFacts. ED Program management files documenting completion of monitoring trips and report issuance dates.

LARGE EXTENT 17%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: There are no comparable programs, although several States had previously put in place standards-based accountability systems that included many of the same components as Title I Grants to LEAs, including standards, assessments, report cards, sanctions for poor performance, and rewards for success. In general, those States with robust accountability systems like that of Title I Grants to LEAs have shown the most improvement in student achievement, though results are mixed.

Evidence:

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The National Assessment of Title I (NATI) Interim Report released in February 2006 provided limited, preliminary data on improved achievement under Title I as reauthorized by No Child Left Behind, as well as considerable data on State and local implementation of key provisions of the law (which took effect during the 2002-03 school year). In terms of achievement, the NATI Interim Report showed positive achievement trends based both on available State-level assessment data and on the National Assessment of Educational Progress (NAEP). For example, three-quarters of States with three-year trend data showed achievement gains in elementary reading for low-income students, a key subgroup targeted by the Title I program. The NATI documented similar trends in mathematics achievement for minority subgroups on State assessments, progress that appeared to be confirmed by NAEP achievement trends for minority and low-income students in the early grades. Also, a March 2006 Education Trust study reported that in a majority of States examined, the math and reading scores of economically disadvantaged students on State assessments increased from 2003 to 2005, while the achievement gap between these students and their peers narrowed over the same period. Because the bulk of Title I, Part A funds are used in elementary schools, these findings provide an indication of program effectiveness and improved results. The NATI Interim Report also found that States were making progress in implementing the full range of assessments required in grades 3-8 and once in high school by NCLB, that most States were assessing 95 percent or more of their students in these grades, and that all States were reporting student achievement data disaggregated by key minority subgroups.
The number of schools identified for improvement rose about 50 percent following two years of NCLB implementation, from about 6,000 schools to just over 9,000 schools, and almost all States implemented a statewide system of support for identified schools, though the systems varied considerably in the comprehensiveness of that support. In addition, the NATI documented significant increases in the number of schools where public school choice and supplemental educational services were offered during the first three years of NCLB implementation, though participation rates remained low during this period.

Evidence: National Assessment of Title I Interim Report, Volume I, Implementation, February 2006; the Education Trust's Primary Progress, Secondary Challenge report, March 2006.

SMALL EXTENT 8%
Section 4 - Program Results/Accountability Score 50%


Last updated: 09/06/2008 (2006 SPR)