ExpectMore.gov


Detailed Information on the
IDEA Special Education Grants to States Assessment

Program Code 10000192
Program Title IDEA Special Education Grants to States
Department Name Department of Education
Agency/Bureau Name Office of Special Education and Rehabilitative Services
Program Type(s) Direct Federal Program
Assessment Year 2008
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 88%
Program Management 100%
Program Results/Accountability 42%
Program Funding Level
(in millions)
FY2007 $10,783
FY2008 $9,515
FY2009 $10,494

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Long-term Outcome

Measure: The percentage of fourth-grade public school students with disabilities scoring at or above Basic in reading on the National Assessment of Educational Progress (NAEP).


Explanation:

Year Target Actual
2000 NA 22
2002 24 29
2003 25 29
2005 35 33
2007 35 36
2009 37
2011 39
2013 40
Long-term Outcome

Measure: The percentage of eighth-grade public school students with disabilities scoring at or above Basic in mathematics on the National Assessment of Educational Progress (NAEP).


Explanation:

Year Target Actual
2000 NA 20
2003 23 29
2005 32 31
2007 33 33
2009 35
2011 37
2013 38
Long-term Outcome

Measure: The percentage of fourth-grade public school students with disabilities included in the National Assessment of Educational Progress (NAEP) reading sample who are excluded from testing.


Explanation: This measure is critical to interpreting outcomes for children with disabilities on NAEP assessments. If larger numbers of children with more severe disabilities are excluded from testing, then the significance of any improvement in the performance of the children tested is drawn into question. Likewise, lower performance of children with disabilities may reflect larger percentages of children with disabilities being tested.

Year Target Actual
1998 NA 41
2002 NA 39
2003 NA 33
2005 NA 35
2007 33 36
2009 31
2011 29
2013 28
Long-term Outcome

Measure: The percentage of eighth-grade public school students with disabilities included in the National Assessment of Educational Progress (NAEP) mathematics sample who are excluded from testing.


Explanation: This measure is critical to interpreting outcomes for children with disabilities on NAEP assessments. If larger numbers of children with more severe disabilities are excluded from testing, then the significance of any improvement in the performance of the children tested is drawn into question. Likewise, lower performance of children with disabilities may reflect larger percentages of children with disabilities being tested.

Year Target Actual
2000 NA 32
2003 NA 22
2005 NA 24
2007 23 31
2009 21
2011 19
2013 18
Long-term/Annual Outcome

Measure: The percentage of students with disabilities who graduate from high school with a regular high school diploma.


Explanation:

Year Target Actual
2000 NA 46
2001 NA 48
2002 NA 51
2003 NA 52
2004 NA 55
2005 54 54
2006 56 56.5
2007 57 Expected Oct. 2008
2008 58
2009 59
2010 60
2011 61
2012 62
2013 63
Long-term/Annual Outcome

Measure: The percentage of students with disabilities who drop out of school.


Explanation:

Year Target Actual
2000 NA 42
2001 NA 41
2002 NA 38
2003 NA 34
2004 NA 31
2005 34 28
2006 29 26.2
2007 28 Expected Oct. 2008
2008 27
2009 26
2010 25
2011 24
2012 23
2013 22
Long-term Outcome

Measure: The percentage of children with disabilities who are either competitively employed, enrolled in some type of postsecondary school, or both, within two years of leaving high school.


Explanation:

Year Target Actual
1987 NA 52
2004 NA 59
2005 59.5 75
2006 60 Undetermined
2007 60.5 Undetermined
Annual Efficiency

Measure: The average number of workdays between the completion of a site visit and the Department's response.


Explanation:

Year Target Actual
2004 NA 123
2005 NA 107
2006 113 50
2007 100 92
2008 95
2009 90
2010 88
Long-term/Annual Outcome

Measure: The percentage of students with disabilities in grades 3-8 scoring at the proficient or advanced levels on state reading assessments.


Explanation:

Year Target Actual
2005 NA 38
2006 NA 38.7
2007 51.8 Expected Sept. 2008
2008 54
2009 61.7
2010 69.4
2011 77
2012 84.7
2013 92.4
2014 100
Long-term/Annual Outcome

Measure: The percentage of students with disabilities in grades 3-8 scoring at the proficient or advanced levels on state mathematics assessments.


Explanation:

Year Target Actual
2005 NA 38.5
2006 NA 37.8
2007 52.2 Expected Sept. 2008
2008 53.3
2009 61.1
2010 68.9
2011 76.7
2012 84.4
2013 92.2
2014 100
Long-term/Annual Outcome

Measure: The difference between the percentage of students with disabilities in grades 3-8 scoring at the proficient or advanced levels on state reading assessments and the percentage of all students in grades 3-8 scoring at the proficient or advanced levels on state reading assessments.


Explanation:

Year Target Actual
2005 NA 27.8
2006 NA 29.6
2007 21.6 Expected Sept. 2008
2008 22.2
2009 18.5
2010 14.8
2011 11.1
2012 7.4
2013 3.6
2014 0
Long-term/Annual Outcome

Measure: The difference between the percentage of students with disabilities in grades 3-8 scoring at the proficient or advanced levels on state mathematics assessments and the percentage of all students in grades 3-8 scoring at the proficient or advanced levels on state mathematics assessments.


Explanation:

Year Target Actual
2005 NA 24.9
2006 NA 27.2
2007 19.4 Expected Sept. 2008
2008 20.5
2009 17
2010 13.6
2011 10.2
2012 6.9
2013 3.4
2014 0
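
The two gap measures above are simple arithmetic on the proficiency measures that precede them. A minimal sketch of the computation follows, assuming the reported gap is the all-students proficiency rate minus the rate for students with disabilities (the reported figures are consistent with that reading, though the Department's exact method is not described here); the function and variable names are illustrative, not the Department's.

    # Sketch only: how a proficiency-gap measure like the two above could be
    # computed. Assumes gap = all-students rate minus students-with-
    # disabilities (SWD) rate, which matches the reported figures.
    def proficiency_gap(pct_all_proficient: float, pct_swd_proficient: float) -> float:
        """Percentage-point gap on a state assessment; 0 means the gap is closed."""
        return round(pct_all_proficient - pct_swd_proficient, 1)

    # Worked example from the 2005 reading rows above: the SWD rate was 38 and
    # the reported gap was 27.8, implying an all-students rate near 65.8.
    print(proficiency_gap(65.8, 38.0))  # 27.8

Under this reading, the 2014 target of 0 corresponds to identical proficiency rates for students with disabilities and all students.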

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program helps States and local educational agencies (LEAs) provide children with disabilities access to high quality education that meets challenging standards and prepares them for higher education, employment and independent living. However, many educational and State organizations, Members of Congress and others believe the program's main purpose should be to provide financial relief to school districts to help pay for special education.

Evidence: Individuals with Disabilities Education Act (IDEA) and legislative history. The Individuals with Disabilities Education Improvement Act of 2004, enacted on December 3, 2004, amended the IDEA. The purpose of the Grants to States program is to provide grants to assist States, outlying areas, freely associated States and the Department of the Interior to provide special education and related services to children with disabilities in accordance with IDEA Part B, which authorizes the program. The amendments added several new requirements to Part B that increase the program focus on improving results for children. A copy of the amendments is at http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: The program addresses specific needs of children with disabilities by: (1) ensuring access to education for children with disabilities by establishing basic service requirements that, in the absence of the program, might not be met; (2) improving educational outcomes of students with disabilities, who consistently do not perform as well as their nondisabled peers; and (3) providing financial assistance to States and LEAs to help pay for special education and related services.

Evidence: Access to education for all children is guaranteed by the Constitution of the United States (implicitly through the Equal Protection Clause), many State constitutions and laws, and Section 504 of the Rehabilitation Act of 1973. However, the IDEA statute defines more specifically how States and LEAs provide this access for children with disabilities.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: This program does not duplicate other Federal programs. Federally run schools that provide special education (e.g., Department of Defense and Bureau of Indian Affairs schools) adhere to the IDEA's programmatic requirements. While States and LEAs pay for most of the costs of special education, the Federal program helps ensure that a minimum level of services and protections is provided to children with disabilities in each State.

Evidence: There are no other programs in the Department of Education that overlap with the Grants to States program. While the Title I Grants to Local Educational Agencies program under the Elementary and Secondary Education Act (ESEA) and the IDEA Grants to States program overlap, in that a disproportionate number of children are poor and also have disabilities, most of the children served by the Grants to Local Educational Agencies program do not have disabilities, and most children served under the Grants to States program are not poor. We have not identified overlapping programs outside the Department.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The program is based on partnerships among the Federal Government, States, and LEAs, with each partner contributing resources toward the education of children with disabilities. To receive funds under this program, States and LEAs must follow the IDEA's specific statutory requirements regarding the services to be provided, due process protections, and other matters. Still, while this program significantly affects how States and LEAs provide special education, it has limited ability to ensure that this education is of high quality. There is no conclusive proof that another approach would be more efficient or effective in meeting the purposes of the program. However, the absence of conclusive evidence does not mean that program improvements are not needed.

Evidence: IDEA Sections 611-618 spell out the program's major requirements. Since every State accepts IDEA funding, they have all agreed to follow the law's specific requirements. The Department of Education's monitoring activities evaluate the degree to which States comply with these requirements.

YES 20%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: Funds provided to States and LEAs under the program must be used to help provide special education and related services to children with disabilities. States and LEAs must also use Federal funds to locate, identify, and evaluate children, the intended beneficiaries, to determine if they are eligible for special education and related services. However, in a limited number of situations, as provided for by the Individuals with Disabilities Education Improvement Act of 2004, services may be provided to nondisabled students as a prevention effort. For example, LEAs may now use a portion of their Federal funds to provide early intervening services (EIS) for children who have not been identified as needing special education or related services but who need additional academic and behavioral support to succeed in a general education environment. With these services, some of these children may ultimately not require special education at all, or their special education needs may be reduced.

Evidence: Individuals with Disabilities Education Improvement Act of 2004, P.L. 108-446. (See especially sections 613(a)(2) and 613(f) of the amended IDEA). A copy of the amendments is at http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department has established long-term indicators, measures, and targets that meaningfully reflect outcomes for children with disabilities. These indicators include graduation rates, dropout rates, and performance on the National Assessment of Educational Progress (NAEP), also known as the "Nation's Report Card." They also include indicators and goals related to State Adequate Yearly Progress for children with disabilities under the No Child Left Behind Act. A new long-term measure dealing with postsecondary enrollment and employment, currently based on data collected from longitudinal studies, will instead be based on data collected from States in annual performance reports and will be submitted to OMB for approval by September 30, 2008. One problem with the graduation and dropout measures is that results for children with disabilities cannot be compared with those for children generally because of differences in how data are collected for these measures across States and between children with and without disabilities. The Department is working to resolve this problem, and the Secretary has announced her intent to standardize the definitions States use in reporting graduation rates to the Department for all children and subgroups. This will make comparisons between children with disabilities and other children possible in the Department's general data collections. Unfortunately, it will not affect the statutory definition used for collecting dropout and graduation data on children with disabilities specifically under the IDEA.

Evidence: FY 2009 Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Press release "U.S. Secretary of Education Margaret Spellings Announces Department Will Move to a Uniform Graduation Rate, Require Disaggregation of Data" http://www.ed.gov/news/pressreleases/2008/04/04012008.html

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The established targets and timelines for program indicators are ambitious and challenging. For example, the target percentage for the graduation of students with disabilities with regular diplomas increases by 1 percentage point each year, and the target percentage for students with disabilities who drop out decreases by 1 percentage point each year. These targets may appear modest, but, given that successful high school outcomes are the result of years of education from pre-K through high school, they are ambitious and challenging. Recent increases in graduation rates and reductions in dropout rates for youth with disabilities also indicate that these targets are realistic.

Evidence: FY 2009 Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf, Department of Education Fiscal Year 2009 Performance Plan (see http://www.ed.gov/about/reports/annual/2009plan/fy09perfplan.pdf)

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The annual performance measures for the program are drawn from measures developed in accordance with the Government Performance and Results Act (GPRA) and are the same as the long-term performance measures for the program, with the exception of 5 measures. Four of these measures deal with the performance of children with disabilities on the National Assessment of Educational Progress (NAEP); NAEP data are collected only every other year. A long-term measure dealing with postsecondary outcomes, which is currently based on data collected from longitudinal studies, is being revised to reflect data collected from States on annual performance reports. The annual measures are discrete, quantifiable, and measurable. For example, two Grants to States measures deal with the performance of children on assessments; one of these is performance on the State assessments used to determine Adequate Yearly Progress (AYP) under the No Child Left Behind Act. The Department also has an operational efficiency measure that assesses how quickly monitoring reports can be turned around for the States in order to help them identify deficiencies in program operations.

Evidence: FY 2009 Congressional Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Department of Education Fiscal Year 2009 Performance Plan (see http://www.ed.gov/about/reports/annual/2009plan/fy09perfplan.pdf).

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The program has baselines and ambitious targets for all of its annual measures. These include graduation rates, dropout rates, and State progress in achieving AYP.

Evidence: FY 2009 Congressional Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Department of Education Fiscal Year 2009 Performance Plan (see http://www.ed.gov/about/reports/annual/2009plan/fy09perfplan.pdf).

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The Individuals with Disabilities Education Improvement Act of 2004 amended IDEA to require each State to develop a performance plan, which must be approved by the Secretary of Education. These performance plans include performance indicators in areas critical to improving results for children with disabilities. States are required to set baselines and annual improvement targets and to report annually on their progress. The Department's Office of Special Education Programs (OSEP) administers IDEA and has funded technical assistance (TA) centers that are committed to supporting States in improving their performance on the indicators; a center has been assigned to each indicator. OSEP has developed performance indicators for the centers related to the quality, usefulness, and relevance of their products and services.

Evidence: IDEA Sections 616 and 618. A copy of the IDEA amendments is at frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf. OSEP technical assistance network information is at www.ed.gov/parents/needs/speced/resources.html. Part B State Performance Plan (SPP) and Annual Performance Report (APR) http://www.ed.gov/policy/speced/guid/idea/bapr/index.html.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The Individuals with Disabilities Education Act Amendments of 1997 required the Department to conduct a national assessment to determine the effectiveness of the IDEA in achieving its purposes. This assessment was conducted through contracts with independent contractors, including Westat, SRI International, and Abt Associates, to conduct longitudinal evaluations in various areas. One of these, the Special Education Elementary Longitudinal Study (SEELS), included a nationally representative sample of more than 11,000 students who were ages 6-12 and receiving special education services in grades 1 through 6 in 2000. This sample group represented about 43 percent of children with disabilities. Guided by its comprehensive conceptual framework, SEELS collected data three times over 5 years on student and family characteristics; students' school programs, instruction, and accommodations; and a broad set of student outcome measures, including academic progress and social adjustment. A series of SEELS reports documented changes in student characteristics, experiences, and outcomes across the three waves of data collection and examined trends in outcomes by comparing information reported in Wave 3 with the "baseline" information reported in Wave 1 for students for whom information is available. These trend data provide a unique opportunity to understand the variables related to growth in child outcomes, with a particular focus on factors amenable to intervention (e.g., placement, instructional groups, curricular modifications). Many of the more recent findings of the study are included in the SEELS report "What Makes a Difference? Influences on Outcomes for Students with Disabilities," published in February 2007. However, the Department has not sufficiently used the results of these longitudinal studies to support program improvements.

Fortunately, the Individuals with Disabilities Education Improvement Act of 2004, which amended the IDEA, enhanced the independence of evaluation activities by transferring responsibility for them from the Office of Special Education Programs, which administers the Grants to States program, to the Institute of Education Sciences (IES). IES has begun the IDEA 2004 National Assessment of Progress to examine both the implementation of IDEA and the impact of programs and policies supported by this legislation. The implementation study involves a survey of both State educational agencies and LEAs that will yield information on the implementation of new requirements under the 2004 Amendments to IDEA. One of the impact studies, all of which employ rigorous research designs, will explore the differential effects on student outcomes of standard protocol versus problem-solving approaches to response to intervention. Response to intervention (RTI) is the practice of providing gradually increased levels of intervention depending on how children respond to lower levels of intervention; the highest levels are provided only after less intensive interventions, which have been shown to be effective with most children, have not achieved the desired results. The other impact study will determine the effects on students with disabilities of attending schools identified for improvement under NCLB at least in part because they failed to achieve AYP for this subgroup of students. Combining evaluative information from child-based longitudinal studies with policy implementation and impact studies permits a multi-faceted examination of this complex program.

Evidence: What Makes a Difference? Influences on Outcomes of Students with Disabilities, February 2007, http://www.seels.net/info_reports/what_makes_difference.htm. Special Education Elementary Longitudinal Study (SEELS) (www.seels.net).

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: While IDEA funding has more than quadrupled in recent years, there is no evidence that this increased funding has directly improved educational outcomes for students with disabilities. State and local responsibilities for educating children with disabilities are not affected by changes in the Federal funding. While data show improved outcomes for children with disabilities, there is no demonstrated relationship between Federal funding and these improvements.

Evidence: The IDEA statute's requirements, and the number of children served under IDEA, are not contingent upon the size of the Federal appropriation.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The current measure of postsecondary outcomes for children leaving secondary school relies on information that is collected through longitudinal studies, which are not likely to be funded in the near future. To address this problem, the program has taken actions to revise the measure and obtain annual data. The revised measure will be based on data collected through Annual Performance Reports. The first postsecondary data from these reports became available in February 2008; these data are being analyzed, and a baseline and targets will be developed by September 2008. Inconsistency in the definition of graduation among the States has also been a problem in gathering meaningful data on outcomes for children with disabilities and other children. One way in which the Department is attempting to address this problem is by adopting a uniform definition of graduation under NCLB, which would also be used for children with disabilities under NCLB. The Department does not yet have an outcome-based efficiency measure for the program and needs to begin development of such a measure. In addition, the Department has committed to doing a better job of using evaluation results for program improvement.

Evidence: FY 2009 Congressional Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Department of Education Fiscal Year 2009 Performance Plan (see http://www.ed.gov/about/reports/annual/2009plan/fy09perfplan.pdf) Part B State Performance Plan (SPP) and Annual Performance Report (APR) http://www.ed.gov/policy/speced/guid/idea/bapr/index.html . Press Release on Move to Uniform Graduation Rate http://www.ed.gov/news/pressreleases/2008/04/04012008.html

YES 12%
Section 2 - Strategic Planning Score 88%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department of Education collects a wide range of data through Annual Performance Reports (APRs) from States. Data on critical performance areas are used in OSEP's Continuous Improvement and Focused Monitoring Process, which targets monitoring and technical assistance on those States that show poor performance. States are required to report annually on their performance against State-established targets on 20 indicators of results (graduation, dropout, performance on assessments) and compliance (timely resolution of constituent complaints, timely completion of initial evaluations). The Department reviews each State's APR and sends the State a response letter and table detailing the status of performance for each indicator. The response letter includes the State's "determination" under section 616 of the IDEA: meets requirements, needs assistance, needs intervention, or needs substantial intervention. One problem in using and interpreting the data arises from the fact that each State sets its own standards for some of the variables that affect the data. For example, States with lower standards for graduation are likely to appear to be better performers than States with higher standards, even though the States with higher standards may be providing higher quality education to children with disabilities.

Evidence: Annual Reports to Congress on the Implementation of IDEA (www.ed.gov/about/reports/annual/osep/index.html); Web data (www.ideadata.org/AnnualTables.asp); Department Response Letters to Annual Performance Reports (http://www.ed.gov/fund/data/report/idea/partbspap/index.html) April 6, 2004 memorandum from OSEP to Chief State School Officers on Implementation of OSEP's Continuous Improvement and Focused Monitoring System (www.ed.gov/policy/speced/guid/idea/monitor/index.html). See OSEP Memorandum 04-09. IDEA sections 616 and 618. A copy of the IDEA amendments is at frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf. Press Release on Move to Uniform Graduation Rate http://www.ed.gov/news/pressreleases/2008/04/04012008.html

YES 14%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Since 2005, the Department's Organizational Assessment has included accountability measures for OSEP staff and partners. State performance on key outcomes, such as high school graduation and reading and math achievement for students with disabilities, is measured, as are the usefulness and quality of technical assistance provided by TA grantees and contractors. Measures also address significant aspects of the program's operation that affect States' ability to improve performance, such as issuing APR letters (see 3.1 above) and making timely formula grant awards. The Organizational Assessment also measures OSEP's effectiveness in closing out formula grant awards and assisting States in liquidating obligations. OSEP was rated fully successful in its most recent assessment. The "organizational priorities" element of the EDPAS Employee Performance Plan measures each employee's contribution to success in meeting agency goals and objectives. The job-specific standards developed under this element are results-driven and hold employees accountable for achieving measurable results. Those results are linked to applicable Strategic Plan responsibilities, specific GPRA indicators, the Blueprint for Management Excellence, and other organizational goals and objectives that are aligned with the President's Management Agenda and ED's mission and vision. In OSEP, the Director and Division directors are responsible for developing annual and long-term performance goals, measures, and targets, along with the data collection strategies and instruments needed to measure performance. In addition, OSEP has implemented a Continuous Improvement and Focused Monitoring System to focus monitoring on those areas that are most critical to improving results for children with disabilities; many of these areas are reflected in the program's GPRA indicators. The Individuals with Disabilities Education Improvement Act also establishes new requirements for States to develop and implement State Performance Plans (SPPs) to focus State resources on improving results for children with disabilities and to report annually on their progress or slippage in these efforts. Each State receives a letter detailing its performance on the 20 indicators in the SPP. Where a State shows slippage in performance, it must develop and implement improvement activities to address the slippage. In subsequent reviews of the APR, OSEP will evaluate the impact of the revised improvement activities and may also conduct on-site verification of the data and improvement activities.

Evidence: Department of Education EDPAS system. IDEA section 616 http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf. Part B State Performance Plan (SPP) and Annual Performance Report (APR) http://www.ed.gov/policy/speced/guid/idea/bapr/index.html.

YES 14%
3.3

Are funds (Federal and partners') obligated in a timely manner, spent for the intended purpose and accurately reported?

Explanation: Federal funds pay for only a small percentage of the total cost of special education. The IDEA statute provides broad authority for how Federal funds can be used. When Federal funds are found to be improperly spent, it is usually due to an accounting error. Federal obligations are consistently made in a timely manner. OSEP personnel review grantees' expenditure amounts monthly (more frequently as the end of the fiscal year approaches) to identify and address potential problems in the liquidation of obligations. Unexpended balances from prior years have been substantially reduced since 2001. In addition, in 2006 OSEP implemented on-site verification of States' fiscal management procedures. OSEP also actively resolves IG audit findings, such as those identified in the 2006 IG audit of the Bureau of Indian Education (BIE). OSEP worked closely with the Department of the Interior to ensure that BIE had effective policies and procedures in place to ensure that funds are spent appropriately. BIE provided documentation verifying that effective procedures had been implemented and had resulted in compliant practices related to the use and distribution of funds.

Evidence: http://wdcrobcolp01.ed.gov/CFAPPS/grantaward/start.cfm search 84.027.

YES 14%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The Department is engaged in several activities designed to improve program efficiency. OSEP's Continuous Improvement and Focused Monitoring Process (CIFMP) seeks to focus Federal managers and grantees on those areas that are most likely to improve results for children with disabilities. The Department is also taking positive steps in collecting data on children with disabilities and other children through its EDFacts initiative. EDFacts is centralizing performance data supplied by SEAs with other data assets within the Department to enable better analysis and use of data in policy development, planning, and management. The initiative is on target to improve State data capabilities by providing resources and TA; reducing the burden of duplicative data collections; and streamlining data practices at the Federal, State and local levels. The Department has an operational efficiency measure to assess how quickly monitoring reports can be turned around. This can help OSEP and the States identify deficiencies in program operations and improve program performance. However, the Department does not yet have an outcome-based efficiency measure for the program and needs to begin development of such a measure.

Evidence: Memoranda and forms related to the Department's special education Continuous Improvement and Focused Monitoring Process: www.ed.gov/policy/speced/guid/idea/monitor/index.html (see OSEP Memorandum). The EDFacts Initiative: http://www.ed.gov/about/inits/ed/edfacts/index.html

YES 14%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Under the IDEA, improving results for children with disabilities is a collaborative effort among Federal, State, and local partners. The Individuals with Disabilities Education Improvement Act of 2004 amended IDEA to reinforce this partnership by requiring each State to develop a State Performance Plan, which must be approved by the Secretary of Education. These plans must reflect how the States will improve implementation of the IDEA at the State and local levels. OSEP's Continuous Improvement and Focused Monitoring Process further helps to focus States and LEAs on critical areas such as graduation and dropouts. To assist States in implementing their plans, OSEP maintains a technical assistance network, which provides, among other things, information on best practices in special education. OSEP also collaborates extensively with the Office of Elementary and Secondary Education (OESE) to maximize the Department's TA resources and expertise. For example, since 2005 OSEP has helped fund three of the OESE Comprehensive Content TA Centers. OSEP and OESE TA staff collaborated from 2006 to 2008 on a joint conference of the TA projects funded by the two offices, and the OESE TA centers are now part of the OSEP TA Network's "MATRIX," a customer-friendly database of all TA provided by these centers, organized by State, SPP/APR indicator, and topic. OSEP has collaborated extensively with OESE on ensuring that students with disabilities are appropriately assessed under NCLB, including sponsoring a joint conference for States in July 2007, followed by a General Supervision Enhancements Grants competition to make funds available for States to develop assessments for children with disabilities counted as proficient based on alternate achievement standards (the "1% rule") and children counted as proficient based on modified achievement standards (the "2% rule"). In 2007, monthly joint TA calls were held on topics related to improving instruction and assessment of these students; additional conference calls and a second TA conference for States are being planned for 2008. OSEP has collaborated with OESE, the Office of English Language Acquisition (OELA), and the Institute of Education Sciences (IES) on providing leadership to States in the implementation of Response to Intervention (RTI), including a National Summit on RTI held in December 2007, followed by the development of extensive TA on the use of Title I and Title III of ESEA and IDEA Early Intervening Services (EIS) funds to support RTI. OSEP also collaborates with other Federal entities. For example, in 2008 it is conducting joint monitoring activities with the Rehabilitation Services Administration (RSA), serves on a multi-agency work group making recommendations on the implementation of the President's Executive Order on Coordination of Human Transportation Services, and, through its National Center on Accessing the General Curriculum, worked with the National Institute of Standards and Technology to develop the National Instructional Materials Accessibility Standard. OSERS is also working with others on activities to improve the transition of students from school to work or postsecondary education.
Three regional meetings are planned in 2008 to assist State teams from those regions (including State VR agency representatives) in identifying strategies for collecting, reporting, and using data for four transition-related State performance indicators, and a national State transition planning institute is scheduled for May 2008. In addition, OSERS is implementing an initiative to assist States with emerging comprehensive State-level transition systems by providing intensive TA to improve school and community-based services. The workgroup that planned the initiative included RSA staff, the Council of State Administrators of Vocational Rehabilitation, and a number of the Department's TA grantees.

Evidence: OSEP technical assistance network information is at www.ed.gov/parents/needs/speced/resources.html. A copy of the IDEA amendments is at frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf - see particularly section 616.

YES 14%
3.6

Does the program use strong financial management practices?

Explanation: The Department conducts periodic monitoring of State activities under this program, and States are required to conduct annual audits of their education programs. No internal control weaknesses have been reported by auditors, and few audit problems related to the use of funds are encountered. The IDEA amendments of 1997 made a number of changes in the way funds are distributed within States; a number of States initially applied the new formula for within-State distributions incorrectly, but these errors were found and corrected.

Evidence: Information on Grants to States monitoring can be found at www.ed.gov/policy/speced/guid/idea/monitor/index.html.

YES 14%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The Department routinely reviews program management to identify weaknesses. These reviews have addressed timely issuance of monitoring reports, tracking of required corrective actions from States, and tracking of States' draw-down of funds. In 2004, procedures for developing and issuing monitoring reports were revised, and reports are now issued within an average of 3 months after a visit, compared to the previous average of 13 months. A database was developed to track due dates for State submissions of corrective actions and program timelines for responding to the submissions. To ensure timely issuance of documents to States, a Department shared drive is being used to store documents that are being reviewed by multiple offices; when responses to States' submissions are going through Department clearance, the shared drive is also used for transmitting drafts for review. Recent reviews of major program functions have resulted in the development of standard language for use in routine submissions to States to ensure timely and consistent responses. The Department has implemented a systematic process of reviewing the Department's Grant Award and Payment System (GAPS) data to determine whether States have drawn down a reasonable percentage of funds. In response to initial PART improvement actions proposed in 2003, the Department now collects timely NAEP data on children with disabilities that meet the same standards as other NAEP data, and the Office of Special Education Programs, which administers the Grants to States program, has improved its collaboration with other Federal programs within the Department (e.g., the Rehabilitation Services Administration) and in other Departments (e.g., the Centers for Medicare and Medicaid Services in the Department of Health and Human Services).

Evidence: See information on monitoring at www.ed.gov/policy/speced/guid/idea/monitor/index.html

YES 14%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: Most Grants to States goals are both long-term and annual. The 5 exceptions are 4 National Assessment of Educational Progress (NAEP) goals, which are long-term because data are collected every 2 years, and 1 postsecondary outcomes goal, for which data are currently collected periodically through a longitudinal study. Of the program's 11 long-term performance goals, the program has shown solid progress over time on 5; data on the 2 goals dealing with the extent to which students with disabilities are excluded from the NAEP do not clearly indicate significant change; and data are not available on performance over time for the 4 NCLB goals. Many Grants to States goals are intended to complement each other. For example, the goals dealing with how children with disabilities exit special education include dropout and graduation goals, and these goals need to be considered together in determining whether there are improvements in the way children with disabilities exit special education. If children were not dropping out but were, at the same time, not graduating, we would be concerned about the quality of the children's secondary education. Between 2000 and 2006, the dropout rate for children with disabilities fell from 42% to 26% and the graduation rate increased from 46% to 57%. However, these improvements might be due to States changing their definitions of graduation or dropout rates and to a change in methodology for computing these rates made by the Department in 2004. There are several goals related to the educational performance of children with disabilities. In this regard, the improvements shown in NAEP performance for children with disabilities would be questionable if they were associated with more children with disabilities being excluded from testing. While there do not appear to be significant reductions in the NAEP exclusion rates for children with disabilities, the overall picture of performance for children with disabilities on the NAEP is very positive: performance has improved, and this improvement does not appear to have come at the expense of fewer children being tested. Between 2003 and 2007, the percentage of children with disabilities scoring at or above Basic on the NAEP increased from 29% to 36% for 4th-grade reading and from 29% to 33% for 8th-grade mathematics. However, preliminary results for students with disabilities on State assessments do not show meaningful improvements in performance or decreases in the performance gap between students with disabilities and their nondisabled peers.

Evidence: FY 2009 Congressional Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Information on progress toward the long-term goal of improved postsecondary outcomes is currently collected through periodic longitudinal studies. Data from these studies indicate that the percentage of students employed, enrolled in postsecondary education, or both increased from 52 percent for 1987 to 59 percent for 2003. The Department of Education is in the process of developing a new measure for postsecondary outcomes based on annual collections of data on employment and postsecondary education. There are two problems with NAEP data. First, some unknown number of children with disabilities are excluded from the NAEP sample because they attend schools specifically for children with disabilities, which are not included in the NAEP sample base; the number of these children is relatively small. Second, within the sample base, a large proportion of children with disabilities (3 percent to 5 percent of all children, or about one-third of children with disabilities) are excluded from testing because of their disabilities. In determining AYP, States may assess up to 2 percent of children using modified achievement standards and up to 1 percent using alternate assessments against alternate standards.

SMALL EXTENT 8%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program is meeting its short-term goals. Graduation rates and NAEP performance are improving, and dropout rates are declining. Between 2000 and 2006, the graduation rate for children with disabilities increased from 46% to 57% and the dropout rate fell from 42% to 26%. Moreover, the Individuals with Disabilities Education Improvement Act of 2004 reinforced the link between Federal and State goals by directing the Department to consider relevant data (such as graduation and dropout rates) in monitoring States. Meeting the AYP goals established by States under NCLB is reflected in one of the program's goals and is also expected to have a positive impact on graduation rates, NAEP performance, and dropout rates.

Evidence: FY 2009 Congressional Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf. Individuals with Disabilities Education Improvement Act of 2004. A copy of the amendments is at http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf. No Child Left Behind Act of 2001. Department of Education Fiscal Year 2006 Performance Plan (see http://www.ed.gov/about/reports/annual/2006plan/fy06perfplan.pdf) Department of Education Revised Fiscal Year 2005 Performance Plan (see http://www.ed.gov/about/reports/annual/2006plan/fy06perfplan.pdf).

LARGE EXTENT 17%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department is working to increase the effectiveness and efficiency of its special education data collection by collecting increasing amounts of the data required under section 618 of the IDEA through the EDEN/EDFacts system. EDEN/EDFacts data are collected across Department programs as part of a single cross-program data request to States and can be submitted electronically at the point in time when they become available. The State EDFacts coordinator coordinates data submissions across programs and serves as a single point of contact for resolving any questions related to accuracy. As of April 2008, EDFacts has made provisions to collect data for 6 of the 7 Part B section 618 data collections, and more than two-thirds of States are approved to submit data solely through EDFacts for at least three of these collections. It is anticipated that EDFacts will be ready to collect data for the seventh collection area (Dispute Resolution) by 2009, and by 2010 (for data referencing the 2008-09 school year) States are expected to be reporting all of their Part B section 618 data through the EDFacts system. However, none of the separate collections have been turned off yet. The Department has also made timeliness and accuracy of State-reported data an element of State Performance Plans, and States are required to report progress on implementing these plans in Annual Performance Reports to the Department. Improvements in both the timeliness and accuracy of State data are expected to enhance the information available for management decisions on program implementation.

The Department has also developed and implemented an efficiency measure to determine how quickly ED can provide findings reports back to the States after monitoring visits. The average number of workdays between the completion of site visits and the Department's responses was 123 in 2004, 107 in 2005, 50 in 2006, and 92 in 2007. This efficiency measure directly relates to program performance because the faster an SEA receives the report identifying monitoring findings, required corrective actions, and required documentation, the sooner it can make program improvements. To ensure that reports are completed on schedule, monitoring team leaders report weekly to the program director on their progress toward completing reports. Once an SEA receives the report, program staff track any required SEA response to ensure that it is provided to the Department on time.

Evidence: FY 2009 Budget Justification to Congress http://www.ed.gov/about/overview/budget/budget09/justifications/h-specialed.pdf.

SMALL EXTENT 8%
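
As a rough illustration of the efficiency measure discussed in 4.3 above, the sketch below computes an average workday turnaround from paired site-visit and response dates. This is a sketch under stated assumptions, not the Department's actual tracking system: the dates and variable names are invented, and workdays are approximated as Mondays through Fridays with no holiday calendar.

    # Sketch only: average workdays between site-visit completion and the
    # Department's response, using invented dates for two hypothetical States.
    import numpy as np

    site_visit_end = np.array(['2007-01-08', '2007-03-05'], dtype='datetime64[D]')
    response_sent  = np.array(['2007-05-14', '2007-07-16'], dtype='datetime64[D]')

    # busday_count counts Monday-Friday days between each pair of dates.
    turnaround = np.busday_count(site_visit_end, response_sent)
    print(turnaround.mean())  # 92.5, near the 92-workday actual reported for 2007
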
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: There are no comparable programs serving this population.

Evidence:

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: The Special Education Elementary Longitudinal Study (SEELS) was conducted by OSEP from 2000 to 2007 and included a nationally representative sample of more than 11,000 students who were ages 6-12 and receiving special education services in grades 1 through 6 in 2000. SEELS will release an Executive Summary highlighting study findings in 2008, including findings on student progress in academic achievement. Many of the more recent findings of the study are included in the SEELS report "What Makes a Difference? Influences on Outcomes for Students with Disabilities," published in February 2007. IDEA has long required that, to the maximum extent appropriate, children with disabilities be educated with children who are not disabled. One finding reported in the "What Makes a Difference?" report is that, after controlling for differences between students, students with disabilities who took more of their academic classes in regular education classrooms had higher reading and mathematics scores and read more fluently than students who took fewer of their academic classes in such settings. Results from the SEELS reading assessment showed that, on average, in 2001 (Wave 1) students with disabilities correctly read 76 words per minute on a fourth-grade passage. This rate was comparable to the reading abilities of fourth graders in the general population, although at the time more than half of the students with disabilities were in fifth through seventh grade. Reading fluency increased to 109 words per minute in 2004 (Wave 3), an increase of 33 words per minute, or approximately 11 words faster each year; this represents a significant improvement in reading fluency over time. SEELS also administered student assessments in mathematics that included the Woodcock-Johnson III mathematics calculation subtest, which measures computation skills ranging in difficulty from elementary computations (e.g., simple addition) to calculus (e.g., function integration). In a 2007 SEELS report on these data, the median mathematics calculation performance of all students improved over the 3-year period, with median increases of 13 to 18 W-score points. With respect to changes in course grades over time, this 2007 report included findings suggesting that, overall, students with disabilities received higher grades in Wave 3 (2004) than in Wave 1 (2001): about 59% of students earned mostly As and Bs, or Bs and Cs, in Wave 1, increasing to 66% in Wave 3. With respect to SEELS' measures of children's social adjustment, participation by students with disabilities in elementary/middle school or community social groups was high overall (around 70%) and remained stable from Wave 1 to Wave 3. Moreover, involvement in school disciplinary incidents was consistently low in both Waves 1 and 3: the majority of students with disabilities (70% and 68%, respectively) were not involved in any type of school disciplinary incident in either year.

Evidence: What Makes a Difference? Influences on Outcomes of Students with Disabilities, February 2007 http://www.seels.net/info_reports/what_makes_difference.htm. Completed reports are posted at http://www.seels.net/infoproduct.htm. Findings from the Special Education Elementary Longitudinal Study: Executive Summary 2000-2007, to be published in 2008. IDEA section 612(a)(5) http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=108_cong_public_laws&docid=f:publ446.108.pdf

SMALL EXTENT 8%
Section 4 - Program Results/Accountability Score 42%


Last updated: 09/06/2008 (2008SPR)