ExpectMore.gov


Detailed Information on the
IDEA Special Education Personnel Preparation Grants Assessment

Program Code 10001039
Program Title IDEA Special Education Personnel Preparation Grants
Department Name Department of Education
Agency/Bureau Name Office of Special Education and Rehabilitative Services
Program Type(s) Competitive Grant Program
Assessment Year 2003
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 0%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $90
FY2008 $88
FY2009 $88

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Take steps to ensure that program performance data from the program's Personnel Preparation Database are made available to the public in a transparent and meaningful manner.

Action taken, but not completed The Department already publishes data collected through the on-line Personnel Prep Data collection at www.OSEPPPD.org. However, the format in which these data are presented does not meet current OMB standards for transparency, as specified in PART guidance.
2007

Implement the new program evaluation, to determine how effectively the program achieves its key outcomes.

Action taken, but not completed In summer 2006, IES's National Center for Education Evaluation (NCEE) issued a statement of work and awarded a 7-month task order to help design an independent, rigorous evaluation strategy for this program. The design work has now been completed, and IES awarded the 4-year evaluation contract to Westat in fall 2007.
2007

Ensure that reliable and accurate data are collected for the program's new annual, long-term, and efficiency measures.

Action taken, but not completed A single year of data has been collected for 5 of the 8 measures established for this program, but the Department is still working to improve the overall quality, validity, and reliability of these data. Data have yet to be collected for the long-term performance measures.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Develop a new program evaluation strategy, along with a schedule for independent program evaluation(s), that will yield reliable information on how effectively the program achieves its key outcomes.

Completed In summer 2006, IES's National Center for Education Evaluation (NCEE) issued a statement of work and awarded a 7-month task order to help design an independent, rigorous evaluation strategy for this program. The design work has now been completed, and IES expects to award the contract to implement this evaluation in fall 2007.
2006

Develop performance measures and goals that appropriately reflect the impact of the federal government's investment in increasing the supply and/or quality of special education personnel.

Completed
2006

Develop program efficiency measures.

Completed An efficiency measure has been developed for the program, and program staff are currently reviewing data.
2005

Finalize a data collection strategy to ensure that reliable and accurate data are collected for the program's new annual and long-term performance measures.

Completed

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Percentage of scholars completing IDEA-funded training programs who are knowledgeable and skilled in scientifically- or evidence-based practices for infants, toddlers, children and youth with disabilities.


Explanation: Baseline data were collected in 2006; the 2006 actual of 91% serves as the baseline.

Year Target Actual
2006 Baseline 91%
2007 Baseline + 1% [October 2008]
2008 Baseline + 2% [October 2009]
2009 Baseline + 3% [October 2010]
Long-term Outcome

Measure: Percentage of low incidence positions that are filled by personnel who are fully qualified under IDEA.


Explanation: Baseline data have not yet been collected.

Year Target Actual
2007 Baseline [October 2008]
2008 Baseline + 1% [October 2009]
2009 Baseline + 2% [October 2010]
Annual Outcome

Measure: Percentage of projects that incorporate scientifically- or evidence-based practices.


Explanation: Baseline data were collected in 2005 and 2006.

Year Target Actual
2005 Baseline 68%
2006 Baseline 41.5%
2007 Baseline + 1% [October 2008]
2008 Baseline + 2% [October 2009]
2009 Baseline + 3% [October 2010]
Annual Output

Measure: Percentage of scholars who exit training programs prior to completion due to poor academic performance.


Explanation: Baseline data were collected in 2005.

Year Target Actual
2005 n/a 1.3%
2006 1% 1.7%
2007 1% [October 2008]
2008 1% [October 2009]
2009 1% [October 2010]
Annual Outcome

Measure: Percentage of degree/certification recipients who are working in the area(s) for which they were trained upon program completion.


Explanation:

Year Target Actual
2002 n/a 55.4%
2003 n/a 57.8%
2004 n/a 62.8%
2005 n/a 63.1%
2006 n/a 59.1%
2007 69% [October 2008]
2008 72% [October 2009]
2009 75% [October 2010]
2010 78% [October 2011]
Annual Outcome

Measure: Percentage of degree/certification recipients who are working in the area(s) for which they were trained upon program completion and who are fully qualified under IDEA.


Explanation:

Year Target Actual
2007 Establish baseline Data lag [October 2008]
Annual Outcome

Measure: Percentage of degree/certification recipients who maintain employment in the area(s) for which they were trained for 3 or more years and who are fully qualified under IDEA.


Explanation:

Year Target Actual
2007 Establish baseline Data lag [October 2008]
Annual Efficiency

Measure: Percentage of funds expended on scholars who drop out of programs because of: 1) poor academic performance; and 2) scholarship support being terminated when the federal grant to their institution ends.


Explanation: This measure is derived by dividing the total funds expended during a single academic year on (1) students who drop out for academic reasons and (2) students whose scholarship support is terminated because the federal grant to their institution ends, by the total funds expended on student support during that same academic year.
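
Restated as a formula (a sketch of the derivation above; the symbols are illustrative labels, not from the source):

\[
\text{Efficiency} = \frac{F_{\text{academic}} + F_{\text{terminated}}}{F_{\text{support}}} \times 100\%
\]

where \(F_{\text{academic}}\) is the funds expended during the academic year on students who drop out for academic reasons, \(F_{\text{terminated}}\) is the funds expended on students whose scholarship support ends when the federal grant to their institution ends, and \(F_{\text{support}}\) is the total funds expended on student support in that same year. Under this reading, the 2006 actual of 0.29% would mean these two categories together accounted for roughly 29 cents of every $100 expended on student support.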

Year Target Actual
2003 n/a 5%
2004 n/a n/a
2005 n/a n/a
2006 n/a 0.29%
2007 maintain baseline [October 2008]
2008 maintain baseline [October 2009]
2009 maintain baseline [October 2010]

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program's purpose is to improve the supply and training of special education personnel, targeting the following four areas: (1) personnel to serve children with low-incidence disabilities; (2) personnel to serve children with high-incidence disabilities; (3) leadership personnel; and (4) projects of national significance. There is disagreement, however (particularly in the high-incidence area), as to whether the primary purpose of this program is to provide scholarships to increase the quantity of aspiring special education personnel, or to improve the quality of academic programs for these personnel. The Personnel Preparation program has existed in its current form only since the 1997 IDEA re-authorization. The upcoming 2003 re-authorization of IDEA is also likely to lead to significant programmatic changes. For example, the House bill (H.R. 1350) eliminates the high-incidence authority and takes steps to focus program expenditures related to high-incidence personnel on qualitative rather than quantitative interventions.

Evidence: As defined in regulations, the program's purpose is to: "address State-identified needs for qualified personnel in special education, related services, early intervention, and regular education, to work with children with disabilities," and to "ensure that those personnel have the skills and knowledge, derived from practices that have been determined, through research and experience, to be successful, that are needed to serve those children." Also see IDEA, Part D, Section 673.

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: Persistent shortages of qualified personnel have been identified since the enactment of the Education of All Handicapped Children Act (P.L. 94-142) in 1975. Although it is not possible to provide reliable estimates of the numbers of special education teachers and related personnel trained over time, the various Federal "personnel" program authorities have made significant investments toward the goal of increasing the supply of special education personnel. The funding level for personnel authorities increased from approximately $2.5 million in FY 1963 to nearly $13 million in FY 1964, and continued to increase to nearly $55 million in the early 1980s. Despite such investments, very serious shortages still exist. The quality of special education training programs is also consistently raised as an issue requiring attention. While it is difficult to identify the specific attributes of a "high quality" training program, all projects funded under this authority are required to take steps designed to lead to improvements in quality (e.g. - by using curricula and pedagogy that are shown to be effective, and demonstrating how research-based curricula and pedagogy are incorporated into training requirements).

Evidence: Quantity - State-reported data indicate that approximately 47,532 special education teachers, roughly 11.4 percent of special education positions nationally, were not fully certified for their main teaching assignment for the 2000-2001 school year (up 1.4 percent from the 1999-2000 school year). According to SPeNSE (a national study of special education personnel issues), during the 1999-2000 school year more than 12,000 openings for special education teachers were left vacant or filled by substitutes. While there is some debate about the severity of shortages, there is agreement that shortages do exist in most States. According to recent estimates by ED, the President's Commission on Excellence in Special Education, and the Council for Exceptional Children, the U.S. will need over 200,000 teachers to fill open positions during the next 5 years. Quality (of teacher training programs) - the most serious problems are: (1) the absence of a reliable research base; and (2) insufficient understanding of which program attributes lead to improved student outcomes. Recent testimony by leading researchers before the President's Commission revealed a complete lack of research that indicates whether or not "certification and years of experience are reliable predictors of student achievement."

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any Federal, state, local or private effort?

Explanation: The program makes a unique contribution by investing in key areas of special education personnel training (mostly at the higher education level) where the incentive for meaningful State and/or local educational agencies' investment is low. Although the current IDEA Part D State Improvement Grants (SIG) program also makes significant contributions to State identified special education personnel issues, SIG funds are devoted almost exclusively to in-service (professional development) activities. While funds under both HEA Title II and ESEA Title II may also be used to train special education teachers (along with general education teachers in relevant areas of special education), there is no evidence that funds are being used for this purpose.

Evidence: Grantees supported through low-incidence, leadership, and national significance grants conduct work primarily in areas where SEAs and LEAs have little incentive to invest, or insufficient capacity to produce meaningful results. Particularly in these critical programmatic areas there is no excessive overlap with other Federal or non-Federal efforts. In each of these areas, ED is the primary source of funds.

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The program does not have any major design flaws that prevent it from meeting its defined objectives. It effectively supports training for personnel by concentrating the largest portion of its funds in areas where States have limited capacity and/or incentive to invest (e.g. - low-incidence and leadership). However, the program could be even more effectively targeted. A significant portion of the program's funds currently supports training for high-incidence personnel (from fiscal year 2002 through 2004, approximately $48 million, or 17.5 percent of all program funds, supported new and continuation grants under high-incidence). It is unlikely that these investments will lead to measurable benefits, because annual program funds ($90 million) are insignificant compared to the total funds devoted to training high-incidence personnel from other sources. (While it is not possible to develop an accurate estimate, many billions of dollars are devoted to such training annually. Examples of other sources of support include Federal student loan programs, private foundations, personal savings, and State and local tax dollars.)

Evidence: The largest portion of funds under this program is devoted to low-incidence ($35 million or 32 percent), but a significant portion ($14 million or 13 percent) supports continuation grants under high-incidence. Studies outlining the history of the Federal role in special education teacher training suggest that this role (at least in the area of high-incidence) has shifted dramatically over time. During the early years of ED's support for personnel activities (1963 to 1980), Federal contributions helped establish and solidify the field of special education as a separate profession, actually starting training programs in many institutions of higher education (IHEs) where none existed before. More recently, however, this balance has shifted significantly. Although it is not possible to develop reliable estimates of total overall investments (from all non-Part D sources) in training special education and related personnel, it appears that (in relation to the total sum) the share of funds available through the Personnel Preparation program is substantially less than it used to be. According to NCES' Integrated Postsecondary Education Data System (IPEDS), as of fall 2001 approximately 357 degree-granting institutions offered masters-level training in the area of General Special Education (this category excludes low-incidence fields of study such as deaf and hearing impaired, emotionally handicapped, and multiple handicapped). By comparison, in 2002 a total of 55 public and private institutions received awards to support training for high-incidence personnel at all levels (average annual award amount is $200,000) through the Personnel Preparation program.

YES 20%
1.5

Is the program effectively targeted, so program resources reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: As discussed above, the program's authority is intentionally broad and highly flexible (in order to increase the likelihood that investments in critical areas will impact the field). Within this broad authority, program funds are targeted effectively to activities where investments are most likely to yield the greatest impact (e.g. - low-incidence and leadership). But limiting the program's current scope of authority and/or concentrating limited program funds more strategically could produce more significant effects. For example, targeting high-incidence program funds on qualitative interventions would most likely yield a greater impact.

Evidence: Program funds currently support interventions designed to address issues related to both quality and quantity. The current statutory authority clearly envisions both of these as areas where the Federal role should be strong. For example, "activities incorporating innovative strategies to recruit and prepare teachers and other personnel to meet the needs of areas in which there are acute and persistent shortages of personnel" are explicitly authorized in IDEA section 673. Given the relative size of the program and the wide variety of activities currently authorized, however, enhancing the supply of high-incidence personnel is an unrealistic goal (for this program at its current funding level). Program funds could be more effectively utilized if the high-incidence authority were either eliminated or designed specifically to support qualitative interventions for personnel training programs.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program does not have quantifiable long-term performance goals that focus on either quantitative or qualitative aspects of the program's purpose. Program staff recently participated in a Department-wide planning activity and are currently developing program-specific performance measures. These draft indicators, however, are not yet in use. The Department is also working with OMB on developing an appropriate efficiency measure for this program.

Evidence: Program GPRA reports and assorted analyses of program-related activities.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program does not have meaningful long-term measures.

Evidence: N/A

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that demonstrate progress toward achieving the program's long-term measures?

Explanation: Four broad GPRA performance measures are now used for all IDEA Part D programs, including Personnel Preparation. These goals are intended to determine whether the Part D programs: (1) respond to critical needs of children with disabilities and their families; (2) use high quality methods and materials; (3) communicate effectively with target audiences; and (4) produce products and practices that are actually used. Unfortunately, these indicators (along with the methodologies used to measure them) do not meaningfully address the Personnel Preparation program's responsiveness to its stated goals. Program staff also maintain a separate set of "unofficial" measures that are more closely tailored to Personnel Preparation activities, and that are linked to a separate (two-year-old) data collection called the Personnel Prep Data collection (PPD). Because participation in the PPD collection is voluntary, OSEP has agreed that these data would not be used for accountability purposes. Starting next year, however, OSEP intends to require all grantees to participate in data collections as a condition for receipt of funds. Once this requirement is in place, PPD data will be used for accountability purposes. ED, program staff, and OMB are currently working to define a limited number of more appropriate and ambitious annual performance goals for this program.

Evidence: Personnel Preparation GPRA goals and indicators are: (1) The percentage of IDEA program activities that are determined by expert panels to respond to critical needs of children with disabilities and their families will increase; (2) Expert panels determine that IDEA-funded projects use current research-validated practices and materials; (3) The percentage of IDEA-funded projects that communicate appropriately with target audiences will increase; (4) Expert panels determine that practitioners, including policy-makers, administrators, teachers, parents, or others as appropriate, use products and practices developed through IDEA programs to improve results for children with disabilities. "Unofficial" goals for this program are: "PPD1: Increase in the number of IDEA-supported pre-service students who successfully complete training requirements; PPD2: Increase in the percentage of IDEA-supported pre-service student completers who are members of underrepresented populations; PPD3: Increase in the number of IDEA-supported students who are trained in areas of greatest need."

NO 0%
2.4

Does the program have baselines and ambitious targets and timeframes for its annual measures?

Explanation: See above.

Evidence: See above.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, etc.) commit to and work toward the annual and/or long-term goals of the program?

Explanation: OSEP takes specific steps to ensure that all partners commit to and work toward the existing annual goals. Program solicitations (priority packages) explicitly include all program goals, and grant applications and progress reports assess performance and continuing relevance against these goals. Although existing program measures do not meaningfully measure the program's responsiveness to its stated goals, all partners do commit to and work towards these goals. Program staff are also currently working to develop both annual and long-term goals that are more appropriate for this program. Once the revised annual and long-term goals are implemented, OSEP can continue to use its current process to ensure that all program partners actually commit to and work toward the new measures.

Evidence: Program priority packages.

NO 0%
2.6

Are independent and quality evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: No independent evaluations of this program exist.

Evidence: No independent evaluations of this program exist.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: In the absence of long- and short-term goals that yield reliable and appropriate program outcomes data, it is not possible to link the budget request to accomplishment of such goals. Budgeting is not currently linked to long-term goals and/or a strategic plan.

Evidence: N/A

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Although OSEP has been working to address its strategic planning deficiencies, meaningful actions to eliminate such deficiencies have not yet been implemented. As OSEP works to address planning deficiencies, it is placing particular emphasis on "adopting a limited number of specific, ambitious long-term performance goals and a limited number of annual performance goals." (OMB Memorandum No. 861)

Evidence: The program is actively participating in a Department-wide Teacher Quality common measures meeting (which includes all ED teacher quality staff and relevant OMB staff). Among other things, participation in this group is intended to yield a long-term program indicator. Program staff are also working with relevant Budget and OMB staff to develop more appropriate short-term goals and indicators.

NO 0%
Section 2 - Strategic Planning Score 0%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: All Personnel Preparation grantees are required to submit Annual Performance Reports and Final Reports. Data gathered in such reports occasionally translate into improved performance/accountability for grantees, but are not linked to more formal ED data/management initiatives. For example, existing GPRA indicators and data generated through this reporting process do not measure the actual performance of existing grantees. Instead, GPRA indicators gauge what grant recipients propose to accomplish. Available GPRA data are not used in program management. Limitations in the relevance of data gathered through GPRA and annual reports hamper meaningful use of such information for management and improved performance. However, the program is taking meaningful steps toward utilizing newly available data gathered through the PPD for accountability purposes. Next year, OSEP has agreed to require all grantees to participate in relevant data collections as a condition of receiving funds.

Evidence: Priority notices and EDGAR require grantees to submit Annual Performance Reports and Final Reports. Grantee participation in the separate OSEP PPD data collection is currently voluntary and, to encourage increased participation, the data are not used for accountability purposes. Starting next year, however, OSEP will require all grantees to participate in relevant data collections as a condition of receiving an award. This will help to strengthen the link between data collection and program management by allowing program staff to use the best available data for accountability purposes.

NO 0%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, cost-sharing partners, etc.) held accountable for cost, schedule and performance results?

Explanation: ED's managers are subject to EDPAS which links employee performance to relevant Strategic Plan goals and action steps, and is designed to measure the degree to which a manager contributes to improving program performance. However, ED cannot demonstrate specific ways by which OSEP's managers are held accountable for linking their performance standards to the program's long term and annual measures. Program partners are subject to project reviews and grant monitoring but these oversight activities are not designed to link partners to specific performance goals.

Evidence:  

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: The program successfully obligates funds by the end of each fiscal year. OSEP should institute changes to ensure that grant competitions are announced on a regular schedule and provide sufficient time for preparation and review of applications. Funds are spent for the intended purposes; this is assessed through grant and contract monitoring and intensive grant reviews for major grant programs. No improper uses of funds have been identified.

Evidence: Contract files; summaries of formative and summative grant reviews.

YES 10%
3.4

Does the program have procedures (e.g., competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: This program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing "One-ED" -- an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements.

Evidence: N/A

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: There are many instances of the program collaborating and coordinating with related programs. Program staff recently participated in Department-wide teacher quality meetings designed to yield new long-term program measures for all teacher quality programs. The indicator generated through these meetings that relates to special educators will be implemented in 2003. Additional examples of program collaboration include two summits, hosted through the Center on Personnel Studies in Special Education (COPSSE), that brought together policy-makers from state and local education agencies, related Federal programs, and non-profits to target COPSSE's research agenda on issues important to practitioners. The program also supported the development of model standards for special educators through the Council of Chief State School Officers (CCSSO). These model standards articulated what all general and special education teachers should know and be able to do to effectively teach students with disabilities. The standards specifically address the nature of the collaborative relationship between general and special education teachers.

Evidence: Teacher Quality "Common Measures" materials; Departmental "teacher quality" team participation materials; For a discussion of how the COPSSE policy advisor meetings translate into the program research agenda (and how COPSSE has implemented specific recommendations) see the "3+2 Evaluation" of the COPSSE program at: www.coe.ufl.edu/copsse/Briefing%20Book.pdf; See "Model Standards for Licensing General and Special Education Teachers of Students with Disabilities: A Resource for State Dialogue (2001)": www.ccsso.org/content/pdfs/SpedStds.pdf

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: No internal control weaknesses have been reported by auditors. The Department has a system for identifying excessive drawdowns, and can put individual grantees on probation, under which drawdowns must be approved.

Evidence: N/A

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: OSEP has taken steps to address specific management deficiencies for the Personnel Prep program. Most significantly, program staff recently developed and implemented a data collection designed to help staff manage the program more effectively. While OSEP's inability to meaningfully address strategic planning deficiencies is a critical fault, it is an agency-level deficiency that affects this program less because the program has relatively few priorities and annual competitions. The priorities for this program are generally well written, and competitions are managed in an efficient and timely manner.

Evidence: Deficiencies at the program planning level (e.g. - the funding split between low incidence, high incidence, and leadership) are identified through forums, peer reviews, PPD data collection, and other processes implemented at the program level.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Grants are awarded through a competitive peer review process that includes a qualified assessment of merit, and many grantees have demonstrated track records for preparing special education teachers. The President's Commission on Excellence in Special Education recommended that OSEP's peer review process be improved in several ways, including: ensuring appropriate separation between program management and peer review responsibilities; developing a more effective process for recruiting and utilizing peer reviewers; and ensuring that the peer review process is organized in a way that actively encourages progressive improvement of proposals through revision and resubmission. OSEP has already taken specific steps to address such concerns. For example, OSEP recently engaged the National Academy of Sciences to conduct a study to improve the quality of peer review in IDEA Part D. An internal agency group is also developing procedures to standardize the training of reviewers.

Evidence: Program funds are used to support peer review costs. 100% of applicants are subject to peer review. "A New Era: Revitalizing Special Education for Children and Their Families" - the President's Commission on Excellence in Special Education

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The program has several mechanisms designed to generate meaningful information on grantees' use of funds, including periodic regional site visits, periodic institutional site visits, analysis of data (submitted in Annual Reports and through the PPD) that relate to intended program outcomes, and various meetings intended to develop and enhance the relationship and the level of understanding between grantees and ED/OSEP program staff.

Evidence: PPD reporting structure is a dedicated on-line system. Site visits are typically conducted where high concentrations of funds occur, although occasionally institutions are visited because a specific deficiency/problem has been identified and requires attention.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: GPRA data are now reported in several formats (including on the web), and are made available to the public through annual reports on the implementation of IDEA. Grantee final reports are available to the public, just as Research final reports are; however, information contained in these reports is not aggregated and disaggregated in a way that "relates to the impact of the program" as required by the OMB guidance document. Similarly, the program does not have in place a system to "collect and present publicly information that captures the most important impacts of program performance."

Evidence: http://ericec.org

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome performance goals?

Explanation: Program does not yet have long-term goals.

Evidence: N/A

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Program is currently working to develop and implement more appropriate annual performance goals.

Evidence: N/A

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program performance goals each year?

Explanation: The Department is working with OMB on developing an appropriate efficiency measure for this program.

Evidence: N/A

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., that have similar purpose and goals?

Explanation: Although programs with similar goals and purposes do exist in other areas of education (e.g. - non-special education personnel training/supply programs such as HEA Title II, ESEA Title II, and various private foundation programs focusing on teacher quality), there is no reliable basis for comparing Personnel Preparation to such programs. No current studies, analyses, or evaluations have attempted to make such comparisons, and in the absence of reliable comparisons between these programs, further analysis would be arbitrary.

Evidence: N/A

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: No independent evaluations of this program exist.

Evidence: N/A

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2003SPR