| Program Code | 10000204 |
|---|---|
| Program Title | Tech-Prep Education State Grants |
| Department Name | Department of Education |
| Agency/Bureau Name | Office of Vocational and Adult Education |
| Program Type(s) | Block/Formula Grant |
| Assessment Year | 2002 |
| Assessment Rating | Results Not Demonstrated |
| Assessment Section Scores | |
| Program Funding Level (in millions) | |
| Year Began | Improvement Plan | Status | Comments |
|---|---|---|---|
| 2003 | The Budget proposes to terminate the program so that Federal resources can be redirected to programs with a proven track record of effectiveness, such as Pell Grants. | Action taken, but not completed | Action proposed again in the FY 2009 President's Budget. |
| 2006 | Issuing regulations on implementation of performance measures systems under the new Perkins law. | Action taken, but not completed | The Department determined not to issue regulations on implementing the performance measures under the 2006 Perkins Act. The Department has issued guidance on measurement approaches for the program's indicators and expects to issue additional nonregulatory guidance in 2008. |
| 2006 | Providing technical assistance to recipients during 2008 on improving the quality of performance data. | Action taken, but not completed | The Department will conduct Data Quality Institutes to promote valid and reliable data collection and submission by States. The National Research Center will also initiate technical assistance projects to help States improve data quality. In addition, Department staff conduct bi-monthly teleconference calls with States to discuss data issues. |
| Term | Type | Measure |
|---|---|---|
| Long-term/Annual | Outcome | For postsecondary students: percentage of participants who entered employment in the first quarter after program exit. |
| Long-term | Outcome | For secondary students: attainment of a high school diploma, certificate, or GED. |
| Long-term/Annual | Outcome | For secondary students: entry into employment or enrollment in postsecondary education/advanced training. |
| Long-term/Annual | Efficiency | Cost per secondary student. Explanation: OMB job training common measure for efficiency. |
Section 1 - Program Purpose & Design

| Number | Question | Answer | Score |
|---|---|---|---|
| 1.1 | Is the program purpose clear? Explanation: The program provides financial assistance to states in support of expanding 2 + 2 programs (i.e., 2 years of secondary education transitioning into 2 years of postsecondary education) with the goal of increasing the number of students who receive technical degrees. Evidence: Sec. 202(a)(3) of the Carl D. Perkins Vocational and Applied Technology Education Act (hereinafter, "the Act"). | YES | 20% |
| 1.2 | Does the program address a specific interest, problem or need? Explanation: Labor market data demonstrate that the supply of jobs requiring technical degrees exceeds the number of individuals with such degrees, and the disparity is expected to grow in the coming years. Evidence: National Assessment of Vocational Education, Interim Report for 2002. | YES | 20% |
| 1.3 | Is the program designed to have a significant impact in addressing the interest, problem or need? Explanation: Because the impacts of the program are not currently known, the effect of reducing or increasing the federal investment in this program is unclear. Evidence: The Act requires grantees to report on outcomes for Tech Prep students. However, to date, the Department has only baseline data on grantee performance. Moreover, grantee performance reporting suffers from similar data integrity problems as found in the Voc. Ed. State Grant program: non-uniform definitions of a Tech Prep student and an inability to aggregate outcome data to the national level. | NO | 0% |
| 1.4 | Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)? Explanation: All relevant activities under this program are allowable under the Vocational Education State Grant program. Evidence: For example, nothing in the law prevents a Voc Ed grantee from using funds to develop a 2 + 2 program. | NO | 0% |
| 1.5 | Is the program optimally designed to address the interest, problem or need? Explanation: There is no conclusive evidence that a different design would improve program performance. However, the absence of conclusive evidence does not mean that program improvements are not needed. Evidence: | YES | 20% |
| | Section 1 - Program Purpose & Design Score | | 60% |
Section 2 - Strategic Planning

| Number | Question | Answer | Score |
|---|---|---|---|
| 2.1 | Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program? Explanation: Consistent with measures established under the job training common measures framework, the Department is working to develop several long-term indicators that are tied to short-term goals and are consistent with the program's scope and activities. Evidence: | NO | 0% |
| 2.2 | Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals? Explanation: Through the common measures matrix, the program has established a limited set of performance indicators designed to measure program performance and progress, including, for example, placement in employment, degree attainment, and skill attainment. However, the Department must establish numerical targets and ensure that performance data exist to report on those targets. In addition, any short-term measures (whether the common measures or additional measures) must be linked to long-term goals. To the extent performance targets are set by states, a process should be put in place to ensure that state-defined targets are appropriately rigorous and that a methodology can be developed for aggregating performance data at the national level. Evidence: | NO | 0% |
| 2.3 | Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program? Explanation: While the program receives regular and timely annual performance information from grantees, the information cannot yet be tied to a strategic planning framework in which a limited number of annual performance goals demonstrate progress toward achieving long-term goals. Evidence: Instructions for this question indicate that a "no" is required if the program received a "no" for both questions 1 and 2 of this section. | NO | 0% |
| 2.4 | Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives? Explanation: Considerable collaboration and coordination occurs at both the Federal level (e.g., with DOL) and at the grantee level (e.g., with WIA title I one-stops). Evidence: | YES | 14% |
| 2.5 | Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness? Explanation: The National Assessment of Vocational Education (NAVE) is an independent analysis, conducted every 5 years, that tracks appropriate program outcomes and the use of Federal dollars. Evidence: | YES | 14% |
| 2.6 | Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known? Explanation: The program does not have a strategic planning framework in which a limited number of annual performance goals demonstrate progress toward achieving long-term goals. Thus, at this time, performance goals are not aligned with budget policy. Evidence: There are limited reliable data on critical performance measures. Specifically, educational and employment outcome data are not uniform across states and cannot be aggregated (e.g., states set their own thresholds, and states have different definitions for who is a Tech-Prep student). | NO | 0% |
| 2.7 | Has the program taken meaningful steps to address its strategic planning deficiencies? Explanation: The Department has undertaken a process to make strategic planning improvements. This process is being coordinated with the Department's ongoing development of a reauthorization proposal. Evidence: | YES | 14% |
| | Section 2 - Strategic Planning Score | | 43% |
Section 3 - Program Management

| Number | Question | Answer | Score |
|---|---|---|---|
| 3.1 | Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance? Explanation: While the program receives regular and timely annual performance information from grantees, the information cannot yet be tied to a strategic planning framework in which a limited number of annual performance goals demonstrate progress toward achieving long-term goals. In addition, there are data quality problems with the performance information currently obtained. Evidence: | NO | 0% |
| 3.2 | Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results? Explanation: This program has not instituted an appraisal system that holds Federal managers accountable for grantee performance. However, as part of the President's Management Agenda, the Department is planning to implement an agency-wide system, EDPAS, that links employee performance to progress on strategic planning goals. Grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits. The program's current accountability framework needs to be further strengthened to ensure that poor-performing grantees submit improvement strategies and have grants reduced or eliminated for serious or persistent failures to comply. Evidence: | NO | 0% |
| 3.3 | Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose? Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended. Evidence: | YES | 11% |
| 3.4 | Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution? Explanation: This program has not yet instituted procedures to measure and improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements. Evidence: | NO | 0% |
| 3.5 | Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels? Explanation: Education's 2004 Budget satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute 1.1 percent of the program's full costs. However, Education has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels. The program does not have sufficiently valid and reliable performance information to assess the impact of the Federal investment. Evidence: | NO | 0% |
| 3.6 | Does the program use strong financial management practices? Explanation: The program has a positive audit history, with no evidence of internal control weaknesses. Evidence: | YES | 11% |
| 3.7 | Has the program taken meaningful steps to address its management deficiencies? Explanation: The Department has identified implementation problems that persist at the grantee level and has taken steps to increase compliance monitoring efforts and strengthen grantee accountability. Evidence: | YES | 11% |
| 3.B1 | Does the program have oversight practices that provide sufficient knowledge of grantee activities? Explanation: The Department maintains information on grantee activities through consolidated annual reports, site visits and compliance monitoring, and technical assistance activities. Evidence: | YES | 11% |
| 3.B2 | Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner? Explanation: The performance reports are annual and widely disseminated. Work remains both to rectify data quality problems and to make those problems more transparent. Evidence: | YES | 11% |
| | Section 3 - Program Management Score | | 56% |
Section 4 - Program Results/Accountability

| Number | Question | Answer | Score |
|---|---|---|---|
| 4.1 | Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)? Explanation: Consistent with measures established under the job training common measures framework, the Department is working to develop several long-term indicators that are tied to short-term goals and are consistent with the program's scope and activities. Evidence: | NO | 0% |
| 4.2 | Does the program (including program partners) achieve its annual performance goals? Explanation: Through the common measures matrix, the program has established a limited set of performance indicators designed to measure program impacts, including, for example, placement in employment, degree attainment, and skill attainment. However, the Department must establish numerical targets and ensure that performance data exist to report on those targets. In addition, any short-term measures (whether the common measures or additional measures) must be linked to long-term goals. Evidence: | NO | 0% |
| 4.3 | Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year? Explanation: The common measures framework includes an efficiency measure, cost per participant. The Department estimates that the annual cost per participant is $70. However, the lack of valid outcome data makes it impossible to link these costs to the achievement of program goals. Evidence: | NO | 0% |
| 4.4 | Does the performance of this program compare favorably to other programs with similar purpose and goals? Explanation: To date, the Department has been unable to provide data on the common measures. NAVE results and individual State performance reports (non-aggregated) indicate that the program as currently constituted is not effective in achieving academic and employment outcomes. Evidence: | NO | 0% |
| 4.5 | Do independent and quality evaluations of this program indicate that the program is effective and achieving results? Explanation: The most recent NAVE findings, released in December 2002, provide preliminary data on vocational education generally but do not yet disaggregate results specific to Tech-Prep. Historically, the NAVE has provided mixed results on the effectiveness of vocational education in general. The 1994 NAVE concluded that vocational education provides little or no measurable advantage for high school students in terms of high school completion, postsecondary enrollment, and academic achievement. Preliminary results from the 2002 NAVE confirm the 1994 findings and find further that substituting vocational courses for academic courses adversely affects student academic achievement and college enrollment. However, the 2002 NAVE did find that taking a high school vocational course (versus taking no vocational courses) may have a positive impact on earnings. Evidence: 1994, 2002 NAVE. | NO | 0% |
| | Section 4 - Program Results/Accountability Score | | 0% |