ExpectMore.gov


Detailed Information on the
Even Start Assessment

Program Code 10000186
Program Title Even Start
Department Name Department of Education
Agency/Bureau Name Office of Elementary and Secondary Education
Program Type(s) Block/Formula Grant
Assessment Year 2002
Assessment Rating Ineffective
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 43%
Program Management 63%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $82
FY2008 $66
FY2009 $0

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Support grantees in the delivery of high-quality services by monitoring 16 States during fiscal year 2008.

Action taken, but not completed The Department has visited Colorado, Florida, Oklahoma, New Mexico, Texas, Maryland, Arizona, Pennsylvania, Idaho, Illinois, Wisconsin, North Dakota, and North Carolina. Final monitoring reports have been issued to Colorado, Florida, Oklahoma, New Mexico, Texas, Maryland, and Arizona. All other reports are in draft form and have not yet been released.
2005

Work with Congress to eliminate funding for the program due to program ineffectiveness, and redirect funding to other education programs.

Action taken, but not completed The Congress and the Administration have reduced funding for the program from $247 million in FY 2004 to $82 million in FY 2007. The FY 2008 Omnibus bill included a further reduction to $66,454,000 for Even Start. The President's Budget requested no funding for this program for FY 2009.
2005

Measure outcomes, such as early literacy skills for children and high school completion for adults, and establish ambitious annual and long-term performance targets. Adjust performance targets based on best available data.

Action taken, but not completed The GPRA measures for preschool-aged participants in the Even Start program have been revised to be the same as the measures for Early Reading First. The Department is considering adjusting performance targets to better reflect consistency across related programs and increased State reporting.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Annual Outcome

Measure: Percentage of Even Start adults showing significant learning gains on measures of literacy and Even Start LEP adults showing significant learning gains on measures of English language acquisition, as measured by the CASAS and TABE.


Explanation:

Year Target Actual
2003 Baseline 70.0%
2004 70.7% 60.5%
2005 71.4% 63.8%
2006 72.1% 66.3%
2007 70.9% 68.4%
2008 72.1%
2009 72.1%
Annual Outcome

Measure: The percentage of Even Start adults with a high school completion goal who earn a high school diploma.


Explanation: This measure replaces the measure "The percentage of Even Start adults who receive a high school diploma" to focus only on those Even Start adults who were eligible for and aspiring toward a high school completion goal. This change is consistent with guidance the Department gave to States for calculating data for this measure from 2005 on.

Year Target Actual
2003 Baseline 59%
2004 59.6% 44.6%
2005 60.2% 47.2%
2006 60.8% 77.6%
2007 60.8% 68.0%
2008 60.8%
2009 60.8%
Annual Outcome

Measure: The percentage of Even Start adults with a goal of General Equivalency Diploma (GED) who earn a GED.


Explanation: This measure replaces the measure "The percentage of Even Start adults who receive a GED" to focus only on those Even Start adults who were eligible for and aspiring toward a GED. This change is consistent with guidance the Department gave to States for calculating data for this measure from 2005 on.

Year Target Actual
2003 Baseline 44.6%
2004 44.4% 80.2%
2005 44.9% 57.9%
2006 45.3% 47.3%
2007 45.3% 34.0%
2008 48%
2009 48%
Annual Outcome

Measure: The percentage of Even Start children entering kindergarten who achieve age-appropriate benchmarks on the Peabody Picture Vocabulary Test-III (Receptive).


Explanation:

Year Target Actual
2007 Baseline 66%
2008 67%
Annual Outcome

Measure: The number of letters Even Start children can identify as measured by the PALS Pre-K Uppercase Letter Naming Subtask.


Explanation:

Year Target Actual
2006 Baseline 15
2007 16 16
2008 17
2009 18
Annual Outcome

Measure: Percentage of Even Start parents who show improvement on measures of parental support for children's learning in the home, school environment, or through literacy activities.


Explanation:

Year Target Actual
2006 Baseline 70%
2007 72% 76%
2008 74%
2009 76%
Annual Outcome

Measure: The percentage of Even Start children reading at grade level (Targets under development)


Explanation: No data are available for this measure because a specific grade level was not defined.

Annual Outcome

Measure: The percentage of preschool-aged children participating in Even Start programs who achieve significant gains in oral language skills as measured by the Peabody Picture Vocabulary Test-III (PPVT-III, Receptive). (New measure, added February 2008)


Explanation: This is a measure for the Early Reading First program that has been added to Even Start for cross-program comparison.

Year Target Actual
2004 Baseline 82.9%
2005 83.7% 79.8%
2006 84.6% 75.3%
2007 84.6% 75.0%
2008 85.0%
Annual Efficiency

Measure: Number of days it takes for the Department of Education to send a monitoring report to States after monitoring visits.


Explanation: 

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: Purpose is to help break the cycle of poverty and illiteracy for low-income families by integrating early childhood education, adult literacy, and parenting education into a unified family literacy program.

Evidence: Section 1231 of the No Child Left Behind Act of 2001

YES 20%
1.2

Does the program address a specific interest, problem or need?

Explanation: About 4% of adults cannot read at all and 21% have only rudimentary reading and writing skills. 56% of beginning kindergartners are at risk of school failure because of factors such as low family income and low parent education.

Evidence: 1992 U.S. Department of Education survey and ED's Early Childhood Longitudinal Study, 2000.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: There is no evidence indicating that increases or decreases in Federal funding for this program would have a clear impact on family literacy.

Evidence: Third National Evaluation of Even Start.

NO 0%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: The program is duplicative of several other programs including: Head Start, Adult Education, Early Reading First, Reading First, and Title I of the Elementary and Secondary Education Act (ESEA).

Evidence: Head Start, Early Reading First, and Even Start serve similar early childhood populations; Adult Education and Even Start serve similar adult populations. In Title I and Reading First, family literacy efforts are allowable activities.

NO 0%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation: There is no evidence indicating that the structure of the program -- formula grants to States, competitive grants to the local level -- is the wrong design for the program. This does not mean that program improvements are unnecessary.

Evidence:  

YES 20%
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program has two outcome goals for adults and two for children that directly support the program's mission and purpose. However, the program lacks numerical targets for its long-term goals.

Evidence: Even Start indicators of program quality and Section 1240 of the No Child Left Behind Act of 2001.

NO 0%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: The program must set numerical targets for its annual goals and ensure that data exist to report on whether those targets have been met.

Evidence:  

NO 0%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: SEAs are required to develop indicators of program quality to monitor, evaluate, and improve their programs.

Evidence: While States have begun to implement the statutory requirements to set performance goals around specified measures, those goals do not fit into a strategic framework because the Department has not established numerical targets for its own performance goals (see Questions 1 and 2 in this section). A process should be put in place to ensure that State goals are rigorous and help ensure achievement of the national goals set by the Department.

NO 0%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: Program staff at the national, state, and local levels coordinate with Title I of ESEA, Vocational Education, and Head Start programs.

Evidence: Even Start has conducted two National Forums jointly with the Vocational Education and Head Start programs. The first brought together local teams representing the three programs to write action plans for promoting family literacy. The second culminated in the publication of research papers representing each program.

YES 14%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: Education conducts independent evaluations of this program every 3-5 years.

Evidence:  

YES 14%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: The program does not have a strategic planning framework in which a limited number of annual performance goals demonstrate progress toward achieving long-term goals. Thus, performance goals are not currently aligned with budget policy.

Evidence:  

NO 0%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: Even Start has developed an action plan addressing the program's long term planning deficiencies.

Evidence:  

YES 14%
Section 2 - Strategic Planning Score 43%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The program collects annual data through an extensive data collection system and uses it to target technical assistance activities.

Evidence:  

YES 13%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: This program has not instituted an appraisal system that holds Federal managers accountable for grantee performance. However, as part of the President's Management Agenda, the Department is planning to implement an agency-wide system -- EDPAS -- that links employee performance to progress on strategic planning goals. Grantee performance is monitored on an annual basis through review and approval of annual budget plans, compliance reviews, audits, and site visits.

Evidence:  

NO 0%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Grants for the State formula grant program are obligated on schedule. In addition, evaluation and technical assistance funds are obligated on schedule based on a spending plan. However, the funds for the competitive portion of the program are often not obligated in time to meet Education's internal schedule, even though they are obligated before the legal deadline.

Evidence:  

YES 12%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: This program has not yet implemented measures and procedures to improve cost efficiency in program execution. However, as part of the President's Management Agenda, the Department is implementing an agency-wide initiative to re-evaluate the efficiency of every significant business function, including the development of unit measures and the consideration of competitive sourcing and IT improvements.

Evidence:  

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: ED's FY 2004 budget submission satisfies the first part of the question by presenting the anticipated S&E expenditures (including retirement costs) for this program, which constitute 1 percent of the program's full costs. However, ED has not satisfied the second part of the question because program performance changes are not identified with changes in funding levels. The program does not have sufficiently valid and reliable performance information to assess the impact of the Federal investment.

Evidence:  

NO 0%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 12%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Material internal management deficiencies have not been identified for this program.

Evidence:  

NA 0%
3.B1

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: ED collects and reviews extensive summaries of local activities.

Evidence:  

YES 13%
3.B2

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The program collects and reports annual data through an extensive data collection system, and has published summaries of local evaluations.

Evidence:  

YES 13%
Section 3 - Program Management Score 63%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: Since targets have not been set, it is not currently possible to assess progress toward meeting them.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Since targets have not been set, it is not currently possible to assess progress toward meeting them.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: This program does not lend itself to the development of efficiency measures that link the Federal investment to program outcomes because its funds are combined with a significant amount of other program dollars from the Federal, State, and local levels to achieve its goals.

Evidence:  

NA  %
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation: No comparable data are available for other programs.

Evidence:  

NA 0%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Education has conducted three major evaluations of this program, two of which included a small experimental design study. None of the studies could show that the parents or children who received these services made greater gains than those who did not. Results from three States that have conducted their own evaluations are more positive than the national results; however, these evaluations were not as rigorous as the national evaluations.

Evidence: National Evaluations of the Even Start Family Literacy Program.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2002SPR