ExpectMore.gov


Detailed Information on the Teaching American History Assessment

Program Code 10002072
Program Title Teaching American History
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 50%
Program Management 70%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $120
FY2008 $118
FY2009 $50

Ongoing Program Improvement Plans

Year Began: 2006
Improvement Plan: Continue to improve collection of program performance data and use these data to inform technical assistance, funding recommendations, program management, and evaluation.
Status: Action taken, but not completed
Comments: The Department is developing new performance measures for the program that are designed to yield accurate and reliable data. Targets will be developed once baseline data for the annual measures are established. The Department will use data from grantee reports and evaluations in its own evaluation of the program. Performance data also help the Department track grantee progress and target technical assistance to grantees.

Year Began: 2006
Improvement Plan: Make program performance information available to the public in a transparent manner.
Status: Action taken, but not completed
Comments: In fiscal year 2008, the Department will make performance reports and other appropriate performance-related information available on the Teaching American History website.

Year Began: 2006
Improvement Plan: Establish program efficiency measures that assess the cost of achieving key program outcomes and begin reporting on these measures.
Status: Action taken, but not completed
Comments: The Department of Education has established two efficiency measures for the program. The first, recently modified from an earlier measure, is the cost per program participant who completes 75 percent or more of the total hours of professional development offered. The second, added in the spring of 2008, is the cost per teacher hour of professional development attended. The Department expects to have baseline data for these measures in the summer of 2009.
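Expressed as formulas, the two efficiency measures are simple ratios; the sketch below is illustrative only, since the specific cost figures and participant/hour counts the Department will use are not stated in this assessment.

% Illustrative sketch of the two efficiency measures described above.
% The cost and count inputs are assumptions; the Department's actual data
% sources and definitions are not specified in this assessment.
\[
\text{Cost per completing participant} =
  \frac{\text{total program cost}}
       {\text{number of participants completing at least 75\% of offered PD hours}}
\]
\[
\text{Cost per teacher hour} =
  \frac{\text{total program cost}}
       {\text{total teacher hours of professional development attended}}
\]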

Completed Program Improvement Plans

Year Began: 2004
Improvement Plan: Fully implement the experimental/quasi-experimental program evaluation strategy.
Status: Completed
Comments: The Department has published a national evaluation of the program. Additionally, the Department included a competitive priority for grantees to conduct scientifically based evaluations in its 2004, 2005, 2006, and 2007 annual grant competitions. The Department is providing technical assistance to help qualifying grantees conduct their evaluations.

Program Performance Measures

Term: Annual | Type: Outcome

Measure: Percentage of students in studies of educational effectiveness who demonstrate higher achievement than those in control or comparison groups.

Explanation: Students in experimental and quasi-experimental studies of program-supported projects will demonstrate higher achievement on course content measures and/or statewide U.S. history assessments than students in control and comparison groups.

Year | Target | Actual
2007 | Set Baseline | Data lag (Apr 2009)
2008 |  |
2009 |  |

Term: Long-term/Annual | Type: Outcome

Measure: Percentage of participant teachers who demonstrate an increased understanding of American history, as measured by nationally validated tests of American history.

Explanation:

Year | Target | Actual
2007 | Set Baseline | Data lag (Apr 2009)
2008 |  |
2009 |  |

Term: Annual | Type: Efficiency

Measure: Cost per teacher participant. (New measure, added February 2007)

Explanation:

Year | Target | Actual
2007 | Set Baseline | Data lag (Feb. 2008)

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the program is to support programs that raise student achievement by improving teachers' knowledge, understanding, and appreciation of American history.

Evidence:  

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: The 2001 National Assessment of Educational Progress found that approximately 90 percent of high school seniors scored below the proficient level, and 57 percent below the basic level, in their knowledge of American history. In addition, while the proportion of fourth- and eighth-grade students scoring at or above the basic level has improved since 1994, those gains have disappeared as students move from elementary and middle school to high school.

Evidence: 2001 National Assessment of Educational Progress -- U.S. History

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Although LEAs may provide professional development in U.S. history with funds from other Federal education programs, including Improving Teacher Quality State Grants, this is the only Federal program that focuses solely on teaching American history, provides competitive grants to ensure that projects are of high quality, and requires each grant to be carried out through a partnership between one or more LEAs and one or more organizations that can provide professional development in teaching American history. Eligible partner organizations include IHEs, history organizations, humanities organizations, libraries, and museums.

Evidence:  

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: There is no evidence that the program design is flawed.

Evidence:  

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: There is no evidence indicating that the program is not effectively targeted.

Evidence:  

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program does not yet have a long-term performance measure.

Evidence:  

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program does not have targets and timeframes for a long-term performance measure.

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Department has established the following performance measure for the program: Students in experimental and quasi-experimental studies of educational effectiveness in Teaching of Traditional American History projects will demonstrate higher achievement on course content measures and/or statewide U.S. history assessments than students in control and comparison groups, as measured by: (1) the percentage of students in studies of educational effectiveness who demonstrate higher achievement than those in control or comparison groups; and (2) the percentage of school districts that demonstrate higher educational achievement for students in Teaching of Traditional American History classrooms than those in control or comparison groups.

Evidence:  

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Department expects to have baseline data for this indicator in the winter of 2005. The Department will establish targets once baseline information is available.

Evidence:  

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Through annual performance reports, the Department confirms grantee commitment to working towards the program's goal of improving student achievement by providing high-quality professional development to elementary- and secondary-level teachers of American history.

Evidence: The Department will determine how well partners are meeting the program's goals through annual performance reports.

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The Department is currently conducting a thirty-month evaluation of the program. The evaluation addresses questions related to the characteristics of funded activities; the types of instructional training and support services teachers are receiving, including the specific subjects and areas of American history in which teachers receive training; and the qualifications and characteristics of teachers who participate in the grant projects. Information will be collected through surveys of project directors and project participants and through observation of training sessions offered through the program. The Department expects to have the final report of this evaluation completed in 2005.

Evidence:  

YES 12%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Department has not satisfied the first part of the question because program performance changes are not identified with changes in funding levels. The program, at this time, does not have sufficiently valid and reliable performance information to assess (whether directly or indirectly) the impact of the Federal investment. However, the Department has satisfied the second part of the question because the Department's budget submissions show the full cost of the program (including S&E).

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department is working to ensure that high-quality data will be available to report on the performance measure. In the fiscal year 2003 competition for the program, applicants received up to 20 additional points if they proposed a project that was designed to determine, through rigorous evaluation, whether the implemented program produces meaningful effects on student achievement or teacher performance. Approximately 40 grantees scored highly on this priority and met in Washington in January 2004 to discuss the challenges of designing and conducting these evaluations. Data from these grantee evaluations will provide the information needed to report on the performance measure.

Evidence:  

YES 12%
Section 2 - Strategic Planning Score 50%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department collects data annually from program performance reports. The Department is collecting additional performance data from approximately 50 grantees that are conducting evaluations using quasi-experimental or experimental designs.

Evidence:  

YES 10%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how Federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence: The President's Management Agenda scorecard (Human Capital and Budget & Performance Integration initiatives) notes ED's efforts to improve accountability. The Department's Discretionary Grants Improvement Team (DiGIT) recommendations indicate that ED is reviewing its grant policies and regulations.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for purposes intended. The Department reserves some funds for program evaluation, which are obligated based on an evaluation plan.

Evidence: Evidence suggests that grantees are drawing down funds at an acceptable rate.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: To date, the Department has not established procedures for this program to measure and achieve efficiencies in program operations. However, ED is in the process of developing its competitive sourcing Green Plan and is working to improve the efficiency of its grantmaking activities. The Department has also established a strengthened Investment Review Board to review and approve information technology purchases agency-wide.

Evidence: Department Investment Review Board materials. ED's Discretionary Grants Improvement Team (DiGIT) recommendations.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program collaborated with other teacher quality programs in the Department to develop common measures for teacher quality programs. In addition, the Department works with the National Endowment for the Humanities, which administers a separate training program for history teachers. NEH staff work with Department staff on grant reviews and on conferences and forums to promote the two programs.

Evidence:  

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: While material internal management deficiencies have not been identified for this program, the program has put in place a system to identify potential problems.

Evidence: Program staff monitor excessive drawdowns of funds to prevent high-risk situations.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The Department awards grants on a point system that is based on selection criteria published in the Federal Register.

Evidence:  

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through annual performance reports, site visits, and technical assistance activities.

Evidence:  

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: The Department collects performance data annually from grantees but has not yet displayed this information to the public in a meaningful manner. Education is developing a department-wide approach to improve the way programs provide performance information to the public. In 2004, Education will conduct pilots with selected programs to assess effective and efficient strategies to share meaningful and transparent information.

Evidence:  

NO 0%
Section 3 - Program Management Score 70%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Department has not yet established a long-term performance measure for this program.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: While the Department has established an annual performance measure for this program, data will not be available until the winter of 2005.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department has not established an efficiency measure for this program.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: No data are available for comparable programs.

Evidence:  

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: Program evaluation data will be available in 2005.

Evidence:  

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09/06/2008 (2004SPR)