ExpectMore.gov


Detailed Information on the Comprehensive Regional Assistance Centers Assessment

Program Code 10002086
Program Title Comprehensive Regional Assistance Centers
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 80%
Strategic Planning 25%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $56
FY2008 $57
FY2009 $57

Ongoing Program Improvement Plans

Year Began: 2005
Improvement Plan: Establish long-term performance goals, targets, and timeframes for the Comprehensive Centers program, based on the results from the national evaluation.
Status: No action taken
Comments: In September 2006, ED awarded a contract for the National Evaluation of the Comprehensive Centers. Baseline data should become available by July 31, 2008. At that time, baseline data and targets will be set.

Year Began: 2007
Improvement Plan: Implement the new efficiency measure and collect baseline data.
Status: Action taken, but not completed

Year Began: 2007
Improvement Plan: Use the data on the quality, relevance, and usefulness of the technical assistance provided by the Centers from the national evaluation to plan for technical assistance and program management.
Status: No action taken
Comments: Data will be available by July 31, 2008.

Completed Program Improvement Plans

Year Began: 2005
Improvement Plan: Embed the new measures in the application notice for the new Comprehensive Centers program. Also, embed the appraisal of how applicants address the measures into the peer review of applications.
Status: Completed
Comments: The notice inviting applications for new awards, published June 3, 2005, listed the measures and established selection criteria for peer review that included an assessment of each applicant's ability to provide reliable data for the common measures.

Year Began: 2005
Improvement Plan: Implement the new efficiency measure and continue work to establish and implement at least one additional efficiency measure.
Status: Completed
Comments: Carryover funds from 2006 were high, largely because the first year of the grants lasted only nine months and work was delayed until cooperative agreements were negotiated with ED and work plans were negotiated with States. The Department plans to monitor the Centers closely in order to meet the targets for upcoming years.

Year Began: 2005
Improvement Plan: Establish long-term performance goals, targets, and time frames for the new Comprehensive Centers.
Status: Completed
Comments: The Department is planning an evaluation that will, among other things, include the formation, by the end of FY 2006, of panels of peer reviewers to rate each Center's products and services against the common measures. Long-term targets and time frames will be established after baseline data are obtained for the measures, during the Centers' second year of operation (2007).

Year Began: 2007
Improvement Plan: Develop a plan for monitoring the performance of the Comprehensive Centers and adjust the plan annually, based on findings from monitoring visits.
Status: Completed
Comments: The program has developed a plan, including a schedule, for conducting five on-site monitoring visits this summer and fall.

Program Performance Measures

Term: Annual  Type: Outcome
Measure: The percentage of all technical assistance products and services that are deemed to be of high quality by an independent review panel of qualified experts or individuals with appropriate expertise to review the substantive content of the products and services.
Explanation: Measure of quality of recipient services and products.
Year  Target    Actual
2007  Baseline  July 2008

Term: Annual  Type: Outcome
Measure: The percentage of all technical assistance products and services that are deemed to be of high relevance to educational policy or practice by target audiences.
Explanation: Measure of relevance of recipient products and services.
Year  Target    Actual
2007  Baseline  July 2008

Term: Annual  Type: Outcome
Measure: The percentage of all technical assistance products and services that are deemed to be of high usefulness to educational policy or practice by target audiences.
Explanation: Measure of usefulness of recipient products and services.
Year  Target    Actual
2007  Baseline  July 2008

Term: Annual  Type: Efficiency
Measure: The percentage of grant funds carried over in each year of the project.
Year  Target    Actual
2006  Baseline  40
2007  30        15
2008  20        July 2008
2009  10        July 2009
2010  10        July 2010

Term: Long-term/Annual  Type: Efficiency
Measure: The number of working days it takes the Department to send a monitoring report to grantees after monitoring visits (both virtual and on-site).
Explanation: The program office intends to conduct a series of five pilot site visits in 2008 and will be ready to implement a full monitoring program in 2009.
Year  Target    Actual
2008  Baseline

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the program is to establish technical assistance providers to help States, local educational agencies, schools, tribes, and other agencies to administer and implement programs authorized under the Elementary and Secondary Education Act of 1965.

Evidence: Section 13002 of the ESEA.

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: With States, local educational agencies, and schools indicating the need for assistance in implementing the requirements of the ESEA, the program addresses a relevant problem.

Evidence:  

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: Other technical assistance providers help the same entities in areas such as mathematics and science education and educational technology.

Evidence: Title XIII of the ESEA.

NO 0%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The program has established a network of technical assistance providers that provides services to the entities identified in the 1994 statute.

Evidence: Program evaluation found that the Centers had succeeded in establishing a customer base at the school, district, and State levels. The new program requires recipients to align the content of technical assistance with NCLB requirements and with regional needs. The new Centers replace the current Comprehensive Regional Assistance Centers, Eisenhower Regional Mathematics and Science Consortia, the Regional Technology in Education Consortia, and the Eisenhower National Clearinghouse for Mathematics and Science Education.

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: P.L. 103-382 required that, in providing services, the Centers give priority to schools implementing schoolwide programs (schools with a poverty level of at least 40 percent) under Title I of the ESEA, and to local educational agencies and BIA schools that serve concentrations of poor students.

Evidence: A 2002 evaluation found that school districts with high rates of poverty and districts with significant enrollments of LEP, American Indian, and migrant students were more likely to receive services from the Centers than other districts.

YES 20%
Section 1 - Program Purpose & Design Score 80%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The current Centers will be replaced after FY 2004 with new Centers authorized under the Education Sciences Reform Act of 2002. The Department has not developed long-term measures for the current program, but will develop measures for the new Comprehensive Centers. For FY 2005, the Department replaced a process-focused measure with an interim outcome measure covering the funding period before the new Comprehensive Centers program is implemented.

Evidence: Education Sciences Reform Act of 2002. Section 205 of the Act authorizes the Department to continue funding the current Centers until the new Comprehensive Centers are established. The Department will compete the new Centers early in FY 2005.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: There are no long-term measures for the current program, which will be replaced with new Comprehensive Centers in FY 2005 (see 2.1). The Department will develop long-term measures with targets and timeframes for the new Comprehensive Centers program.

Evidence: Education Sciences Reform Act of 2002. Section 205 of the Act authorizes the Department to continue funding the current Centers until the new Comprehensive Centers are established. The Department will compete the new Centers early in FY 2005.

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The current Comprehensive Centers will be terminated with the establishment of new Centers in FY 2005. However, several proposed performance measures for the program hold promise for measuring progress. Program staff recently participated in Department-wide meetings to develop common measures for assessing the performance of ED technical assistance programs. The program has adopted three annual measures (common to all ED technical assistance programs) for 2006 to measure the quality, relevance, and utility of program products and services. These measures will be implemented in 2005. Implementation includes developing a methodology for convening panels of scientists and practitioners to review products and project designs, and developing an instrument for obtaining data from target audiences on the usefulness of ED technical assistance products and services.

Evidence:  

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: Because the current program will be terminated with the establishment of new Centers, the Department has no plans to establish baseline or targets for the current interim measure (see 2.1). The Department will establish baselines and targets for the measures developed for the new Comprehensive Centers program.

Evidence: Education Sciences Reform Act of 2002. Section 205 of the Act authorizes the Department to continue funding the current Centers until the new Comprehensive Centers are established. The Department will compete the new Centers early in FY 2005.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The Department has developed common performance measures for multiple technical assistance programs. Those measures will apply to the new Comprehensive Centers program.

Evidence:  

NO 0%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The most recent program evaluation conducted by the Department was published in 2000. The Department also conducts a statutorily required biennial customer service survey.

Evidence: Evaluation and customer service survey reports.

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: ED has not satisfied the first part of the question because program performance changes are not identified with changes in funding levels. The program, at this time, does not have sufficiently valid and reliable performance information to assess (whether directly or indirectly) the impact of the Federal investment. However, ED has satisfied the second part of the question in that ED's budget submissions show the full cost of the program (including S&E).

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department has developed common performance measures across technical assistance programs that will be applied to the new Comprehensive Centers program.

Evidence:  

YES 12%
Section 2 - Strategic Planning Score 25%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department collects annual performance data from each Comprehensive Center. However, no baseline data are available, and the measures will be reconsidered in implementing the new Centers required under P.L. 107-279.

Evidence: Annual performance reports submitted by grantees.

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how Federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence: The President's Management Agenda scorecard (Human Capital and Budget & Performance Integration initiatives) notes ED's efforts to improve accountability. The Department's Discretionary Grants Improvement Team (DiGIT) recommendations indicate that ED is reviewing its grant policies and regulations.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended.

Evidence: Funds are obligated on a quarterly basis.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: To date, the Department has not established procedures for this program to measure and achieve efficiencies in program operations. However, ED is in the process of developing its competitive sourcing Green Plan, and is working to improve the efficiency of its grantmaking activities. The Department has also established a strengthened Investment Review Board to review and approve information technology purchases agency-wide.

Evidence: Department Investment Review Board materials. ED's Discretionary Grants Improvement Team (DiGIT) recommendations.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program office works frequently with other program offices to coordinate technical assistance efforts to States and districts.

Evidence: Manual for Comprehensive Needs Assessments for Migrant Students; agenda for OELA/CC conference calls.

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of the program.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: While material internal management deficiencies have not been identified for this program, the program has put in place a system to identify potential problems.

Evidence: Program staff monitor grantee drawdowns to avoid potential problems.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: The awards were made on a competitive basis and judged on their relative merits. However, funds for the Centers have been authorized for the past three fiscal years through appropriations language that extended the Centers beyond their original five-year grant award.

Evidence: Applications were peer-reviewed and the highest scoring were selected for awards.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department collects annual performance data from each Comprehensive Center and maintains information on grantee activities through annual performance reports, telephone contact, and selected site visits.

Evidence: Annual performance reports submitted by grantees, updated project plans, phone logs.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: While the program collects grantee performance data on an annual basis, the information has not been made available to the public. The Department is developing a department-wide approach to improve the way programs provide performance information to the public. In 2004, the Department will conduct pilots with selected programs to assess effective and efficient strategies for sharing meaningful and transparent information.

Evidence:  

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Department has not established long-term performance goals for this program (see 2.1).

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The Department has developed common performance measures for many of its technical assistance programs.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department is working on an efficiency measure for the program.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The Department is working on annual and long-term performance measures, as well as a program evaluation plan; thus, cross-program comparisons are not feasible at this time.

Evidence:  

NO 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: An evaluation of the program was completed in 2000; the evaluation measured customer satisfaction with Center services. No evaluations using rigorous methodologies have been conducted.

Evidence: Evaluation report. Program evaluation found that the Centers had succeeded in establishing a customer base at the school, district, and State levels.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2004SPR