ExpectMore.gov


Detailed Information on the
Parental Information and Resource Centers Assessment

Program Code 10002112
Program Title Parental Information and Resource Centers
Department Name Department of Education
Agency/Bureau Name Department of Education
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 40%
Strategic Planning 38%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $40
FY2008 $39
FY2009 $0

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2006

Measure how the Centers increase parents' understanding of their State accountability systems and options for supplemental services and choice under No Child Left Behind.

Action taken, but not completed. The Department asks applicants to address these areas through a competitive priority as part of the grant award process. Funded projects must also report annually on the numbers of parents who have received information about State accountability systems and options for supplemental services and choice under NCLB. The Department tracks that information as a GPRA performance measure. New project directors also received training in these areas at the national Title I conference in January 2007.
2006

Implement the efficiency measure and continue to work to establish and implement at least one additional efficiency measure.

Action taken, but not completed. The efficiency measure is the percentage of grant funds carried over in each year of the project. The Department collected data for this measure in October 2007 for grantees funded in September 2006. In FY 2006, a total of $37,323,873 was awarded to new PIRC grantees. Of that amount, $13,002,766.18, or 34.84%, was carried over. The Department is continuing to work to establish and implement at least one additional efficiency measure.
2006

Work with Congress to terminate funding for the program.

Action taken, but not completed. The Administration is not requesting funding for this program in fiscal year 2009.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments

Program Performance Measures

Term Type  
Annual Output

Measure: The number of parents who are participating in PIRC activities designed to provide them with the information necessary to understand their State accountability systems and the rights and opportunities for supplemental services and public school choice afforded to their children under section 1116 of the Elementary and Secondary Education Act.


Explanation: New language for the performance measures was published in the PIRC program closing date notice in the Federal Register on March 27, 2006.

Year Target Actual
2007 Baseline
2008 Baseline + 5%
2009 Baseline + 10%
Annual Outcome

Measure: The percentage of customers (parents, educators in State and local educational agencies, and other audiences) reporting that PIRC services are of high quality.


Explanation: Measure of the quality of recipient services. Data for the measure will be collected through annual performance reports and a customer satisfaction survey to be administered for the first time in 2007.

Year Target Actual
2007
Annual Outcome

Measure: The percentage of customers (parents, educators in State and local educational agencies, and other audiences) reporting that PIRC services are highly useful to them.


Explanation: Measure of the usefulness of recipient services. Data for the measure will be collected through annual performance reports and a customer satisfaction survey to be administered for the first time in 2007.

Year Target Actual
2007
Annual Efficiency

Measure: The percentage of grant funds carried over in each year of the project.


Explanation:

Year Target Actual
2007
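
For reference, the following is a minimal worked sketch (in Python) of how this carryover efficiency measure can be computed, using the FY 2006 figures reported in the improvement plan above ($37,323,873 awarded to new PIRC grantees, of which $13,002,766.18 was carried over). The function name and the rounding to two decimal places are illustrative assumptions, not the Department's prescribed methodology.

def carryover_percentage(awarded, carried_over):
    # Carried-over funds as a percentage of the total amount awarded.
    return round(carried_over / awarded * 100, 2)

fy2006_awarded = 37_323_873.00       # total awarded to new PIRC grantees in FY 2006
fy2006_carried_over = 13_002_766.18  # amount carried over, per the October 2007 data collection

print(carryover_percentage(fy2006_awarded, fy2006_carried_over))  # prints 34.84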

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of this program is to provide training, information, and support for parent education and family involvement programs. In supporting these programs, however, the statute is so expansive that the program purpose is somewhat unclear.

Evidence: Section 5561 of NCLB enumerates six different purposes addressing a range of areas that include: helping school districts implement parent involvement policies and activities that are designed to lead to school improvement; parents' partnerships with teachers, administrators, and schools; furthering the developmental progress of students; coordinating parent involvement activities; and providing a comprehensive approach to student learning through the coordination of Federal, State, and local activities.

NO 0%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: The Department of Education maintains that parental involvement is important to student academic success.

Evidence: A research synthesis of 51 studies, A New Wave of Evidence: The Impact of School, Family and Community Connections on Student Achievement (2002), found that there is consistent evidence that many forms of family and community involvement influence student achievement at all ages.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The program is unique in that it focuses technical assistance on the needs of parents.

Evidence:  

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The program's effectiveness could be enhanced by a clearer focus on what the program aims to accomplish.

Evidence: One example of a design flaw that limits the program's effectiveness is Section 5563(b)(5), which requires PIRCs to serve both urban and rural areas. This requirement sometimes forces grantees to provide assistance in regions where they lack the expertise to meet local needs.

NO 0%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: As with the program purpose, statutory requirements complicate the distribution of funding.

Evidence:  

NO 0%
Section 1 - Program Purpose & Design Score 40%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The Department is working on developing long-term performance measures across a number of technical assistance programs.

Evidence:  

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The Department is working on developing targets and timeframes for long-term measures across a number of technical assistance programs.

Evidence:  

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: Program staff recently participated in Department-wide meetings to develop common measures for assessing the performance of ED technical assistance programs. The program has adopted three annual measures (common to all Education TA programs) for 2006 to measure the quality, relevance, and utility of program products and services. These measures will be implemented in 2005. Implementation includes developing a methodology for convening panels of scientists and practitioners to review products and project designs, and developing an instrument for obtaining data from target audiences on the usefulness of ED TA products and services.

Evidence:  

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The Department has established new common measures for technical assistance programs, but has yet to set baselines and targets for these measures.

Evidence:  

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The Department worked with grantees to adopt objectives and activities specifically related to the pre-existing performance measure, and will work with grantees on the use of the new common technical assistance measures.

Evidence:  

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: There have not been any independent evaluations of this program.

Evidence:  

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: ED has not satisfied the first part of the question because program performance changes are not identified with changes in funding levels. The program does not have sufficiently valid and reliable performance information to help assess the impact of the Federal investment. However, ED has satisfied the second part of this question in that ED's budget submissions show the full cost of the program (including S&E).

Evidence:  

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The Department has developed common performance measures across technical assistance programs that will be applied to the new Comprehensive Centers program.

Evidence:  

YES 12%
Section 2 - Strategic Planning Score 38%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: The Department does collect annual performance reports to oversee grantee performance. However, new performance measures are being developed.

Evidence:  

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence: The President's Management Agenda scorecard (Human Capital and Budget & Performance Integration initiatives) notes ED's efforts to improve accountability. The Department's Discretionary Grants Improvement Team (DiGIT) recommendations indicate that ED is reviewing its grant policies and regulations.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Funds are obligated within the timeframes set out by Department schedules and used for the purposes intended.

Evidence: Program staff monitor to make sure that grantees are drawing down funds at an acceptable rate.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: To date, the Department has not established procedures for this program to measure and achieve efficiencies in program operations. However, ED is in the process of developing its competitive sourcing Green Plan, and is working to improve the efficiency of its grantmaking activities. The Department has also established a strengthened Investment Review Board to review and approve information technology purchases agency-wide.

Evidence: Department Investment Review Board materials. ED's Discretionary Grants Improvement Team (DiGIT) recommendations.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The program fosters collaboration with Title I programs at the Federal and local levels.

Evidence:  

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Recent agency-wide audits have not identified deficiencies in the financial management of this program.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Major internal management deficiencies have not been identified for this program.

Evidence:  

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: Independent peer review panels are used to score and rank all applications.

Evidence:  

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: The Department maintains information on grantee activities through annual performance reports, telephone contact, and selected site visits.

Evidence: The program's monitoring efforts include the review of annual performance reports.

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: While the program collects grantee performance data on an annual basis, the information has not been made available to the public. Education is developing a department-wide approach to improve the way programs provide performance information to the public. In 2004, Education will conduct pilots with selected programs to assess effective and efficient strategies to share meaningful and transparent information.

Evidence:  

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The Department has not yet established long-term performance goals for this program.

Evidence:  

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: Annual performance goals are currently being developed.

Evidence:  

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department has not yet developed appropriate efficiency measures for this program.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: No data are available for comparable programs.

Evidence:  

NA 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: There have not been any independent evaluations of this program.

Evidence:  

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2004SPR