ExpectMore.gov


Detailed Information on the
IDEA Special Education - Technical Assistance and Dissemination Assessment

Program Code 10002100
Program Title IDEA Special Education - Technical Assistance and Dissemination
Department Name Department of Education
Agency/Bureau Name Office of Special Education and Rehabilitative Services
Program Type(s) Competitive Grant Program
Assessment Year 2004
Assessment Rating Results Not Demonstrated
Assessment Section Scores
Section Score
Program Purpose & Design 100%
Strategic Planning 25%
Program Management 60%
Program Results/Accountability 0%
Program Funding Level
(in millions)
FY2007 $49
FY2008 $48
FY2009 $48

Ongoing Program Improvement Plans

Year Began: 2005
Improvement Plan: Develop baselines and targets for the program's 2 long-term measures.
Status: Action taken, but not completed
Comments: The long-term measures focus on 6 targeted areas: assessment, literacy, behavior, instructional strategies, early intervention, and inclusive practices. Data for establishing baselines and targets for a measure dealing with the implementation of practices and a measure dealing with model projects will be available in October 2008.

Year Began: 2005
Improvement Plan: Use performance and other program information to actively manage the overall TA&D program portfolio by adjusting issue coverage and reallocating resources when needs and priorities shift.
Status: Action taken, but not completed
Comments: OSEP is working to collect performance information that may be used to manage the program. In particular, long-term performance measures have been developed that focus on 6 target areas (assessment, literacy, behavior, instructional strategies, early intervention, and inclusive practices) and that will be monitored and changed as needs change.

Year Began: 2006
Improvement Plan: Develop a baseline and targets for the program's efficiency measure.
Status: Action taken, but not completed
Comments: The program efficiency measure is: "Cost per output, defined as cost per unit of technical assistance, by category, weighted by the expert panel quality rating." The Department is working to determine what units of technical assistance and categories are appropriate for the Technical Assistance and Dissemination program, and how these factors should be weighted. Data for establishing a 2006 baseline and targets will be available in October 2008.
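Read literally, the definition above is a quality-weighted unit cost. A minimal illustrative sketch, assuming "weighted by the expert panel quality rating" means dividing each category's cost by its quality-adjusted output (the symbols are hypothetical, since the Department had not yet settled on units, categories, or weights):

    E_c = \frac{C_c}{q_c \, U_c}

where C_c is the program cost attributed to technical assistance category c, U_c is the number of units of technical assistance delivered in that category, and q_c is the expert panel's quality rating for those units (e.g., scaled 0 to 1). Under this reading, a declining E_c from year to year would indicate more quality-weighted output per dollar.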
Year Began: 2007
Improvement Plan: Develop a strategy for evaluating the impact and effectiveness of program activities.
Status: Action taken, but not completed
Comments: The Office of Special Education Programs has begun work on a plan for evaluating programs under Part D of the Individuals with Disabilities Education Act.

Completed Program Improvement Plans

Year Began: 2005
Improvement Plan: Develop baselines and targets for three new measures that have been adopted for the Technical Assistance and Dissemination program.
Status: Completed
Comments: Three new performance measures have been developed. Baseline data for these measures became available in November 2006; however, that baseline data is of very low quality. Targets starting with 2007 have been established but are expected to be revised.

Program Performance Measures

Term: Annual, Type: Output

Measure: The percentage of products and services deemed to be of high quality by an independent review panel of qualified experts or individuals with appropriate expertise to review the substantive content of the products and services.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2005 Baseline 56%
2006 NA 74%
2007 61% Expected Oct. 2008
2008 76%
2009 77%
2010 78%
Term: Annual, Type: Output

Measure: The percentage of products and services deemed to be of high relevance to educational and early intervention policy or practice by an independent review panel of qualified members of the target audiences for the technical assistance and dissemination.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2005 Baseline 63%
2006 NA 94%
2007 68% Expected Oct. 2008
2008 94%
2009 94%
2010 94%
Term: Annual, Type: Outcome

Measure: The percentage of all products and services deemed to be useful by target audiences to improve educational or early intervention policy or practice.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2005 Baseline 43%
2006 NA 46%
2007 48% Expected Oct. 2008
2008 50%
2009 52%
2010 54%
Term: Annual, Type: Efficiency

Measure: Cost per output, defined as cost per unit of technical assistance, by category, weighted by the expert panel quality rating.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2006 Baseline Expected Oct. 2008
2007 Expected Oct. 2008
2008
2009
Term: Long-term, Type: Outcome

Measure: The percentage of school districts and service agencies receiving IDEA Technical Assistance and Dissemination services regarding scientifically- or evidence-based practices for infants, toddlers, children and youth with disabilities that implement those practices.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2006 Baseline Expected Oct. 2008
2008
2010
2012
2014
Term: Long-term, Type: Output

Measure: Of the IDEA Technical Assistance and Dissemination projects responsible for developing models, the percentage of projects that identify, implement, and evaluate effective models.


Explanation: Data for the measure will be gathered through an expert panel review of products and services provided by projects.

Year Target Actual
2006 Baseline Expected Oct. 2008
2008
2010
2012
2014

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The purpose of the Technical Assistance and Dissemination (TA&D) program is to provide coordinated and accessible assistance and information on early intervention and education issues (special education, regular education, related services, transition, etc.) to support parents, teachers, school/State administrators, and other personnel working with children with disabilities so that they can help improve services and results through systemic-change activities and other efforts. For example, the National Center on Monitoring and Evidence-Based Decision Making is assisting State and local education agencies and the Department in implementing a system to develop data, monitor performance based on that data, and use data to adjust State and local educational programs. www.monitoringcenter.lsuhsc.edu/

Evidence: IDEA Part D section 681(b)(2). The program addresses a wide range of problems, many of which are chronic.

YES 20%
1.2

Does the program address a specific and existing problem, interest or need?

Explanation: Special education and early intervention services are complex and cut across a wide range of issues dealing with diverse types and severities of disabilities, services, and age ranges. Parents, teachers, early intervention service providers, and other personnel who support children receiving IDEA services have an ongoing need for high-quality technical assistance and information to address these complicated issues.

Evidence: IDEA Part D section 681(b)(2). A listing of funded projects is available at www.cec.sped.org/osep/database/ (search on 84-326*).

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: OSEP's TA&D activities are specialized and do not overlap with other Federal activities. Many of the program's activities are structured to support rather than duplicate State TA&D services. Many products and services are geared toward States so that they can, in turn, provide services to their local educational agencies and early intervention service providers. However, there is some concern at the project level that some TA&D activities overlap with each other. OSEP is addressing this issue by emphasizing greater coordination among grantees and by its new policy requiring projects to obtain approval from OSEP's Dissemination Center prior to development of new materials.

Evidence: See listing of funded projects at www.cec.sped.org/osep/database/ (search on 84-326*).

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: We do not have evidence that another approach, mechanism, or infrastructure would be more efficient or effective in achieving the program's purposes.

Evidence:  

YES 20%
1.5

Is the program effectively targeted, so that resources will reach intended beneficiaries and/or otherwise address the program's purpose directly?

Explanation: The program funds a variety of technical assistance and dissemination projects focusing on a wide range of special education and early intervention issues, targeted to help States, school administrators, parents, teachers, and other support personnel provide high-quality services to children with disabilities.

Evidence: Application notices. Required meetings with project officers. Most awards are made through cooperative agreements for periods of 5 years.

YES 20%
Section 1 - Program Purpose & Design Score 100%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The program does not have meaningful long-term measures. The Department is also working with OMB to develop an appropriate efficiency measure.

Evidence: Lack of long-term measures.

NO 0%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program does not have meaningful long-term measures or ambitious targets.

Evidence: Lack of long-term goals.

NO 0%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: Program staff recently participated in Department-wide meetings to develop common measures for assessing the performance of ED technical assistance programs. Data for the measures generated through these meetings will be collected in 2006. Implementation includes developing a methodology for convening panels of scientists and practitioners to review products and project designs, and developing an instrument for obtaining data from target audiences on the usefulness of ED TA products and services.

Evidence: Draft Common Measures for Education Technical Assistance Programs.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: OSEP still needs to develop baselines and targets for these annual measures.

Evidence: Draft Common Measures for Education Technical Assistance Programs.

NO 0%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: Most TA&D projects are funded through cooperative agreements under which awardees and OSEP staff work together to define and achieve project goals. Annual program goals (e.g., the use of high-quality materials) are embedded in the project priorities. The program has also adopted a clearance process for the development of new materials, which should lead to improved quality. Program partners are likely to be committed to the new annual goals.

Evidence: Announcements for fiscal year 2004 Technical Assistance and Dissemination competitions can be found at www.ed.gov/legislation/FedRegister/announcements/2004-1/031004c.pdf (Regional Resource Centers) www.ed.gov/legislation/FedRegister/announcements/2004-1/031504i.pdf (Projects for Children and Young Adults Who are Deaf-Blind). www.ed.gov/legislation/FedRegister/announcements/2004-2/042104h.pdf (National Clearinghouse on Deaf Blindness).

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: There has not been an independent evaluation of the entire TA&D program in recent years, but one is planned to start in 2004 or 2005. However, the program assesses the activities of its project grantees. For example, many projects (those with the most significant Federal funding) are required to have an independent evaluation in their second year of operation to help determine whether funding should be continued in future years.

Evidence: "The OSEP State Technical Assistance intiative New York State Pilot" external evaluation by The Study Group Inc. (May 31, 2003); "EMSTAC Effectiveness Data" (Elementry and Middle Schools Technical Assistance Center) self evaluation by American Institutes for Research (March 2002).

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: The Budget request is not tied to either annual or long-term goals.

Evidence: Department of Education Fiscal Year 2005 Budget.

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: Although OSEP has been working to address its strategic planning deficiencies, meaningful actions to eliminate such deficiencies have not yet been implemented. As OSEP works to address planning deficiencies, it is placing particular emphasis on "adopting a limited number of specific, ambitious long-term performance goals and a limited number of annual performance goals."

Evidence: The program is participating in Education's Technical Assistance common measures group but more work still needs to be done to correct strategic planning deficiencies.

NO 0%
Section 2 - Strategic Planning Score 25%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: A team of reviewers assesses the performance of large cooperative agreements in their second year. These evaluations are used to determine whether continuation funding is appropriate for the final project years. OSEP staff also work closely with awardees to implement their projects and review their continuation and final reports. However, OSEP has not adequately used performance and other information to actively manage the overall TA&D program portfolio, adjust priorities, or allocate resources.

Evidence: OSEP's "3+2" evaluation and annual performance reports.

NO 0%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Currently, ED cannot demonstrate how federal managers and program partners are held accountable for program goals. However, the Department has initiated several efforts to improve accountability in its programs. First, ED is in the process of ensuring that EDPAS plans -- which link employee performance to relevant Strategic Plan goals and action steps -- hold Department employees accountable for specific actions tied to improving program performance. ED is also revising performance agreements for its SES staff to link performance appraisals to specific actions tied to program performance. Finally, ED is reviewing its grant policies and regulations to see how grantees can be held more accountable for program results.

Evidence: The President's Management Agenda scorecard (Human Capital and Budget & Performance Integration initiatives) notes ED's efforts to improve accountability. The Department's Discretionary Grants Improvement Team (DiGIT) recommendations indicate that ED is reviewing its grant policies and regulations.

NO 0%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: OSEP successfully obligates funds by the end of each fiscal year (but mostly late in the year). ED is instituting changes through the Discretionary Grants Re-Engineering process to ensure that grant competitions are announced on a regular schedule and provide sufficient time to review applications. Funds are spent for intended purposes (as assessed through grant and contract monitoring, and grant reviews for major grant programs). We have not identified improper uses of funds.

Evidence: Finance reports, notices of competitions, lists of funded applications.

YES 10%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: To date, the Department has not established procedures for this program to measure and achieve efficiencies in program operations. However, ED is in the process of developing its competitive sourcing Green Plan, and is working to improve the efficiency of its grantmaking activities. The Department has also established a strengthened Investment Review Board to review and approve information technology purchases agency-wide.

Evidence: Department Investment Review Board materials. ED's Discretionary Grants Improvement Team (DiGIT) recommendations.

NO 0%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: Education has convened a technical assistance working group to better coordinate similar Department TA&D programs in OSEP, IES, the What Works Clearinghouse, and elsewhere. Starting in 2006, all programs will collect data on common annual performance measures of program quality, relevance, and utility. Also, OSEP is working to ensure that its various TA&D project grantees are collaborating with each other on program activities and strategies in order to reduce duplication. OSEP should also better coordinate with RSA on issues, such as school transition, that are of interest to both agencies.

Evidence:  

YES 10%
3.6

Does the program use strong financial management practices?

Explanation: Auditors have not reported internal control weaknesses. The Department has a system for identifying excessive drawdowns and can place individual grantees on probation, under which drawdowns must be approved.

Evidence:  

YES 10%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The program has taken steps to address some of its management deficiencies. For example, the President's Commission on Special Education identified the "peer review" process as an area of weakness in current program management practices. In response, OSEP has provided internet training on the peer review process. Also, OSEP has adopted a clearance process for the development of new materials, which should lead to improved quality and reduce duplication of effort. However, OSEP's inability to produce a Comprehensive Plan, as required by the IDEA Amendments of 1997 for this and other Part D National Activities programs, remains a problem.

Evidence: Lack of OSEP planning document.

YES 10%
3.CO1

Are grants awarded based on a clear competitive process that includes a qualified assessment of merit?

Explanation: New awards are made through a clear competitive process, but OSEP has recently expanded its practice of awarding noncompetitive supplements to existing recipients. In addition, there is often limited competition for some awards.

Evidence: OSEP application notices.

YES 10%
3.CO2

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: OSEP reviews awardee performance through annual performance reports and final reports, and holds annual meetings with project officers in Washington. When necessary, OSEP staff also conduct site visits to review grantee activities.

Evidence:  

YES 10%
3.CO3

Does the program collect grantee performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: Performance data is collected annually from awardees. However, this data is not readily available to the public in a transparent and meaningful manner. Education is developing a department-wide approach to improve the way programs provide performance information to the public. In 2004, Education will conduct pilots with selected programs to assess effective and efficient strategies to share meaningful and transparent information.

Evidence: Lack of transparent data for the public.

NO 0%
Section 3 - Program Management Score 60%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The program does not have meaningful long-term measures.

Evidence: Lack of long-term measures.

NO 0%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The program still needs to develop annual performance goals.

Evidence: Lack of annual performance goals.

NO 0%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Department is working with OMB on developing an appropriate efficiency measure for this program and other similar TA&D programs.

Evidence:  

NO 0%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: We have no systematic evidence comparing OSEP's TA&D program with other TA&D programs. However, the Department is currently working with OMB to develop a limited number of cross-cutting performance indicators that may allow for such comparisons in the future.

Evidence:  

NO 0%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: There has not been an independent evaluation of this program, but OSEP is planning an evaluation of all of its Part D National Activities in 2004 or 2005.

Evidence: Lack of an independent program evaluation.

NO 0%
Section 4 - Program Results/Accountability Score 0%


Last updated: 09062008.2004SPR