ExpectMore.gov


Detailed Information on the
Federal Emergency Management Agency: Grants and Training Office Training Program Assessment

Program Code 10003607
Program Title Federal Emergency Management Agency: Grants and Training Office Training Program
Department Name Dept of Homeland Security
Agency/Bureau Name Federal Emergency Management Agency
Program Type(s) Direct Federal Program
Assessment Year 2005
Assessment Rating Adequate
Assessment Section Scores
Section Score
Program Purpose & Design 60%
Strategic Planning 75%
Program Management 100%
Program Results/Accountability 47%
Program Funding Level
(in millions)
FY2007 $195
FY2008 $220
FY2009 $79

Ongoing Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

In coordination with DHS components and Federal agency partners, complete development of the National Training Program required under HSPD-8.

Action taken, but not completed Training Operations is working with the Integrated Management Systems Branch (IMSB), which has the lead in coordinating the development of the National Training Program charter. The charter is in draft and under review. Once the charter is approved, a strategy and implementation plan will be developed. Training Operations will continue to support IMSB in the development and implementation of the charter, strategy, and implementation plan of the National Training Program.
2005

As DHS manages several major training programs aimed at Federal, state, and local personnel, it should pursue cross-cutting, comparative evaluations of their strengths and weaknesses. Consolidation of several training activities under the new "Preparedness Directorate" is a first step in this effort.

Action taken, but not completed Under the new FEMA organization, Training Operations is working with other NPD training organizations and programs such as EMI, CDP and IMSB to develop comparative analysis measures. Training Operations continues to participate in broader coordination efforts at the senior level on how to improve and integrate training activities under the "New FEMA".
2007

Develop an automated and integrated Level 1 and Level 2 evaluation system to accurately and effectively assess the gain in knowledge, skills, and abilities of Training Operations course participants.

Action taken, but not completed Currently a prototype of the new evaluation system is being developed for approval. Development of the system will begin once the prototype is approved by Training Operations. A pilot of the system is estimated to begin July 1, 2008. Full implementation and rollout of the system to Training Operations' training partners is estimated to begin January 1, 2009.

Completed Program Improvement Plans

Year Began Improvement Plan Status Comments
2005

Seek language encouraging greater flexibility and/or competition among training partners.

Completed The Training Division funds the Competitive Training Grants Program (CTGP). In '07, 189 applications were submitted in response to a solicitation for programs to address issues relating to the 8 national priorities in the National Goal. 59 applicants were invited to submit full proposals. Through the CTGP initiative's expansion of our training partners, the Training Division promotes greater flexibility and competition to meet the training needs of first responders.
2005

Incorporate state-based risk methodology into allocation of training slots among SLGCP training partners.

Completed States and territories receiving Homeland Security grant funds are required to develop state strategies, which identify their funding priorities. Also, grant recipients have been required to develop multi-year exercise plans. In order to incorporate state-based risk methodology, the Training and Exercise Division began requiring that states and territories begin development of multi-year training and exercise plans.
2005

Develop standardized assessments of homeland security knowledge, skills and abilities that can be used to more systematically compare the impact of training, both among trainees and training providers.

Completed In addition to requiring all training partners to administer standard Level 1 and 2 evaluations to ensure high quality training, the Training Division implemented more quality assurance measures for instructors. The Training Division's Instructor Quality Assurance Program (IQAP) sets standards for recruitment and retention of quality instructors. The Instructor Audit Program (IAP) conducts random audits to measure instructional effectiveness of trainers and ensure quality course instruction.
2005

Develop an FY06 spending plan and FY07 Budget request that more closely link resource allocation to the program's long-term goals.

Completed In 2006 the Training and Exercise Divisions were merged. At that time, a Business Office was created. One of the major responsibilities of the Training and Exercise Division Business Office was to link resource allocations more closely to program goals. All requests for 2006 funds were accompanied by a justification explaining how the funded project is tied to the accomplishment of goals from the Division's strategic plan.

Program Performance Measures

Term Type  
Long-term Outcome

Measure: Percent increase in knowledge, skills, and abilities (KSAs) of State and local homeland security preparedness professionals receiving training, as measured by pre and post assessments.


Explanation: Percentage improvement in knowledge, skills, and abilities (KSAs) of state and local homeland security professionals after the completion of training, which demonstrates strengthened first responder preparedness and mitigation with respect to acts of terrorism, natural disasters, and other emergencies. Measuring these improvements indicates the impact of training services on the nation's preparedness level. This measure evaluates the gain in knowledge, skills, and abilities of students through pre- and post-course assessments.
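The measure described above is simple arithmetic over pre- and post-test scores. As an illustrative sketch only (the function name and the cohort scores below are invented, not actual program data or the program's scoring method), the computation might look like:

```python
def percent_ksa_gain(pre_scores, post_scores):
    """Average percent increase from pre-course to post-course scores."""
    gains = [(post - pre) / pre * 100.0
             for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical cohort of three students (scores out of 100);
# these numbers are illustrative only, not program data.
pre = [60.0, 70.0, 50.0]
post = [75.0, 84.0, 65.0]
print(f"Average KSA gain: {percent_ksa_gain(pre, post):.1f}%")
```

Averaging per-student percentage gains (rather than dividing aggregate totals) weights each trainee equally regardless of starting score.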

Year Target Actual
2007 27% 25%
2008 28%
2009 28%
2010 28%
2011 28%
2012 28%
2013 28%
Annual Output

Measure: Number of people provided awareness level training through direct, remote, online, and satellite broadcast methods


Explanation: This measure determines the level of participation in training courses. The G&T training providers maintain student rosters for each delivery of all courses, which can be divided into four performance levels: awareness, performance-defensive, performance-offensive, and planning/management. This data also includes viewer estimates from our satellite and web broadcasts, which provide awareness-level training content in a news broadcast format.

Year Target Actual
2007 550,000 393,373
2008 555,000
2009 560,000
2010 565,000
2011 570,000
2012 575,000
2013 580,000
Annual Efficiency

Measure: Average cost of training delivered by the National Domestic Preparedness Consortium


Explanation: The average cost of training is represented by the total budget for NDPC divided by the number of responders trained. This yields the average cost per student for training.
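The efficiency measure is a straightforward division. A minimal sketch, with hypothetical figures (the budget and enrollment below are invented for illustration, not NDPC data):

```python
def average_cost_per_student(total_budget_dollars, responders_trained):
    """Efficiency measure: total training budget divided by number trained."""
    return total_budget_dollars / responders_trained

# Hypothetical year: a $98M budget spread over 100,000 responders trained.
cost = average_cost_per_student(98_000_000, 100_000)
print(f"${cost:,.2f} per student")  # → $980.00 per student
```

Note the measure is sensitive to both inputs: the jump in actuals from $786 (2005) to $1,228 (2006) in the table below could reflect either budget growth or fewer responders trained.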

Year Target Actual
2003 Baseline $980.01
2004 $875 $821.91
2005 $800 $786
2006 $775 $1,228
2007 $600 $1,200
2008 $1,175
2009 $1,150
2010 $1,125
2011 $1,100
2012 $1,075
2013 $1,050
Annual Outcome

Measure: Percent of training recommendations identified from After Action Reports that were acted upon.


Explanation: This measure reflects the efforts of the training program to address gaps identified through exercises and real-world events, and pairs with the other outcome measure to indicate that training interventions lead to improved operational performance. The training program determines which recommendations from After Action Reports are feasible, given its course offerings, and then calculates how many of those recommendations are acted upon by the relevant jurisdictions. Based on updates to the AAR/IP tracking process, data for this measure will be available beginning in FY08.
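The measure is a ratio over the feasible subset of recommendations. A hedged sketch with invented data (the function and the five sample recommendations are hypothetical, not the program's tracking system):

```python
def percent_acted_upon(recommendations):
    """recommendations: list of (is_feasible, was_acted_upon) pairs.

    Returns the percent of feasible recommendations acted upon;
    infeasible recommendations are excluded from the denominator.
    """
    feasible = [acted for is_feasible, acted in recommendations if is_feasible]
    if not feasible:
        return 0.0
    return 100.0 * sum(feasible) / len(feasible)

# Five hypothetical AAR recommendations: (feasible?, acted upon?)
recs = [(True, True), (True, False), (False, False), (True, True), (True, False)]
print(percent_acted_upon(recs))  # → 50.0
```

Excluding infeasible recommendations from the denominator matches the explanation above: only recommendations the program deems feasible count against the target.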

Year Target Actual
2007 Baseline Baseline
2008 40%
2009 50%
2010 60%
2011 70%
2012 75%
2013 80%
Annual Output

Measure: Number of action steps taken per student following delivery of a course


Explanation: Course participants list action steps they plan to take as a result of a course; ODP follows up three months later to determine how many action steps were taken.

Year Target Actual
2004 Baseline 5.5
2005 6 5.8
2006 6.5 4.37
2007 6.5 Closed
Annual Output

Measure: Number of preparedness professionals that are trained to train first responders.


Explanation: The Centralized Scheduling Information Desk (CSID) maintains a database on numbers of responders trained as trainers, which more efficiently disseminates critical knowledge, skills, and abilities to the public safety community.

Year Target Actual
2003 Baseline 7,018
2004 7,500 7,799
2005 8,500 14,399
2006 10,315 9,200
2007 10,315 8,929
2008 9,250
2009 9,500
2010 9,750
2011 10,000
2012 10,250
2013 10,500

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: The program's purpose is to provide a comprehensive WMD training program for first responders that improves their capacity to prevent, protect against, respond to, and recover from acts of terrorism. In administering this program, Training Division (TD) staff oversee, coordinate, and strive to institutionalize the development and delivery of comprehensive training that targets members of the responder community, providing WMD knowledge to enhance their skills and abilities. Ensuring the quality and consistency of training of first responders is a key element in achieving ODP's mission to enhance domestic preparedness.

Evidence: Section 430 of the Homeland Security Act of 2002: "ODP shall have the primary responsibility within the executive branch of Government for the preparedness of the United States for acts of terrorism." The FY 2006 Congressional Justification for SLGCP's budget states "SLGCP is the principal component of the Department of Homeland Security responsible for preparing state and local governments for acts of terrorism." The National Strategy for Homeland Security, released in July 2002, states that "The growing threat of terrorist attacks on American soil is placing great strain on our Nation's system for training its emergency response personnel."

YES 20%
1.2

Does the program address a specific and existing problem, interest, or need?

Explanation: There is a specific and existing threat of terrorism, as made apparent by the terrorist attacks of September 11, 2001. To respond to this threat, ODP's Training program is the primary entity tasked to train the Nation's first responders to be able to prevent, protect against, respond to, and recover from terrorist attacks. The State Homeland Security Strategies completed each year by the 56 states and U.S. Territories have consistently identified a need for first responder training as a requirement to achieve domestic preparedness. State Strategies also yield information on how many responders from each discipline need training at the four training levels. In addition, exercise after action reports, improvement plans, and lessons learned provide information for course content that ensures that material taught to responders remains relevant to current threat conditions. In response to these needs, the TD develops, oversees, and coordinates training targeted to ten key responder disciplines and the private sector.

Evidence: Data from the FY 2003 State Strategies estimated total preparedness course needs as: Awareness Training: 4,333,068; Performance-Defensive Training: 2,196,959; Performance-Offensive Training: 819,611; Planning/Management Training: 496,152. However, these numbers do not equate to individual responders, as individuals who function in multiple disciplines require several courses. Also, state and local estimates may overlap.

YES 20%
1.3

Is the program designed so that it is not redundant or duplicative of any other Federal, state, local or private effort?

Explanation: The State and Local Training Program offers a wide range of courses and includes unique training assets. However, training on terrorism preparedness and weapons of mass destruction (WMD) response is also provided by the Department of Defense, Department of Energy, Department of Health and Human Services, Department of Transportation, and, within DHS, the Federal Law Enforcement Training Center and Federal Emergency Management Agency. State and local responder training centers also cover many types of incident response. This multiplicity of counterterrorism and WMD training programs has been the subject of congressional hearings and independent reports. Homeland Security Presidential Directive Eight directed DHS to establish a "national training program" for first responders, in full coordination with other agencies. While DHS has not submitted a coordinated proposal for a National Training System, the State and Local Training Program has taken steps to improve coordination and avoid duplication through the interagency TRADE group and the DHS Training Leaders Council, and at the operational level through the Interagency Board. The approved Federal course list is a result of this coordination.

Evidence: • ODP Course Catalog • Homeland Security Presidential Directive Eight • National Homeland Security Strategy • FY 2004 and 2005 Competitive Training Grants Program Solicitation • Compendium of Federal Terrorism Training for State and Local Audiences (FEMA: EMI) • Federal Law Enforcement Training Center course catalog • ODP Training Division: Course Review Policy and Procedure • GAO report: "Combating Terrorism: Need to Eliminate Duplicate Weapons of Mass Destruction Training," March 2000, GAO/NSIAD-00-64

YES 20%
1.4

Is the program design free of major flaws that would limit the program's effectiveness or efficiency?

Explanation: The design of the State and Local Training Program is generally supportive of effective and efficient training, but still has systematic flaws. While the Program has developed and disseminated specialized curriculum on WMD awareness, the lack of standardized grades, ratings, or certifications hampers measurement of long-term performance. The Program is flawed by a lack of administrative flexibility in selecting training partners, as well as limited outreach to state and local training programs. The five training centers in the Domestic Preparedness Consortium were selected at the direction of Congress in 1999; four of the five are administered by non-DHS entities and have not been subjected to a competitive, merit-based selection process. Independent studies have questioned whether this focus has limited the Program's outreach to other state and local training programs. Congress has provided SLGCP with additional funds for Competitive Training Grants, though these funds have not been requested.

Evidence: • Training Strategy: Chapter 1: Planning and Implementing a Curriculum in a Specialized Discipline • ODP: Approach to Blended Learning (p. 9) • www.ojp.usdoj.gov/odp/training_bl.htm • ODP Training Division Policy: Course Reviews • FY 2004 and 2005 Competitive Training Grants Program Solicitation • H.R. 4567, 108th Congress, 2nd Session, Title III, pages 22-24 • FY 2003 State needs assessments, requirements in the State Homeland Security Strategies • ODP Training Division web site: www.ojp.usdoj.gov/odp/training.htm • Report to the House Committee on Appropriations on DHS Office of State and Local Government Coordination and Preparedness, April 2005

NO 0%
1.5

Is the program design effectively targeted so that resources will address the program's purpose directly and will reach intended beneficiaries?

Explanation: The ODP Training Division uses several tools to allocate training resources to the nation's first responders, but these mechanisms could be more effective in ensuring that budget resources are targeted to the greatest training needs. ODP's Centralized Scheduling Information Desk (CSID) links responders with training courses offered by the Program's training partners. Allocation of course slots to each state is guided by an Allocation Matrix based primarily on population, not relative risk or unmet training needs. There is not a consistent means of ensuring that training is targeted to the communities or responders with the greatest risks or knowledge gaps. The Congressional requirement to deliver most training through the National Domestic Preparedness Training Consortium also limits the program's ability to target resources effectively. The TD is changing its approach to comply with the requirements of HSPD-8 and the National Preparedness Goal to base its course delivery increasingly on risk. As HSPD-8 is implemented and the formula for tiering is completed, the Training Division will adopt the risk-based formula for determining its allocation of training courses.

Evidence: • Executive Summary: ODP Training Strategy: Who Should Be Trained? (p. 9) • ODP Information Bulletin #130: WMD Standardized Awareness Authorized Trainer (SAAT) Program • ODP Information Bulletin #138: Updated Guidance for the WMD Standardized Awareness Authorized Trainer Program • ODP Training Division web site: www.ojp.usdoj.gov/odp/training.htm

NO 0%
Section 1 - Program Purpose & Design Score 60%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific long-term performance measures that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: The long-term performance goal of the training program is to improve the ability of first responders to prevent, protect against, respond to, and recover from terrorist incidents by providing WMD training tailored to a full range of responder disciplines. The TD has two outcome measures that meaningfully reflect the program's purpose: 1) Percent of jurisdictions demonstrating acceptable performance on applicable critical tasks in exercises using SLGCP/ODP-approved scenarios; 2) Average percentage increase in WMD and other knowledge, skills, and abilities of state and local homeland security preparedness professionals receiving training, from pre and post assessments. However, neither of these measures directly captures the program's impact on the "training shortfalls" cited in State Homeland Security Strategies. The lack of standardized grades, ratings, or certifications for the topics covered by the State and Local Training Program hampers measurement of long-term performance.

Evidence: • SLGCP Goal Statement • Interim National Response Plan • Interim National Preparedness Goal • DHS Fiscal Year Homeland Security Plan

YES 12%
2.2

Does the program have ambitious targets and timeframes for its long-term measures?

Explanation: The program has set baseline data and specific, quantifiable targets for its long-term measures. The TD has set more ambitious targets for training outcomes - specifically an increase of nearly 40 percent between 2005 and 2006 - for jurisdictions performing acceptably on critical tasks in exercises. The FY '05 training budget totals $194 million; therefore, a proposed target in which nearly 80 percent of jurisdictions perform acceptably by FY 2010 is ambitious. This target will require that responders or jurisdictions being exercised receive adequate, well-designed training. It is less clear whether the target for improved responder knowledge, skills, and abilities is ambitious. Only slight improvements in pre vs. post assessments were made between FY03 and FY05, and the FY05 performance level is defined as success. The DHS Training Division believes sustaining a 38% gain is ambitious, noting that post-test scores are in the ninetieth percentile. However, because this measure focuses on course-level impacts and on relative improvement among those being trained, it reflects the lack of standardized skill assessments across the various disciplines.

Evidence: • Exercise data • NDPC data on exams • Self-reported gains in KSAs

YES 12%
2.3

Does the program have a limited number of specific annual performance measures that can demonstrate progress toward achieving the program's long-term goals?

Explanation: The Training Division has established annual performance goals and measures that directly support its long-term goals. The annual performance goal is to deliver training to 400,000 students annually. (These numbers do not necessarily represent individuals. A responder may require training in multiple disciplines and therefore may be counted more than once.) The annual measures are the numbers of responders trained: 1) each year; 2) by level; 3) by delivery method; and 4) as trainers. A fifth measure examines action steps taken as a result of training. The Training Division has been tracking data for these measures since 1999. Data collected from this effort demonstrates progress in achieving annual and long-term goals.

Evidence: The Training Division's annual performance measures that demonstrate progress in achieving the annual, and ultimately the long-term, performance goal are: • the number of state and local homeland security preparedness professionals trained each year • the number of responder professionals trained at each level • the number of people trained by delivery method • the number of action steps taken per student following delivery of a course • the number of preparedness professionals trained to train first responders. The data on ODP training courses administered to State and local constituents is generated by ODP's Centralized Scheduling and Information Desk (CSID) in ad-hoc reports.

YES 12%
2.4

Does the program have baselines and ambitious targets for its annual measures?

Explanation: The training program has five specific annual performance measures with baselines and ambitious targets. The baseline for the number trained shows increases each year since 1997. This growth is expected to stabilize at the goal of 400,000 annually in FY '06. These are ambitious targets given the need to continuously identify eligible personnel in states and territories and select appropriate courses, while maintaining adequate quality control. The baseline for responder professionals trained at each level is based on FY '03 with 80,943 in awareness, 42,198 in performance, and 14,253 in planning/management for a total of 148,819 trained. In FY '05 the program has an ambitious target of training at least 5,000 more responders in awareness and performance and 1,500 in planning/management (fewer responders require training at this level).

Evidence: • Centralized Scheduling Information Desk reports entitled: 1) Total Trained by Fiscal Year, 2) Number Trained by ODP Training Level, 3) Number Trained by Delivery Method, and 4) Number Trained to Train • Number of Action Steps: ODP Follow-Up Surveys (10% receive follow-up surveys). Action steps are those activities identified by students as a result of their training that they will take back to their agencies to implement. • Training Division Strategic Plan

YES 12%
2.5

Do all partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) commit to and work toward the annual and/or long-term goals of the program?

Explanation: The TD enters into interagency and cooperative agreements with its partners in which each commits to work toward the annual and long-term goals of the TD. Federal managers oversee training partner efforts and ensure that they implement systems to measure and report on their performance. Some reporting mechanisms include: Semi-annual progress reports required of all training partners that detail progress toward achieving the TD's performance goals, monthly data showing numbers trained submitted to CSID which directly address the annual measures, onsite and desk monitoring activities that are designed to ensure that partners deliver the training to which they have committed, and semi-annual Symposiums that provide direction to ensure that training partners achieve established TD goals.

Evidence: Other evidence includes: ?? Samples of Progress Reports ?? Interagency and Cooperative Agreements with Training Partners ?? CSID tracking information

YES 12%
2.6

Are independent evaluations of sufficient scope and quality conducted on a regular basis or as needed to support program improvements and evaluate effectiveness and relevance to the problem, interest, or need?

Explanation: The State and Local Training Program has been the subject of relatively few independent evaluations. Those directly commissioned by the Program - such as a 2002 study by Virginia Commonwealth University - have been limited in scope to broad strategy issues. SLGCP-funded courses are subject to intensive quality review, but there have been few independent evaluations of their impact or outcomes. Similarly, there have been no independent evaluations of the program's training partners.

Evidence: • ODP Training Strategy Executive Summary, Virginia Commonwealth University, Pelfrey and Kelley, 2002 • Thoughtlink Report: Review of Models, Simulations, & Games for Domestic Preparedness, Vol. III, pp. 41, 49, 64, & 97 (www.ojp.gov/odp/docs/Review_of_MSG_Vol3.pdf) • Training Division: Course Review Policy and Procedure

NO 0%
2.7

Are Budget requests explicitly tied to accomplishment of the annual and long-term performance goals, and are the resource needs presented in a complete and transparent manner in the program's budget?

Explanation: As specific targets and goals for the State and Local Training Program were established only recently, current budgets have not tied funding levels to achieving specific goals. Previous budgets have not clearly linked existing or anticipated resources to the number of personnel trained or to improvements in training quality. However, as SLGCP possesses the information to draw such linkages, future budgets are likely to be better linked to performance.

Evidence: • Congressional Justification for FY06

NO 0%
2.8

Has the program taken meaningful steps to correct its strategic planning deficiencies?

Explanation: The TD has taken meaningful steps to correct deficiencies through biannual strategic plan workshops that are designed to identify problems and develop action steps to correct them. The biannual meetings address how to achieve long-term and annual goals; the first was held in October 2004. Some of the areas identified for improvement were: 1) enhancing the "Training Doctrine" for managing the development, implementation, and oversight of training activities; 2) developing procedures for working more closely with the exercise division to ensure that exercises are linked to training; 3) improving and standardizing tests that measure participant gains in KSAs and the reporting of results; and 4) mapping training courses to the TCL. The TD will be directly impacted by the requirements of HSPD-8 and will amend its strategic plan as a result. This, along with efforts to integrate budget and performance, will strengthen the linkage between funding and results.

Evidence: • Training Division: Workshop After Action Report (pp. 3, 6, 7)

YES 12%
Section 2 - Strategic Planning Score 75%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: ODP's Training Division regularly collects performance data from key program partners relating to its program goals. Baseline performance data is collected to set meaningful, ambitious performance targets, with sources including pre- and post-course assessment information, semi-annual progress reports, CSID (numbers trained), site visits, State Strategies, and course reviews. Data is used to adjust program priorities, allocate resources, and take appropriate management actions to improve program performance in these ways: pre- and post-assessment data is analyzed to identify and assess the effectiveness of training and adjust courses as needed; State Strategies are reviewed to identify gaps in training and develop priorities for new training, and the Competitive Training Grants Program is used to solicit training partners that can address the priorities; semi-annual reports highlight progress toward goals and identify programs that are not achieving them in a timely manner; and course reviews are conducted every 3 years to ensure that course material remains relevant.

Evidence: • ODP Training Division Policy: Course Reviews • Reports from CSID database • Pre- and post-test scores • Semi-annual progress reports from key program partners • Site visit reports • State Strategies

YES 14%
3.2

Are Federal managers and program partners (including grantees, sub-grantees, contractors, cost-sharing partners, and other government partners) held accountable for cost, schedule and performance results?

Explanation: Several checks and balances hold federal managers and program partners accountable to the TD's annual and long-term goals. Partner funding agreements provide detailed task/timelines that spell out the schedule for development and delivery of training. Training partners enter into agreements which establish specific performance standards. ODP imposes special conditions in keeping with program and agency guidance, and managers are responsible for ensuring compliance. The TD assigns federal managers to oversee training partner grants. Each is responsible for achieving key program results and is under a Performance Management Plan with clearly defined performance standards for achieving program goals. Managers are given responsibility and are accountable for the development and implementation of training programs in their area, for costs, for meeting the schedule for development and delivery of courses, and for performance results. If program managers identify issues of non-compliance with the terms of the agreements, every effort is made to develop a corrective action plan to bring the training partner into compliance; however, program managers may request that grant funds be frozen until compliance is achieved. The last measure a program manager may exercise for serious issues of non-compliance is to initiate deobligation of funds.

Evidence: • Semi-annual progress reports • Quarterly financial reports • Monitoring Policy and Procedures • OJP Financial Guide

YES 14%
3.3

Are funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: Grant funds are awarded to training partners in a timely manner. Training Division funds are tied to the Federal fiscal year and are awarded within the fiscal year for which they were appropriated. The OJP Comptroller's Office (OC) is still responsible for processing and awarding funds. OC sets a schedule for the preparation and submission of award documents by the Training Division. The Training Division has never missed the deadline for the submission of awards to OJP. Only a limited amount of unobligated funds remains at the end of the fiscal year because the OJP Office of the Comptroller does not make supplemental awards when unobligated balances are high. Program managers must resolve the issue of unobligated balances with the projects they manage before the Comptroller's Office will make an award. All grantees receiving $500,000 or more are subject to OMB's single audit requirement (A-133). OC is the entity charged with resolving single audit findings; it can freeze funds or hold awards until audit findings are resolved.

Evidence: • Semi-annual progress reports • Quarterly financial reports • Monitoring Policy and Procedures • OJP Financial Guide • OMB Circular A-133

YES 14%
3.4

Does the program have procedures (e.g. competitive sourcing/cost comparisons, IT improvements, appropriate incentives) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: The TD has procedures in place to measure and track efficiencies, such as the average cost of training per student and the per-unit cost of training. The TD demonstrated improved efficiency by decreasing the average cost of training delivered by the NDPC from $980.81 in FY 2003 to $821.91 in FY 2004. (Formula: total NDPC budget, less M&A, for the fiscal year, divided by the number trained in that fiscal year.) The TD achieves efficiencies and cost effectiveness by supporting program improvements and vehicles for training delivery that multiply outputs (e.g., train-the-trainer, web-based, and satellite delivery of courses). An increased use of train-the-trainer (TtT) courses is a service multiplier, increasing the return on the investment of federal dollars. In FY '05, ODP instituted the WMD standardized awareness authorized TtT program to provide standardized awareness training to 10 disciplines and the private sector by training teams of trainers at the state and urban area level who can then institutionalize this training in their respective organizations.

Evidence: • CSID database on numbers trained as trainers and by delivery method • NDPC budgets

YES 14%
3.5

Does the program collaborate and coordinate effectively with related programs?

Explanation: The TD collaborates and coordinates effectively with federal agencies, specifically on the review of terrorism preparedness courses and curricula. Some examples of collaboration leading to meaningful actions are: 1) ODP co-sponsored a national training and exercise conference in conjunction with FEMA, EMI, and the CDC in May 2005, which was attended by state and local training and exercise personnel; 2) The TD worked closely with FEMA to develop program guidelines related to USAR courses; 3) Key FEMA training personnel are full participants in the IAB training subgroup, providing input for FY '06 training grant guidance currently in development; 4) The TD coordinates with DoD primarily through the TSWG Training Technology subgroup, which sets priorities for training development and reviews and funds proposals accordingly; and 5) The TRADE group establishes and maintains processes to ensure the consistency and quality of federally-sponsored preparedness training. However, some state and local training providers believe the Program has not collaborated sufficiently with their efforts to improve and expand terrorism incident training.

Evidence: • Vigilance, a training video produced jointly by ODP, FBI, and LSU • FLETC Interagency Agreement • DHS Training Leaders Council Charter • TRADE Fact Sheet • Cooperative Agreement with Nevada Test Site • Interagency Board for Equipment Standardization and Interoperability, www.iab.gov • HSPD-8 Training, Exercises, and Lessons Learned Integrated Concept Team Program Implementation Plan and Requirements document

YES 14%
3.6

Does the program use strong financial management practices?

Explanation: Through the TD's strong financial management practices, its programs are free of internal control weaknesses. The TD requires its program managers (PMs) to conduct financial monitoring through desk audits and onsite monitoring visits in conjunction with the OJP Office of the Comptroller (OC). Financial monitoring establishes that the grantee has proper protocols and oversight of its grant-related financial activities. PMs conduct reviews of expenditure records, ensuring that the grantee has established a system for approving grant expenditures and has other checks and balances in place to ensure that expenditures are allowable, are separated from non-grant funds, and are in line with the approved budget. The TD provides training at its symposiums on financial requirements. The PM also conducts routine and issue-driven site visits, as well as quarterly desk audits. All grantees receiving $500,000 or more are subject to OMB's single audit requirement (A-133). OC resolves single audit findings and can freeze funds or postpone awards until audit findings are resolved.

Evidence: • Office for Domestic Preparedness Training Division: Policy and Procedure for Monitoring Grants and Cooperative Agreements (p. 2) • OJP Financial Guide • Semiannual Progress Report format • Quarterly Financial Report, SF 269

YES 14%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: The following are examples of procedures the TD has put in place for identifying and correcting program management deficiencies. First, the TD has implemented a monitoring policy and procedure guide to provide enhanced direction for monitoring contracts, grants, and cooperative agreements. Second, a quarterly review of all projects has been instituted. Third, program managers are required to perform quarterly desk reviews of all files. Fourth, the TD has implemented a Learning Management System pilot to improve data collection related to student registration, scheduling, and completion. Fifth, an enhanced semi-annual report format has been implemented to collect additional data on program expenditures and training results. Sixth, all training partners receiving FY '05 funds are required to adhere to a special condition regarding enhanced data collection on pre- and post-course assessments.

Evidence: • Office for Domestic Preparedness Training Division: Policy and Procedure for Monitoring Grants and Cooperative Agreements (p. 1) • OJP Financial Guide • Desk Review Template • Program Review Questions • Semi-Annual Progress Report Form

YES 14%
Section 3 - Program Management Score 100%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term performance goals?

Explanation: The State and Local Training Program is making some progress on its long-term performance goals. The Program expects to meet its 2005 target of 23 percent of jurisdictions demonstrating adequate performance on exercises, but will be challenged to nearly triple this level of performance in FY 2006 in order to demonstrate that 60 percent of jurisdictions achieve acceptable performance. The First Responder Training Portal and the National Exercise Program systems, such as the HSEEP Toolkit, will increase compliance with HSEEP and training standards and will consequently improve jurisdictional performance in exercises. The Program is on track to achieve its long-term goal for participating responders to achieve an average 38 percent increase in their WMD and other knowledge, skills, and abilities. However, as most responders still have inadequate training to perform acceptably on homeland security exercises, this effectiveness measure has limited application to the training deficiencies cited in the 2003 State Homeland Security Strategies.

Evidence: [FYSP?]

SMALL EXTENT 7%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: The Training Division continues to achieve its annual performance goals and measures its effort to ensure progress toward its long-term performance goals. This effort was begun under GPRA and feeds into the President's Management Agenda. The Training Division has met and exceeded all of its FY 2001, FY 2002, FY 2003, and FY 2004 targets.

Evidence: • PART Performance Measures Table for the Training Program contains data on demonstrated progress in achieving its annual and long-term performance goals. • CSID Report: Total Trained by Fiscal Year and ODP Training Level • NTPI Training Report • CSID Report: Number of Train-the-Trainer Students by Fiscal Year

YES 20%
4.3

Does the program demonstrate improved efficiencies or cost effectiveness in achieving program goals each year?

Explanation: The Training Division demonstrated improved efficiency over FY 2003 by decreasing the average cost of training delivered by the NDPC from $980.81 to $821.91 in FY 2004. (Formula: total NDPC budget, less M&A, for the fiscal year, divided by the number trained in that fiscal year.) The Training Division strives to achieve efficiencies and cost effectiveness by establishing the following procedures for improving efficiency: 1. Implementing information technology improvements to reduce the time, effort, and cost of scheduling training class deliveries; 2. Conducting periodic program audits that review and scrutinize training partner budgets relative to course costs; 3. Ensuring training partners are performing within budget and meeting or decreasing the per-unit cost of delivering courses as specified in their budgets. The program has an efficiency measure (per-unit cost of training delivered by the NDPC) with supporting baseline and targets. However, the relative efficiency of NDPC training programs varies considerably, and the cost efficiency of some partners has decreased.

Evidence: In FY 2003 the average cost per student for training was $980.81. In FY 2004, the NDPC trained responders at an average per student cost of $821.91.
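The efficiency measure above is a straightforward calculation; the sketch below restates it, assuming the formula as described in the assessment. Only the per-student costs ($980.81 and $821.91) come from the evidence; the budget and enrollment figures in the example are hypothetical.

```python
def cost_per_student(budget: float, ma: float, trained: int) -> float:
    """Per-unit cost of NDPC training for one fiscal year:
    total NDPC budget, less M&A, divided by the number trained."""
    return (budget - ma) / trained

# Hypothetical example: a $9.5M budget with $500K M&A and 10,950 students.
example = cost_per_student(9_500_000.0, 500_000.0, 10_950)

# Year-over-year efficiency gain implied by the reported per-student costs.
fy2003_cost = 980.81
fy2004_cost = 821.91
improvement = (fy2003_cost - fy2004_cost) / fy2003_cost
print(f"FY 2004 efficiency improvement: {improvement:.1%}")  # about a 16.2% reduction
```

On these figures, the FY 2004 per-student cost represents roughly a 16 percent reduction from FY 2003.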

LARGE EXTENT 13%
4.4

Does the performance of this program compare favorably to other programs, including government, private, etc., with similar purpose and goals?

Explanation: The breadth and depth of the State and Local Training Program's support of state and local responders compares favorably to other Federal training programs. However, despite numerous hearings, reports, and studies noting the overlaps among the Federal government's terrorism and WMD-response training programs, there has been little cross-cutting analysis of which agencies or programs are relatively more effective. Also, other DHS training programs with a more specialized focus, such as the National Fire Academy and TSA Trainer Screening Program, have made greater use of standardized ratings and certifications in tracking performance.

Evidence: • FEMA Course Catalog • FEMA Strategic Plan, Goals, and Measures • FLETC Catalog • ODP Training Strategy Executive Summary, Virginia Commonwealth University, Pelfrey and Kelley, 2002 • ODP Training Division web site www.ojp.usdoj.gov/odp/training.htm • CSID reports: 1) Total Trained by FY; 2) Number Trained by ODP Training Level; 3) Number Trained by Delivery Method; 4) Number Trained to Train • ODP Course Catalog

SMALL EXTENT 7%
4.5

Do independent evaluations of sufficient scope and quality indicate that the program is effective and achieving results?

Explanation: As discussed in question 2.6, there have been no broad, independent evaluations of the State and Local Training Program's effectiveness or results. Internal strategy assessments in 2002 validated the program's strategic direction, but did not address the relative effectiveness of training partners or validate the effectiveness of specific courses.

Evidence: • ODP Training Strategy Executive Summary, Virginia Commonwealth University, Pelfrey and Kelley, 2002

NO 0%
Section 4 - Program Results/Accountability Score 47%


Last updated: 09062008.2005SPR