ExpectMore.gov


Detailed Information on the
Manufacturing Extension Partnership Assessment

Program Code 10000040
Program Title Manufacturing Extension Partnership
Department Name Department of Commerce
Agency/Bureau Name National Institute of Standards and Technology
Program Type(s) Competitive Grant Program
Assessment Year 2002
Assessment Rating Moderately Effective
Assessment Section Scores
Section Score
Program Purpose & Design 40%
Strategic Planning 86%
Program Management 91%
Program Results/Accountability 80%
Program Funding Level
(in millions)
FY2007 $105
FY2008 $90
FY2009 $4

Ongoing Program Improvement Plans

Year Began: 2008
Improvement Plan: Supporting discontinuation of Federal funding and transition of MEP centers to self-supported entities.
Status: Action taken, but not completed

Year Began: 2006
Improvement Plan: Moving ahead with plans to implement Next Generation MEP, with a greater emphasis on technology-based innovation services.
Status: Action taken, but not completed
Comments: In Q2 2008, MEP met with state partners from California, Pennsylvania, Colorado, Maryland, Alabama, and Tennessee to discuss opportunities for greater collaboration and the implementation of MEP's innovation services in support of Next Generation MEP. In addition, MEP is meeting with the Southern Growth Policies Board (an organization representing 13 southern states) to present its Next Generation plans and explore opportunities for collaboration.

Completed Program Improvement Plans

Year Began: 2003
Improvement Plan: Focus funding on individual centers' performance and need.
Status: Completed
Comments: Hollings MEP has implemented a Center Performance Management process to direct funding based on performance. Each center must meet certain performance thresholds to maintain an Acceptable Performance rating; unacceptable performance results in a four-part probation period. If, after a specified time on probation, client survey results indicate that the center still does not meet the requirements, the center's funding is re-competed.

Program Performance Measures

Term: Annual   Type: Output

Measure: Increased sales attributed to MEP assistance ($ in millions)


Explanation:

Year Target Actual
2002 726 953
2003 522 1483
2004 228 1889
2005 591 2842
2006 591 3460
2007 762 available 12/2008
2008 630
2009 0
Term: Annual   Type: Output

Measure: Capital investment attributed to MEP assistance ($ in millions)


Explanation:

Year Target Actual
2002 910 940
2003 559 912
2004 285 941
2005 740 2248
2006 740 1270
2007 955 available 12/2008
2008 485
2009 0
Term: Annual   Type: Output

Measure: Cost savings attributed to MEP assistance ($ in millions)


Explanation:

Year Target Actual
2002 497 681
2003 363 686
2004 156 721
2005 405 1304
2006 405 919
2007 521 available 12/2008
2008 330
2009 0

Questions/Answers (Detailed Assessment)

Section 1 - Program Purpose & Design
Number Question Answer Score
1.1

Is the program purpose clear?

Explanation: MEP has a clear statutory mandate, clear mission statement, and an overarching programmatic focus on raising the productivity and competitiveness of small manufacturers.

Evidence: MEP's mission and purpose are clearly stated in the DOC FY 2004 budget and Annual Performance Plan and associated documents. MEP's purpose is rooted in its statutory authority; see 15 USC 278k.

YES 20%
1.2

Does the program address a specific interest, problem or need?

Explanation: MEP is intended to address the productivity gap between large and small manufacturers. The importance of small manufacturers in virtually all manufacturing supply chains and the economy as a whole makes the resolution of this problem a broad national need. However, it is not evident that there is a clear need for the federal government to fill this role--i.e., a "national need" does not necessarily require a "federal response."

Evidence: U.S. Census Bureau data show the productivity gap between small and large manufacturers widening over time. In 1997, productivity per employee at large manufacturers was 70% higher than at small manufacturers. An NRC study identifies unique needs of small manufacturers.

YES 20%
1.3

Is the program designed to have a significant impact in addressing the interest, problem or need?

Explanation: Despite its success in leveraging financial support and establishing a nationwide network, MEP only serves a small percentage of small manufacturers each year, and it is not clear that there is a significant impact on the productivity and competitiveness of small manufacturers as a whole. Through its national network the MEP program coordinates and leverages the activities of partners and contributing organizations, including state and local governments, universities, colleges, community colleges and other educational organizations, local chambers of commerce, and related organizations. Federal investment represents approximately 1/3 of the funding for each MEP Center; this investment leverages 1/3 state contribution and 1/3 fee supported activities.

Evidence: Currently, MEP serves less than 7% of the small manufacturing community each year. While MEP's performance measures and outside studies show improvements in productivity and competitiveness of client firms (see DOC annual performance plan and Jarmin study), it is difficult to isolate the impact of MEP from other factors, such as changes in the economy. A long-term study of MEP clients versus non-MEP clients is not available. Some performance gains may also be the result of displacing business from non-client firms, resulting in little or no net effect on the economy. Because firms self-select into the MEP program, it is possible that the firms could have sought assistance through other means and achieved similar results.

NO 0%
1.4

Is the program designed to make a unique contribution in addressing the interest, problem or need (i.e., not needlessly redundant of any other Federal, state, local or private efforts)?

Explanation: Through its state and local affiliates, MEP is designed to reach small manufacturing establishments that are less likely to be served by large private consulting firms. MEP leverages state and local resources to provide tailored manufacturing technical assistance to its customers. MEP is unique in that it is the only nationwide network of specialized manufacturing extension centers. However, the services provided could be obtained through other sources.

Evidence: While the big consulting firms may not provide services to small manufacturing firms, there are a number of non-federal entities across the country that are available to small firms for various consulting services. In a recent survey by the Modernization Forum, half of MEP clients surveyed said that alternative service providers were available; however, prices for those services were more than double the rates charged by MEP centers.

NO 0%
1.5

Is the program optimally designed to address the interest, problem or need?

Explanation: MEP has developed a nationwide network of centers through State-Federal partnerships, but the program's next steps are unclear. The original design of the program intended for centers to become self-sufficient, yet there are currently no plans for achieving this goal (most Centers indicate that they would not be able to continue operations if funding is cut as requested in the President's Budget). The program should focus on creating a private sector market for these services rather than continually providing federal subsidies.

Evidence: The MEP network encompasses over 3000 local partnerships and over 2000 MEP center staff (not Federal government employees). However, improvements to the design of the program should be made. The large benefits received by firms more than offset any increase in fees that would be necessary to replace federal funding.

NO 0%
Section 1 - Program Purpose & Design Score 40%
Section 2 - Strategic Planning
Number Question Answer Score
2.1

Does the program have a limited number of specific, ambitious long-term performance goals that focus on outcomes and meaningfully reflect the purpose of the program?

Explanation: MEP focuses on a single overarching goal that derives directly from the program's implementing statute: improving the productivity and competitiveness of its customer base. The outcomes that it manages to and measures directly reflect the Program's long-term goal and purpose.

Evidence: MEP's goal is "to raise the productivity and competitiveness of small manufacturers." See the DOC budget justification and annual performance plan; see also 15 USC 278k.

YES 14%
2.2

Does the program have a limited number of annual performance goals that demonstrate progress toward achieving the long-term goals?

Explanation: MEP develops annual quantitative performance targets for a suite of competitiveness indicators, including sales increases, capital investment, and cost savings attributable to MEP services; progress toward these goals is intended to represent progress toward the long-term goal of improving the productivity of its customer base. However, it is difficult to isolate the direct impact of MEP. Other factors can influence these performance measures, and, as stated previously, some performance gains by MEP clients may come at the expense of non-clients, resulting in no net impact on the small manufacturing community.

Evidence: MEP's performance evaluation system and management processes focus on programmatic performance; these methods and the outcome-oriented data they produce are presented in the DOC budget justification and annual performance plans and reports. Data also are presented in MEP documents, such as "NIST MEP Partnership: A Network for Success, A Review of Results and the Evaluation Process" (NIST/MEP, May 2002).

YES 14%
2.3

Do all partners (grantees, sub-grantees, contractors, etc.) support program planning efforts by committing to the annual and/or long-term goals of the program?

Explanation: MEP's survey-based evaluation system obtains results data from clients that, when aggregated, demonstrate system-wide progress toward the Program's goals; however, results vary widely by Center. Also, there is no evidence that any of the Centers are focused on becoming self-sufficient, as was intended in the original design of the program.

Evidence: MEP's performance evaluation system processes, measures, targets, and results are reported in the DOC budget justification and the annual performance plan and performance report. Performance data also are presented in MEP program literature (see above).

NO 0%
2.4

Does the program collaborate and coordinate effectively with related programs that share similar goals and objectives?

Explanation: Where possible, MEP leverages its program and distribution network through collaboration and coordination with other federal agencies.

Evidence: MEP's collaborative activities are described in program literature and in the Program's annual report to Congress. Collaborative efforts over time have included EPA, SBA, DoL, and USDA. Current initiatives underway include coordination with MBDA and other offices within the Department of Commerce. The MEP system itself is structured around collaborative partnerships, and includes not only state and local governments but also universities, colleges, community colleges and other educational organizations, local chambers of commerce, and related organizations that ultimately share MEP's goals and objectives.

YES 14%
2.5

Are independent and quality evaluations of sufficient scope conducted on a regular basis or as needed to fill gaps in performance information to support program improvements and evaluate effectiveness?

Explanation: In addition to third-party survey data, MEP is reviewed by an external advisory board, the MEP National Advisory Board, which meets three times a year. MEP also has contracted for formal external program evaluation studies, which have shown that MEP's client base experiences productivity growth rates that exceed those of a control group of similar manufacturing establishments.

Evidence: Reports of the MEP National Advisory Board; productivity studies conducted by R.S. Jarmin, Center for Economic Studies, Bureau of the Census, GAO studies. For an overview of external evaluation processes and related evaluation information, see "NIST MEP Partnership: A Network for Success, A Review of Results and the Evaluation Process."

YES 14%
2.6

Is the program budget aligned with the program goals in such a way that the impact of funding, policy, and legislative changes on performance is readily known?

Explanation: Out-year targets for quantitative performance measures are based in part on resource inputs; variation in input levels directly affects estimated performance.

Evidence: The DOC budget justification and annual performance plan show the impact of proposed funding and policy changes on MEP's performance measures.

YES 14%
2.7

Has the program taken meaningful steps to address its strategic planning deficiencies?

Explanation: NIST as a whole has developed a new Institute-wide long-term strategic planning process; the process includes new mechanisms for aligning Operating Unit plans with the NIST-wide plan.

Evidence: NIST's external advisory bodies routinely observe and comment on any deficiencies associated with NIST's strategic planning processes, and NIST responds to these observations. For example, the NIST Visiting Committee on Advanced Technology (VCAT) has reviewed and commented favorably on NIST-wide strategic planning efforts in recent meetings (held quarterly). NIST's new long-term strategic plan currently is in DOC review.

YES 14%
Section 2 - Strategic Planning Score 86%
Section 3 - Program Management
Number Question Answer Score
3.1

Does the agency regularly collect timely and credible performance information, including information from key program partners, and use it to manage the program and improve performance?

Explanation: MEP uses an ongoing, systematic center progress reporting and client survey system to obtain performance data from clients and compile information for quantitative performance measures. These data are used not only to report system-wide results but also for program management purposes: Data obtained through the MEP evaluation system are used to review and manage Center performance and to evaluate and adjust the national program's product and service mix. However, it is unclear to what extent data are actually being used.

Evidence: On a quarterly basis, MEP collects performance output data from centers. MEP's performance evaluation system processes, results, and targets are presented in the DOC budget justification and annual performance plans and reports, as well as in various MEP documents (see above). MEP's survey and evaluation system has been reviewed by and designed with the assistance of external evaluation experts.

YES 9%
3.2

Are Federal managers and program partners (grantees, subgrantees, contractors, etc.) held accountable for cost, schedule and performance results?

Explanation: MEP collects Center-level performance data through the Center review process and uses the results to improve Center performance. Through this process, several Centers have been restructured to meet performance criteria. In cases where performance criteria are consistently not met, Centers have been closed.

Evidence: As a result of the evaluation process, three Centers have been closed and eight Centers have been suspended for inability to meet performance criteria. Ten centers have been significantly restructured, resulting in improved performance.

YES 9%
3.3

Are all funds (Federal and partners') obligated in a timely manner and spent for the intended purpose?

Explanation: NIST as a whole manages its resources carefully; it has had no anti-deficiency violations, and the MEP program typically has a limited amount of unobligated funds at year end. NIST's strong budget and accounting systems include rigorous internal reviews and external audits to ensure that funds are expended as intended. MEP obligates funds for MEP Center renewals in a timely manner after successful review and evaluation of each Center.

Evidence: See SF-132 (apportionment schedule) and SF-133 (report on budget execution). Internal NIST processes include rigorous quarterly financial reviews. See the audited NIST Annual Financial Statements. MEP Center reviews examine funds usage; centers undergo OMB Circular A-133 audits annually; OIG audits selected centers and reviews A-133 documents.

YES 9%
3.4

Does the program have incentives and procedures (e.g., competitive sourcing/cost comparisons, IT improvements) to measure and achieve efficiencies and cost effectiveness in program execution?

Explanation: MEP does not have effective incentives or procedures in place to encourage Centers to become self-sufficient. A nationwide network has been in place since 1996, but the program indicates that federal funding is still needed because the overhead cost associated with serving small firms has not significantly declined. MEP should be able to leverage the established infrastructure to serve clients more cost-effectively over time.

Evidence: MEP has 61 MEP centers with over 400 service locations to serve a community of approximately 350,000 small and medium-sized manufacturers. Some centers still spend 40% of their funding on marketing, and centers indicate that fees would have to increase by 150% to cover costs. Centers do not have long-term plans for becoming self-sufficient, and there is no policy in place to encourage them to do so.

NO 0%
3.5

Does the agency estimate and budget for the full annual costs of operating the program (including all administrative costs and allocated overhead) so that program performance changes are identified with changes in funding levels?

Explanation: NIST's budget request and prior year budget data reflect the full annual costs of operating the national MEP Program, including direct and indirect costs. Out-year targets for quantitative performance measures are based in part on resource inputs; variation in input levels directly affects estimated performance.

Evidence: Total program costs are presented in NIST's budget justification and annual financial statements. NIST's internal accounting system reports can provide costs by object class. Overhead is applied uniformly per full cost accounting procedures that are specified in Chapter 8.07 of the NIST Administrative Manual. DOC annual performance plans show the impact of proposed funding and policy changes on MEP's performance measures.

YES 9%
3.6

Does the program use strong financial management practices?

Explanation: NIST has a long history of unqualified financial audits and, in fact, provides accounting services for several other DOC bureaus. MEP works actively with centers to develop consistent financial practices for a stable set of financially sound organizations.

Evidence: See the audited Annual Financial Statements. Evidence of MEP efforts to develop consistent and high quality financial practices at the center level include the MEP Center Audit Guidelines, the Center CFO Working Group, and training sessions held at the MEP national conference and other venues.

YES 9%
3.7

Has the program taken meaningful steps to address its management deficiencies?

Explanation: Regular program oversight is obtained through several channels: the NIST Visiting Committee on Advanced Technology; MEP's external National Advisory Board; and internal NIST program reviews.

Evidence: MEP has made numerous changes in program management in response to recommendations produced by these review mechanisms.

YES 9%
3.CO1

Are grant applications independently reviewed based on clear criteria (rather than earmarked) and are awards made based on results of the peer review process?

Explanation: Funding is provided to MEP Centers through cooperative agreements that are formed on the basis of open competitions as specified in the MEP rules. Center renewals are made only upon completion of a thorough and successful evaluation of Center operations and performance.

Evidence: For evidence of open competitions, see the program's Federal Register Notice of Competition (15 CFR 290). The openness and overall quality of the competition process have been independently confirmed; see OIG Report, "MEP Awards Process Promotes Merit-Based Decisions".

YES 9%
3.CO2

Does the grant competition encourage the participation of new/first-time grantees through a fair and open application process?

Explanation: Establishment of new MEP Centers is made on the basis of fair and open competitions that do not restrict the applicant pool in any manner.

Evidence: See sources cited in question 8 above.

YES 9%
3.CO3

Does the program have oversight practices that provide sufficient knowledge of grantee activities?

Explanation: Center activities are monitored by the MEP staff. Each center is assigned an Account Manager who spends time on a regular basis with the center understanding details of its operation and monitoring performance. Business practices and program results of client activities are a central feature of regular Center performance reviews and renewal evaluations.

Evidence: MEP's oversight practices are evident in formal and systematic internal processes that include Center Quarterly Technical Progress Reporting, Center Information Management System, Center Progress Reports, Center Operating plans, annual reviews, external panel reviews, and renewal process documentation.

YES 9%
3.CO4

Does the program collect performance data on an annual basis and make it available to the public in a transparent and meaningful manner?

Explanation: See response to question 1 above. Performance data are made available to the public through annual performance plans and reports and MEP program literature.

Evidence: See DOC budget justification and annual performance plans and reports. See also MEP Center Performance Output Reports, and Criterion 7 of Center Review Criteria.

YES 9%
Section 3 - Program Management Score 91%
Section 4 - Program Results/Accountability
Number Question Answer Score
4.1

Has the program demonstrated adequate progress in achieving its long-term outcome goal(s)?

Explanation: MEP has one overarching long-term outcome goal: to raise the productivity and competitiveness of small manufacturers. External program evaluation studies demonstrate MEP's past success in working towards its long-term outcome goal. MEP's annual performance measures represent indicators of competitiveness and demonstrate benefits to MEP firms, but it is difficult to identify the impact of MEP on the small manufacturing community as a whole.

Evidence: Long-term program success is evidenced by productivity studies conducted by R.S. Jarmin, Center for Economic Studies, Bureau of the Census. Jarmin showed that MEP client plants on average experienced 5.2% more growth in labor productivity between 1996 and 1997 than non-MEP clients. However, large manufacturers are still 70% more productive than small manufacturers. Data on indicators of competitiveness are provided below and in the DOC FY 2004 APP. These indicators provide some evidence that MEP has helped individual firms, but it is not clear that these firms actually needed assistance. A recent long-term study of MEP clients versus non-MEP clients has not been completed.

SMALL EXTENT 7%
4.2

Does the program (including program partners) achieve its annual performance goals?

Explanation: MEP's annual performance measures represent indicators of competitiveness and progress toward the program's long-term goal. Two out of three targets were met in FY 2000. FY 2001 actuals are not yet available.

Evidence: See DOC budget justification and annual performance plans and reports. Note that the extensive and systematic client survey process entails data lags; the latest full-year actual performance data are from FY 2000.

LARGE EXTENT 13%
4.3

Does the program demonstrate improved efficiencies and cost effectiveness in achieving program goals each year?

Explanation: Program results have been achieved at specified budget levels and on anticipated schedules.

Evidence: System performance results have increased over time while Federal inputs have remained relatively constant, indicating increases in efficiency and effectiveness over time. For program results, see DOC budget justification and annual performance plans and results.

YES 20%
4.4

Does the performance of this program compare favorably to other programs with similar purpose and goals?

Explanation: Other Federal programs that focus on small businesses, such as Small Business Development Centers (SBDCs), Trade Adjustment Assistance Centers, the Defense Adjustment Program, and EPA's Small Business Assistance Program, do not reach the same small manufacturing customers, do not deliver manufacturing technical assistance, do not leverage state and local resources, and do not produce measurable improvements in competitiveness such as those demonstrated by MEP.

Evidence: External analyses of alternative assistance programs include: Urban Institute, "Effective Aid to Trade-Impacted Manufacturers," 1998; Rutgers University et al., "Defense Adjustment Program: Performance Evaluation," 1997. As noted above, these programs do not leverage state and local resources and do not produce measurable improvements in competitiveness such as those demonstrated by MEP.

YES 20%
4.5

Do independent and quality evaluations of this program indicate that the program is effective and achieving results?

Explanation: Controlled-comparison productivity studies demonstrate program effectiveness; external reviews confirm MEP's performance results and programmatic effectiveness.

Evidence: Jarmin studies provide controlled comparison program evaluation data. For external advisory panel findings, see the MEP National Advisory Panel reports and annual NIST VCAT reports. Other external studies of MEP's programmatic effectiveness include three GAO studies--1991, 1995, and 1996. Fewer independent studies have been conducted in recent years.

YES 20%
Section 4 - Program Results/Accountability Score 80%


Last updated: 09062008.2002SPR