October 26 -- The Chief Evaluation Office of the U.S. Department of Labor (DOL) commissioned the high-priority Apprenticeship Evidence-Building Portfolio evaluation contract with the Urban Institute (with Mathematica and Capital Research Corporation) to build the evidence on apprenticeship, including apprenticeship models, practices, and partnership strategies in high-growth occupations and industries. DOL's initiatives to expand access to apprenticeship opportunities support the Presidential Executive Order “Expanding Apprenticeships in America.” The portfolio of initiatives includes the Scaling Apprenticeship Through Sector-Based Strategies grants, Closing the Skills Gap grants, the Veterans Employment and Training Service (VETS) Apprenticeship pilot, and other DOL investments. The project end date is September 2024.
DOL has asked OMB to approve its proposal to conduct the Apprenticeship Evidence-Building Portfolio Evaluation and invites comments to OMB by November 25, 2020.
CEO Apprenticeship Evidence-Building Portfolio: https://www.dol.gov/agencies/oasp/evaluation/currentstudies/Apprenticeship-Evidence-Building-Portfolio
Proposed evaluation: https://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=202008-1290-001
Click on "View Supporting Statement" for a narrative on plans, methods, and uses, and on "IC List" for guides to surveys and interviews.
FR notice inviting comments: https://www.federalregister.gov/documents/2020/10/26/2020-23624/agency-information-collection-activities-submission-for-omb-review-comment-request-apprenticeship
Point of contact: Christina Yancey, DOL Chief Evaluation Officer ChiefEvaluationOffice@dol.gov
DOL and industry have invested billions of dollars over the past decade to encourage, develop and expand industry-driven apprenticeship training nationwide. Much of the federal investment is through program grants and technical assistance. The breadth of apprenticeship investments has resulted in a diverse sectoral, geographic, and institutional mix of apprenticeship programs and projects. This project will build the evidence base on apprenticeship in three ways: careful review of existing evidence and information; rigorous implementation study to specify apprenticeship typologies and models to include a range of work-based training; and development of rigorous impact evaluation design options to analyze impacts of various models and strategies.
The Scaling Apprenticeship Through Sector-Based Strategies grants ($183.8 million) and the Closing the Skills Gap grants ($100 million) are the two largest recent federal apprenticeship investments and a primary focus of the proposed project. The Scaling Apprenticeship grant awards, announced in June 2019, focus on accelerating expansion of apprenticeships to more sectors with high demand for skilled workers, namely occupations and industries applying for H-1B worker visas. Closing the Skills Gap awards, announced in fall of 2019, are intended to promote apprenticeship as a method for closing the gap between employer skill demands and the skills of the workforce. The source of funding for both grant programs is fee revenue from Section 414(c) of the American Competitiveness and Workforce Improvement Act of 1998, and a substantial portion of grant funds are required to be spent on training activities. In addition, starting in early 2020, the Transitioning Service Member Apprenticeship Demonstration will be rolled out to eight military installations.
Although the evidence base on apprenticeship in the U.S. is growing, there are still several key knowledge gaps that are ripe for rigorous evaluations and evidence-building. Policymakers, researchers, evaluators, and practitioners are generally persuaded that apprenticeship has positive net benefits, but more evidence is needed on which models work in specific occupational contexts and for particular subgroups of apprentices. Impact analysis is needed to better understand what apprenticeship models and components are most effective for apprentices in various industries and occupations.
The Apprenticeship Evidence-Building Portfolio includes two implementation studies: an implementation study of the Scaling Apprenticeship, Closing the Skills Gap, and other similar DOL initiatives to develop typologies of apprenticeship models and practices, identify promising strategies across the portfolio, and better understand the implementation of models to help interpret impact evaluation findings (labeled below as the Program Implementation Study); and an implementation study of the VETS Apprenticeship pilot to understand service delivery design and implementation, challenges, and promising practices. The evaluation will address the following research questions:
1. What are promising strategies for enhancing existing apprenticeship models or building new models to better serve, recruit, and retain individuals typically underrepresented in apprenticeship, such as those with disabilities, women, people of color, ex-offenders, and veterans and transitioning service members? (implementation and impact evaluation)
2. Which industry sectors, occupations, and types of companies appeared to be the most promising for expanding apprenticeships, and why? Were these registered or unregistered apprenticeship programs? (implementation evaluation)
3. What types of program components, or combinations of components, were designed and implemented in the apprenticeship programs? What challenges did programs face in implementation, and how were those challenges overcome? What implementation practices appear promising for replication? What types of strategies and approaches were implemented or taken to scale, and what policy changes were developed and implemented that led to systems change? (implementation evaluation)
4. What stakeholders were involved in the design or implementation of the apprenticeship program? What roles do sponsors and third parties play in engaging employers and apprentices? How were partnerships built and maintained? What factors influenced the development and maintenance of the partnerships? Did partnerships change or evolve over time, and if so, how and why? (implementation evaluation)
5. What type of assistance was provided to increase employer engagement? How did implementation vary by employer characteristics, such as industry, type, and size? What were the reasons employers chose to either invest in a new apprenticeship program or expand their existing apprenticeship program? What types of outreach were used to engage employers, and did outreach differ by industry? (implementation evaluation)
6. What are the characteristics of program infrastructure, quality assurance, data management, and technical assistance? What metrics and data are used by different stakeholders to define and measure success of the apprenticeship program? (implementation evaluation)
7. What is the role of apprenticeship placement counselors in assisting transitioning service members to learn about, search for, secure, and complete apprenticeships? How and to what extent do placement counselors conduct outreach to employers, group sponsors, local workforce boards, and other local stakeholders to identify apprenticeship opportunities? How do placement counselors assess, match, and place transitioning service members into apprenticeships? (implementation evaluation)
8. What are the impacts of apprenticeship models, components, and strategies on apprentices’ employment, earnings, and career advancement? (implementation and impact evaluation)
9. What are the proximate impacts of intervening strategies that may be related to employment outcomes? (implementation and impact evaluation)
10. What are promising strategies for improving individuals’ recruitment, retention, and completion of pre-apprenticeship and apprenticeship programs? Do they differ for underrepresented populations? (implementation and impact evaluation)
11. What are promising strategies for improving individuals’ employment outcomes? Do they differ for underrepresented populations? (implementation and impact evaluation)
Data collection will end in September 2023. After data collection, data will be presented in summary formats, tables, charts, and graphs to illustrate the results. Interim briefs will be submitted in 2021. A final report will be submitted in 2024.
For AEA members wishing to provide comments, "A Primer on How to Respond to Calls for Comment on Federal Data Collections" is available at https://www.aeaweb.org/content/file?id=5806