Feb 21 -- The National Science Foundation (NSF) invites comments to OMB by April 3, 2023, on the proposed new Convergence Accelerator Evaluation & Monitoring Plan. [Comments are due 30 days after the March 3, 2023, submission to OMB.]
The NSF Convergence Accelerator (CA) program addresses national-scale societal challenges through use-inspired convergence research. Using a convergence approach and innovation processes such as human-centered design, user discovery, team science, and the integration of multidisciplinary research, the CA program seeks to transition basic research and discovery into practice, solving high-impact societal challenges aligned with specific research themes (tracks).
NSF Convergence Accelerator tracks are chosen from the themes identified during the program's ideation process that have the potential for significant national impact. Each year the program releases a Dear Colleague Letter (DCL) and/or Request for Information (RFI) to gather ideas from the community at large and to make the research and innovation community aware of the convergent research topics selected from the program's recent ideation process for the upcoming fiscal year (FY). Promising ideas are selected by the Convergence Accelerator and explored in funded workshops, which develop each idea further by incorporating convergence research and a range of disciplines and expertise. The topics coming out of the workshops are then aligned to specific research foci called "tracks." These tracks are in turn featured in an upcoming funding opportunity requesting submissions to the NSF Convergence Accelerator Program.
For example, the DCL published by the Convergence Accelerator in FY 2021 (NSF 21-012) led to the ideation and eventual selection of the following three tracks for the FY 2022 solicitation (NSF 22-583):
Track H: Enhancing Opportunities for Persons with Disabilities
Track I: Sustainable Materials for Global Challenges
Track J: Food & Nutrition Security
The NSF Convergence Accelerator Pilot, also referred to as the 2019 cohort, featured the track topics Open Knowledge Networks (Track A), related to the Harnessing the Data Revolution (HDR) Big Idea, and AI and Future Jobs (Track B1) and National Talent Ecosystem (Track B2), related to the Future of Work at the Human-Technology Frontier (FW-HTF) Big Idea.
Here, "cohort" refers to the teams funded through the CA Program for the tracks selected in a given FY. The CA 2020 cohort focuses on the track topics Quantum Technology (Track C) and AI-Driven Innovation via Data and Model Sharing (Track D). The CA 2021 cohort focuses on the Networked Blue Economy (Track E) and Trust and Authenticity in Communications Systems (Track F). The CA 2022 cohort focuses on Securely Operating Through 5G Infrastructure (Track G).
All teams within a cohort begin in Phase 1. Each awarded team undertakes a 9-month planning effort, applying a multidisciplinary approach to develop its initial idea into a proof of concept and to identify new team members and partners. At the end of Phase 1, each team participates in a formal NSF pitch and proposal evaluation, and selected teams advance to Phase 2. In Phase 2, teams continue to apply program fundamentals to develop solution prototypes and to build a sustainability model. By the end of Phase 2, teams are expected to provide deliverables and demonstrate sustainability beyond NSF support.
To enable effective oversight of its investment and fulfill its monitoring and management responsibilities, NSF needs 1) current administrative data, 2) information on the characteristics of the teams funded by the CA Program, and 3) structured and unstructured data on the immediate, intermediate, short-, and long-term outputs and outcomes of the Program.
The information collection will enable the Evaluation and Assessment Capability (EAC) Section within NSF to garner quantitative and qualitative information that will be used to inform programmatic improvements, efficiencies, and enhanced program monitoring for the Convergence Accelerator (CA). This information collection, which entails collecting information from CA applicants and grantees through a series of surveys, interviews, and case studies, is in accordance with the Agency's commitment to improving service delivery as well as the Agency's strategic goal to “advance the capability of the Nation to meet current and future challenges.”
For this effort, five survey instruments have been developed, each of which will include closed-ended and open-ended questions to generate quantitative and qualitative data. For ease of use for the respondent pool, each survey instrument will be programmed as an interactive web survey and distributed to eligible respondents by email. The surveys, which will serve as a census of all applicable CA applicants and/or grantees, will be used to collect baseline measures at the start of the program and vital information on how grantees progress through the program. Follow-up interviews will be conducted with project team leaders, such as Principal Investigators (PIs) and Project Directors (PDs), and case studies using the project team as the unit of analysis will collect qualitatively rich discursive and observational information that cannot be captured in a web survey. Both follow-up interviews and case studies will be conducted virtually, with the possibility of in-person interviews and non-participant observation in the future.
Five surveys are being requested for clearance consideration:
-- Group Entrance Survey--designed to collect information on Team Characteristics, Partnership/Engagement, Project Progress/Outcome, and Impact Assessment.
-- Individual Entrance Survey--designed to collect information on Participant Characteristics, Team Dynamics/Engagement, Project Aspirations, and Impact Assessment.
-- Mid-year Survey--designed to get a pulse on the Participants' Program Experience/Satisfaction, Team Collaboration/Relationship, Project Outcomes, and Impact Assessment. This survey is a shortened version of the Exit Survey.
-- Exit Survey--designed and specifically tailored to collect information from the participants on Program Experience/Satisfaction, Team Collaboration/Relationship, Project Outcomes, and Impact Assessment.
-- Coaching Feedback Survey--designed to collect information from Participants on the engagement, effectiveness, and experience when working with the coaching staff who are contractors to the NSF CA Program.
The surveys are administered on the following schedule: Teams—as groups and individuals—complete the Entrance Surveys once, at the beginning of their award. All teams take the Mid-year Survey, a modified and shortened version of the Exit Survey, about five months into the Program. Teams then take the Exit Survey about four months before the end of their awards. The Coaching Feedback Survey is administered to participants at the end of every year, throughout the life of the award.
The request for this data collection stems from the following: (1) the desire to obtain the best data possible from the grantees, (2) increased transparency and accountability for the Program, and (3) the need for effective oversight of NSF's investment and fulfillment of its monitoring and management responsibilities. Because the extent and complexity of the questions in these surveys go beyond the standard NSF annual report requirement, a separate clearance would enable the Program to receive feedback from grantees that NSF would otherwise be unable to obtain.
The Evaluation & Monitoring Plan for the National Science Foundation (NSF) Convergence Accelerator (CA) Program has been created to manage and track the outputs, outcomes, impact, and experience of the NSF CA Program. The data collected are focused on program-specific qualitative and quantitative indicators. Collection of these data serves several purposes, including:
-- Identifying outputs of projects funded by the CA Program for award management
-- Monitoring and assessing the effectiveness of the new operational model (with different tracks each FY and two phases per cohort)
-- Tracking the progress of the Program expansion and growth of the innovative ecosystem
-- Providing information on research investments in terms of advancements in science, technology, and society impact
-- Responding to Congressional requests and to inquiries from the general public, NSF's external merit reviewers who serve as advisors, and NSF's Office of the Inspector General
-- Supporting the agency’s policymaking and internal evaluation and assessment needs
NSF Convergence Accelerator: https://beta.nsf.gov/funding/initiatives/convergence-accelerator
NSF submission to OMB: https://www.reginfo.gov/public/do/PRAViewICR?ref_nbr=202302-3145-003
Click "IC List" for the information collection instruments and "View Supporting Statement" for technical documentation. Submit comments through this webpage.
For AEA members wishing to submit comments, "A Primer on How to Respond to Calls for Comment on Federal Data Collections" is available at https://www.aeaweb.org/content/file?id=5806