Why and How GAO Did This Study
In 2019, the U.S. government funded more than $42 billion in basic scientific research across a wide range of scientific disciplines. Unsuccessful attempts to reproduce and replicate research results have been documented across many scientific disciplines, including those funded by NASA, NIH, and NSF. The scientific community has expressed concern over the difficulty of replicating prior research results. Promoting rigor and transparency in research has received growing attention over the past decade or more, as evidenced by a series of reports and workshops from the National Academies of Sciences, Engineering, and Medicine (National Academies).
Scientific discoveries can be serendipitous. However, the 2019 National Academies report notes that a body of reliable knowledge tends to be the cumulative product of investigations by successive researchers employing careful design, testing, correction, and confirmation over a period of years. Policymakers; researchers; federal and nonfederal funders of research; academic, corporate, and independent research institutions; professional societies; publishers; journalists; and many others depend on reliable research.
The 2019 National Academies report also highlights that the reliability of research depends, in large part, on methodological rigor. In addition, transparency in reporting the data and analytic methods employed is important to enable other researchers to assess and understand the results. Without such transparency, it can be difficult to test prior work and learn from it.
A number of recent studies attempted to replicate peer-reviewed research results from a wide spectrum of disciplines and found it difficult to do so. The difficulty of replicating some of this research has been due, in whole or in part, to shortcomings in methodological rigor or to a lack of transparency, such that too little information about the research was available to allow another researcher to replicate and confirm the findings. These difficulties have prompted many in the scientific community to call for improvements to protocols and practices among researchers and funders to better assure both rigor and transparency for existing and planned research. Not every type of scientific research lends itself to replication, nor does research need to be entirely replicable to be useful or informative.
GAO was asked to review strategies to improve the reliability of federally funded research. Among other things, this report (1) examines what actions, according to experts, federal agencies could take to foster rigor and transparency in the research they fund; and (2) assesses the extent to which selected federal science funding agencies have taken actions to improve rigor and transparency. GAO conducted a literature review; reviewed NIH, NSF, and NASA documents; and conducted four roundtable discussions with 22 experts. GAO also interviewed agency officials as well as stakeholders from academia, professional societies, publishing, and other parts of the scientific community.
In this report, we (1) outline strategies that, according to stakeholders in the scientific community, are available to promote research rigor and transparency, and factors they say currently discourage wider adoption in the research community; (2) examine what actions, according to experts, federal agencies could take to foster rigor and transparency in the research they fund; and (3) assess the extent to which selected federal science funding agencies have taken actions to improve rigor and transparency.
To address the first objective, we conducted a review of relevant literature spanning 15 years on topics related to the reproducibility and replicability of research and interviewed a non-generalizable sample of stakeholders from across the scientific community. These stakeholders included representatives from research institutions, non-profit and for-profit publishers, libraries, professional societies, private funders, and others. Those we interviewed were not a representative sample of all stakeholders with expertise on research reliability, but each demonstrated extensive involvement with these topics and offered a range of perspectives.
To address the second objective, we carried out a series of interviews and held four roundtable discussions with leading experts from academia and nonprofit research organizations who have focused on this issue and have backgrounds in various scientific disciplines such as geophysics, biochemistry, psychology, and economics. We identified experts through what is termed a “snowball sample”: we selected a core group of experts and surveyed them for the names of other experts they would recommend. We continued in this manner, iteratively, until we had compiled a list of more than 700 experts. Our four discussion groups consisted of 22 experts, chosen primarily based on the number of times each expert had been recommended by their peers and on our assessment of their publications and experience. The roundtable discussions focused on federal actions that could be taken to address challenges to research reproducibility and replicability. To select viewpoints for inclusion in the body of this report, we considered the extent to which a particular topic was discussed, the degree to which the experts agreed or disagreed with one another, and whether the experts provided sufficient support for a particular discussion topic, among other factors. Our analysis of the results of our interviews and roundtable discussions sought to characterize the range of factors that met this threshold. Our approach was designed to capture the range of views, and the associated rationale for those views, rather than to quantify the prevalence of support among experts for any particular view.
To address the third objective, we interviewed officials from the three largest civilian federal funders of basic research in the United States: the National Institutes of Health (NIH), the National Science Foundation (NSF), and the National Aeronautics and Space Administration (NASA). We also spoke with officials from the Office of Science and Technology Policy (OSTP) and the National Institute of Standards and Technology, who have a role in developing related guidance and standards, respectively. We reviewed selected agencies’ policies, procedures, and guidance; applicable laws and regulations; and federal standards for internal control.
What GAO Found
The National Institutes of Health (NIH), the National Science Foundation (NSF), and the National Aeronautics and Space Administration (NASA) are the three largest federal funders of basic scientific research in the United States. According to leading experts GAO interviewed, these agencies could do more to increase the rigor and transparency of the research they fund by taking actions to better align awards and recognition for researchers with more rigorous and transparent research practices. Experts suggested, for example, that agencies could incentivize or mandate that researchers preregister their studies as a means to share their research plans before the research is conducted. Doing so would enable other researchers to comment on and strengthen the methodology and analysis plans. Experts further suggested that agencies help improve standards for data repositories where research data are stored publicly, encourage the publication of null research results, and support training in statistical analysis and study design. Although the scientific community has developed many such practices to enhance research reliability, GAO found that they are not widely adopted because of researcher misconceptions and misaligned incentives in funding and publishing, among other things.
NIH, NSF, and NASA have taken steps to promote and support additional rigor and transparency in research, such as establishing requirements for researchers to disclose research results and associated data publicly. However, these agencies largely rely on grant application reviews and the prepublication peer review process to help ensure research rigor.
GAO found that these agencies do not evaluate the rigor and transparency of the research they fund to help identify strategies for improvement. Specifically, they do not collect indicators of rigorous study design and transparency of research results, such as study sample size, adherence to research plans, or the extent to which research data are findable, accessible, and usable. As a result, the agencies lack information to support changes to the grant-making process and research funding priorities. Federal guidance and Standards for Internal Control in the Federal Government call for agencies to prioritize making federally funded research more rigorous and transparent and to use quality information to achieve agency objectives. Without this information on the research they fund, agencies are limited in their ability to take effective actions to improve research reliability, such as those the experts described to GAO.
GAO is making six recommendations, two each to NIH, NSF, and NASA, to evaluate research using indicators of rigor and transparency and to use this information to inform further actions. NIH and NSF concurred with the recommendations. NASA did not concur with our first recommendation and partially concurred with our second. GAO continues to believe the recommendations are valid.
• The Director of NIH should collect information on relevant indicators of rigor to assess the research projects the agency funds, and implement steps, as needed, to promote strong research practices in future work. (Recommendation 1)
• The Director of NIH should take steps to collect information to determine whether current policies and requirements are adequate to achieve transparency by ensuring research results and data are findable, accessible, and usable, and implement programmatic or policy changes, if needed. (Recommendation 2)
• The Director of NSF should collect information on relevant indicators of rigor to assess the research projects the agency funds, and implement steps, as needed, to promote strong research practices in future work. (Recommendation 3)
• The Director of NSF should take steps to collect information to determine whether current policies and requirements are adequate to achieve transparency by ensuring research results and data are findable, accessible, and usable, and implement programmatic or policy changes, if needed. (Recommendation 4)
• The Administrator of NASA should collect information on relevant indicators of rigor to assess the research projects the agency funds, and implement steps, as needed, to promote strong research practices in future work. (Recommendation 5)
• The Administrator of NASA should take steps to collect information to determine whether current policies and requirements are adequate to achieve transparency by ensuring research results and data are findable, accessible, and usable, and implement programmatic or policy changes, if needed. (Recommendation 6)