In my work with federal agencies over the last 15 years on violence prevention, social-emotional learning, mental health and homelessness, the idea of translating research into practice has become increasingly important. We know there is a gap between what we discover through research and what is applied by practitioners, funders and policymakers.
Over the past decade, federal agencies — and the US Department of Health and Human Services (HHS), in particular — have sought to learn more about the ‘science’ of implementing programmes, practices and policies. They want to invest smartly and do a better job of ensuring that decisions are as evidence-based as possible. These are noble goals — especially during this pandemic, when health and human service organisations are being asked to do things they have never done before, with lightning speed. Unfortunately, it gets complicated fast: each field has its own terminology, frameworks and measures, making it difficult to synthesise information and create a shared body of knowledge across disciplines. So where do we start?
Through a multi-year effort with HHS, colleagues at the American Institutes for Research and I tried to tackle the question posed in the title. We reviewed the literature, interviewed experts, and convened federal staff representing six divisions within HHS and two federal agencies outside HHS to engage in a consensus-building process.
We learned that we had a lot to learn! The group identified key themes related to (1) an intervention (a programme, practice or policy), (2) the intersection between that intervention and its context, and (3) the intervention’s implementation process:
- The intervention: We must understand the evidence supporting a particular intervention, the relevance of that evidence, and the complexity of the intervention.
- The intersection between intervention and context: We must understand the capacity of an organisation to implement any intervention and this specific intervention, and the advantage of this intervention over alternatives and existing practice.
- The implementation process: We must understand the fit between the intervention and the setting, adapt the intervention appropriately, clearly articulate its theory of change, monitor implementation and focus on data-driven improvement.
The example of violence prevention
Let me demonstrate these themes using a study I recently completed with partners from the Center for the Study and Prevention of Violence at the University of Colorado Boulder and $6.2 million of funding from the National Institute of Justice. The study evaluated the implementation and outcomes of a framework, Safe Communities Safe Schools, designed to improve school climate and reduce violence.
Ultimately, we found partial implementation and mixed outcomes. Although these findings could imply that the federal dollars weren’t well invested, these are exactly the types of investment that need to be made to bridge the gap between research and practice. The study yielded rich data in terms of where we found impacts, on what, for whom, and why. For example, we found value in using ‘readiness’ data to:
(1) select schools that had capacity to implement any intervention and the one we were trying to implement, and
(2) assess whether we increased these types of capacity over time.
We found that implementation data can be used to monitor implementation quality and promote data-driven quality improvement, but also to understand the extent to which the intervention benefits all staff, as opposed to only the school-based team we worked with directly. Finally, we found value in clearly articulating a theory of change that included time frames and sequences for when we expected schools to implement different intervention components, and when we expected different outcomes to be influenced. The study taught us a multi-step approach for selecting schools for any randomised controlled trial, methods for assessing progress in systemic change efforts, and which measures capture both focused and school-wide impacts. These lessons have value for the broader field, well beyond this specific study.
We need to continue to invest federal dollars smartly by understanding and describing aspects of the intervention, the context, and the implementation process to promote positive outcomes and improve the lives of young people.
This project was supported by Award No. 2015-CK-BX-K002, awarded by the National Institute of Justice, Office of Justice Programs, U.S. Department of Justice. The opinions, findings, and conclusions or recommendations expressed in this publication/program/exhibition are those of the author(s) and do not necessarily reflect those of the Department of Justice.
DISCLAIMER: The opinions and views expressed in this report are those of the authors. They do not necessarily reflect the views of the U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation.
You can read the original research in Evidence & Policy:
Dymnicki, A., Bzura, R., Osher, D., Wandersman, A., Duplantier, D., Boyd, M., Cash, A. and Hutchison, L. (2020). Important implementation constructs for federal agencies in health and human service settings that are selecting, monitoring, and supporting grantees, Evidence & Policy, DOI: 10.1332/174426418X15409834211096.
Allison Dymnicki is a principal researcher at American Institutes for Research, USA, with extensive expertise in youth development, implementation science, systems change, measurement and methodology, evaluation design and mixed-methods longitudinal research.