Evidence-informed ‘evidence-informed policy and practice’


David Gough, Chris Maidment and Jonathan Sharples

This blog post is based on the Evidence & Policy article, ‘Enabling knowledge brokerage intermediaries to be evidence-informed.’

Research evidence can be useful (alongside lots of other information) in informing policy, practice and personal decision making. But does this always happen? It tends to be assumed that if research is available and relevant then it will be used in an effective, self-correcting ‘evidence ecosystem’, but in many cases the ‘evidence ecosystem’ may be dysfunctional or not functioning at all. Potential users may not demand relevant evidence, may not be aware that relevant research exists, or may misunderstand its use and relevance.

Knowledge brokerage intermediary (KBI) agencies (such as knowledge clearinghouses and What Works Centres) aim to improve this by enabling engagement between research production and research use. We believe that KBIs are essential innovations for improving research use. In this blog, we suggest four ways that they might be further developed by having a more overt focus on the extent to which they themselves are evidence-informed in their work, as we explore in our Evidence & Policy article.

Continue reading

Portable peer review at Evidence & Policy


Zachary Neal, Editor-in-Chief

Evidence & Policy is piloting a new portable peer review policy aimed at reducing inefficiencies in the publication process, and lessening some of the burdens placed on reviewers and authors by the cycle of repeated submissions to different journals. The official policy is available in the journal’s Author Instructions, but this blog post provides some additional background details and rationale for adopting this policy.

Continue reading

What does it mean to use research well?

Joanne Gleeson, Lucas Walsh, Mark Rickinson, Blake Cutler and Genevieve Hall

This blog post is based on the Evidence & Policy article, ‘Quality use of research evidence: practitioner perspectives’.

The use of research to inform practice can play a vital role in improving decision-making and social outcomes. As such, research use has gained widespread attention, with a range of initiatives now in place across sectors, countries and jurisdictions to promote it. Yet what it takes for research to be used on the ground, let alone what quality research use looks like, is not well understood (Sheldrick et al., 2022). Without these understandings, there are real risks that research into research use will stay, as Tseng (2022) suggests, on ‘the proverbial shelf (or website) — far from the action of policy deliberations and decision-making’. This presents a challenge to the research community: to gain robust evidence not only on how research is used by practitioners, but also on what it means to use research well and what it takes to improve it.

In our new article in Evidence & Policy, we address this challenge by presenting findings from an investigation into Australian educators’ views on using research well in practice. Utilising thematic analysis, we draw on survey and interview data from almost 500 Australian educators (i.e., school leaders, teachers and support staff) to examine their perspectives in relation to a previously developed conceptual Quality Use of Research Evidence (QURE) Framework (Rickinson et al., 2020, 2022).

Continue reading

When should scientists rock the boat? Advising government in a pandemic

Paul Atkinson

This blog post is based on the Evidence & Policy article, ‘How did UK policymaking in the COVID-19 response use science? Evidence from scientific advisers’.

Should scientists who want to influence government ‘speak truth to power’, or follow the ‘rules of the game’? Do you make more difference as an outsider or an insider? This matters to any scientist who wants their research findings to have impact. As a former Department of Health civil servant now employed in a university public health department, I often work with my Liverpool and Oxford colleagues on achieving ‘policy impact’, and this question arises each time, but it has never mattered more than in the COVID-19 pandemic. So what is the best way to influence government?

Continue reading

Considerations for conducting consensus in partnered research

Kelsey Wuerstl, Miranda A. Cary, Katrina Plamondon, Davina Banner-Lukaris, Nelly Oelke, Kathryn M. Sibley, Kristy Baxter, Mathew Vis-Dunbar, Alison M. Hoens, Ursula Wick, Stefan Bigsby and Heather Gainforth

This blog post is based on the Evidence & Policy article, ‘Building consensus in research partnerships: a scoping review of consensus methods’.

When reading articles describing a collaborative research decision, such as a research partnership creating a list of research priorities, we often read the statement, ‘The research partnership came to consensus’. But what does this statement actually mean – what is consensus, what does it mean to come to consensus, and how did the research partnership come to consensus?

Research partnerships are characterised by researchers and research users engaging in a collaborative research project to enhance the relevance and usefulness of research findings. Consensus methods require group members to develop and agree to support a solution that is in the group’s best interest. However, simply doing partnered research and using consensus methods does not guarantee that the research addresses the priorities of those most affected, nor that inclusion and power dynamics have been considered. Consensus methods are often poorly reported, missing crucial information about how the research partnership made decisions about the project and how issues of inclusion, equity and power dynamics were navigated.

We conducted a scoping review to better understand how research partnerships use consensus methods in health research and how these research partnerships navigate inclusion and power dynamics. Our findings, published in Evidence & Policy, identified six recommendations to enhance the quality of research teams’ consensus methods.

Continue reading

How do contextual factors influence the development of e-cigarette recommendations?

Marissa J. Smith, Srinivasa Vittal Katikireddi, Kathryn Skivington and Shona Hilton

This blog post is based on the Evidence & Policy article, ‘Contextual influences on the role of evidence in e-cigarette recommendations: a multi-method analysis of international and national jurisdictions’.

The use of evidence in public health decision-making is not as straightforward as it may seem – people have different ideas of what constitutes ‘evidence’, and how it should be interpreted and used in different contexts. Even when there is agreement on what constitutes evidence, research has shown that the same evidence, used in different contexts, can lead to different policy decisions. A current example of this is e-cigarette policies and their recommendations. Our Evidence & Policy article explores how context, broadly defined as the factors that influence decision-making, shapes the role of evidence in developing recommendations and how it may contribute to different policy approaches.

Continue reading

Building trust, managing expectations and overcoming organisational differences: how to solve complex problems through collaboration?

Alexis Dewaele

This blog post is based on the Evidence & Policy article, ‘A grounded theory on collaborative interactions in a community-university partnership: the case of youth in the public space’.

In a society that is steered by complex processes such as globalisation and institutional complexity, we are increasingly confronted with what are sometimes called ‘wicked problems’ (i.e., problems that are difficult or impossible to solve because of incomplete, contradictory and changing requirements that are often difficult to recognise). At Ghent University in Belgium, we were interested in trying to solve such problems by setting up a collaboration with diverse community stakeholders. We sent out a call asking these stakeholders to submit proposals on societal challenges that could be addressed by bringing together various actors and making use of scientific knowledge. An employee of the municipal department of well-being and equal opportunities submitted a case related to antisocial behaviour by youth at a municipal park. The researchers involved selected the case as a pilot project to further investigate collaborative processes. For our study published in Evidence & Policy, we analysed a set of four video-recorded co-creation sessions from this case to better understand how knowledge exchange can actually contribute to problem solving.

Continue reading

For what purposes is research evidence used in legislatures? What are the enablers and hindrances to using evidence in these settings?

Mathieu Ouimet

This blog post is based on the Evidence & Policy article, ‘Use of research evidence in legislatures: a systematic review’.

Our Evidence & Policy article reports the findings from a systematic review of how and for what purpose legislators use research evidence. It also examines legislators’ perspectives on enablers and barriers to using research evidence.

We searched for all published studies, in either English or French, in which the type of use and the barriers and facilitators to legislators’ use of research evidence were empirically examined. We included relevant studies regardless of the year of publication, the country where the study was conducted or the kind of legislature. We found twenty-one studies, most of which were conducted in the United States. There has been a noticeable growth in such studies since the 2010s.

Continue reading

The changing culture of evidence use in local government

Mandy Cheetham

This blog post is based on the Evidence & Policy article, ‘“What I really want is academics who want to partner and who care about the outcome”: findings from a mixed-methods study of evidence use in local government in England’.

Background

It is recognised that closer interaction between academic researchers and those working in policy and practice increases the likelihood of evidence being used to improve outcomes, but progress remains slow. Policymakers and researchers continue to be seen (unhelpfully) as occupying separate worlds, with limited research undertaken into efforts to address this perceived division.

In this blog post, we outline the main messages from a recently published article in Evidence & Policy, which draws on a collaborative, mixed methods study funded by the Health Foundation: Local Authority Champions of Research (LACoR). We explore evidence use in the context of local government from the perspectives of those who work there.

Continue reading

What is ‘evidence’? What is ‘policy’? Conceptualising the terms and their connections

Sonja Blum and Valérie Pattyn

This blog post is based on the Evidence & Policy article, ‘How are evidence and policy conceptualised, and how do they connect? A qualitative systematic review of public policy literature’.

Are policies based on available evidence? Are academic experts willing to provide their expertise? What enables or constrains the effective use of evidence for policymaking?

Public policy scholarship has puzzled over such questions about the relationship between evidence and policy for decades. Over time, ever more differentiated branches of public policy research have developed, which complement and enrich each other. However, they have also developed their own perspectives, languages, and understandings of ‘evidence’, ‘policy’ and their connections.

Such differences in terminology and employed concepts are more than ‘just words’. Rather, attentiveness to careful conceptualisation helps to set clear boundaries for theory development and empirical research, to avoid misunderstandings, and to enable dialogue across different literatures. Against that backdrop, in our article published in Evidence & Policy, we conducted a qualitative systematic review of recent public policy scholarship with the aim of tracing different conceptualisations of ‘evidence’, ‘policy’ and their connections. To be included in our review, research articles needed to address some sort of evidence, some sort of policy, and some sort of connection between the two (a list of all included research articles is available online). The review followed all steps of the PRISMA methodology.

Continue reading