Are evidence-based policy and democratic equality reconcilable?


Tine Hindkjaer Madsen

This blog post is based on the Evidence & Policy article, ‘Reconciling science and democracy: evidence-based policy as seen from the perspective of a role-based democratic theory’.

For policy to be effective, it must be informed by reliable evidence, proponents of evidence-based policy argue. While this may be true, the evidence-based policy ideal nevertheless also conflicts with the requirements of democracy. This is because political equality is an essential element of democracy and evidence-based policy confers superior political influence on those who supply the evidence relative to ordinary citizens.

In my paper recently published in Evidence & Policy, I reflect on whether the evidence-based policy ideal is reconcilable with democratic equality after all. I first argue that evidence-based policy in fact also advances the value of political equality. Political equality requires that citizens be the choosers of political aims, and utilising appropriate, high-quality evidence is the most reliable method of identifying how to achieve citizens’ aims. That is of course not to say that utilising appropriate, high-quality evidence will always lead to true beliefs about how to achieve a political aim, but it is the body of information most likely to be true, and utilising it therefore makes it more likely that citizens’ aims will be realised.

Continue reading

Big voter is watching you: how politicians evaluate expertise


Anina Hanimann

This blog post is based on the Evidence & Policy article, ‘How perceptions of voter control affect politicians’ evaluations of expertise in the news: a survey experiment on the role of accountability beliefs’.

The news serves as a crucial source of expertise for members of parliament (MPs), offering them cost-effective policy advice. However, the public nature of expertise in the news can significantly influence how MPs perceive and evaluate such expertise. Politicians who feel under intense scrutiny by their constituents may be more inclined to make decisions that align with public opinion, are easily justifiable, or simply appear to be the ‘right’ choice. These motivations can significantly shape the evaluation of expertise presented in the media.

My recent study in Evidence & Policy delves into this complex dynamic. I explore whether MPs’ assessments of expertise in news media differ depending on their perceptions of voter control. To investigate this, I analysed survey data from Swiss cantonal members of parliament, who were tasked with evaluating the credibility of expert statements.

Continue reading

Everybody can claim that a practice or policy is evidence-based. But when is it justified to do so?


Christian Gade

This blog post is based on the Evidence & Policy article, ‘When is it justified to claim that a practice or policy is evidence-based? Reflections on evidence and preferences’.

When you search the internet, you will find a myriad of claims about different practices or policies being evidence-based. To avoid ‘evidence-based’ becoming merely a buzzword that everyone can throw around and use whenever they deem it suitable, it is important to consider the conditions for when it is justified for you as an individual or organisation to claim that your practice or policy is evidence-based.

My argument is that this is the case if, and only if, three conditions are met – an argument that suggests that it depends on subjective preferences whether you are justified in claiming that your practice or policy is evidence-based, and that it is important to give more attention to the normative dimension of the field of evidence-based practice and policy.

Continue reading

Scaling-up user-engagement in education research: drawing insights from Canada’s experience


Bernadine Sengalrayan and Blane Harvey

This blog post is based on the Evidence & Policy article, ‘Engaging knowledge users in Canadian knowledge mobilisation research: a scoping review of research in education’.

In the ever-evolving landscape of education policy and practice, research has a critical role to play in informing planning and action. However, in many countries, the link between education research conducted by academics and the potential users of that research in schools and other educational settings is not robust. Knowledge mobilisation (KMb) approaches are seen as an important way to bridge the gap between research production and its practical application in any number of settings, including education.

To better understand if KMb practices are helping to inform educational policy and practice, we explored the changing dynamics of research producer-user connections in Canadian K-12 teaching and education policy. Here are some of the highlights from our findings.

Continue reading

Learning from failure: improving behavioural health treatments through understanding mis-implementation

Grace Hindmarch, Alex R. Dopp, Karen Chan Osilla, Lisa S. Meredith, Jennifer K. Manuel, Kirsten Becker, Lina Tarhuni, Michael Schoenbaum, Miriam Komaromy, Andrea Cassells and Katherine E. Watkins

This blog post is based on the Evidence & Policy article, ‘Mis-implementation of evidence-based behavioural health practices in primary care: lessons from randomised trials in Federally Qualified Health Centers’, part of the Special Issue: ‘Learning from Failures in Knowledge Exchange’.

“This is disappointing, but I agree we’ve done the best we can.” 

– CEO of a rural health care system

In October 2021, a rural healthcare system in the US discontinued implementation of a new program to improve access to quality care for patients with co-occurring opioid use disorder and mental health disorders. The program’s mission, fuelled by passion for patients, was to keep patients with complex needs from falling through the cracks. After two years of immense effort, the system experienced ‘mis-implementation’. Mis-implementation refers to unsuccessful efforts to implement treatments in real-world settings. Although it is a disappointing outcome, studying mis-implementation can provide insights to improve processes and make changes more successfully in the future.

Continue reading

Productive interactions without impact?

Magnus Gulbrandsen and Silje Maria Tellmann

This blog post is based on the Evidence & Policy article, ‘Productive interactions without impact? An empirical investigation of researchers’ struggle to improve elderly’s oral health’, part of the Special Issue: ‘Learning from Failures in Knowledge Exchange’.

Even if researchers do everything that is expected of them – collaborate with stakeholders, target important societal problems, engage in intensive science communication – societal impact may still not happen. What are the possible explanations?

A recurring observation in studies of the societal impacts of research is that substantial change typically involves a great deal of ‘productive interaction’ between stakeholders and researchers. However, not all interactions provide the desired societal impacts, as our empirical study of a cross-disciplinary research group focused on improving oral health in the elderly shows. In our Evidence & Policy article, we examined the subtleties of productive interactions and the intricate web of stakeholders, to shed light on the gaps that keep research efforts from having the desired societal impact.

We followed a group of researchers for six years, and even though they carried out many of the recommended activities to make impact happen, they were unable to achieve the expected outcomes. Even though many events took place that may, in an optimistic perspective, prepare the groundwork for future impact, no decisions in policy or practice targeting the oral health of the elderly emerged. To analyse this process, we began by considering the oral health of the elderly as a problem area in which a wide range of stakeholders have a stake, but with varying interest, sense of urgency or capacity to make changes happen.

Continue reading

What can we learn from co-production approaches in voluntary sector evaluation work?


Louise Warwick-Booth, Ruth Cross and James Woodall

This blog post is based on the Evidence & Policy article, ‘Obstacles to co-producing evaluation knowledge: power, control and voluntary sector dynamics’, part of the Special Issue: ‘Learning from Failures in Knowledge Exchange’.

Co-production has been increasingly discussed as a positive and useful approach in health and social care research, based on principles such as partnership working, reciprocation, power sharing and the appreciation of all expertise. We have used co-production values to inform our evaluation work for many years, but in our Evidence & Policy article we reflect upon the challenges that such approaches bring, specifically in relation to sharing findings, known as knowledge exchange. Our article discusses evaluation work across three interventions that constitute perhaps the most challenging of our experiences in over a decade of such work. Conflict in evaluation work remains largely underreported, but we feel our experiences provide a useful contribution for readers.

Continue reading

Evidence & Policy Call for Papers – Special Issue on Learning through Comparison

Special Issue Editors: Katherine Smith, Valerie Pattyn and Niklas Andersen

Evidence & Policy is pleased to invite abstracts for papers that explicitly employ comparative analysis and/or that develop insights about evidence use in policy through comparison. Authors of selected abstracts will be invited to submit a full paper for consideration for inclusion in a special issue that is aiming to demonstrate the conceptual and empirical contribution that comparative research can offer scholarship on evidence and policy.

Continue reading

Why failure isn’t the f-word in knowledge brokering


Stephen MacGregor

This blog post is based on the Evidence & Policy article, ‘Theorising a spectrum of reasons for failure in knowledge brokering: a developmental evaluation’, part of the Special Issue: ‘Learning from Failures in Knowledge Exchange’.

Failure often gets a bad rap, especially in professional settings. It’s usually seen as a waste of time and resources, something to steer clear of. But failure is not just an unfortunate outcome; it can be a crucial learning opportunity.

Particularly in higher education, the pressure is on for academics and universities to show the real-world impact of research. Here, knowledge brokers play a critical role: they are the human force behind efforts to connect research production and use contexts. Yet, the challenges and failures that these professionals face are not often discussed.

My recent Evidence & Policy article aimed to shed light on the spectrum of reasons for failure in the professional practice of knowledge brokering, drawing on a set of semi-structured interviews with a network of knowledge brokers. To understand knowledge brokers’ experiences, two frameworks were integrated: (a) the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, and (b) Dr. Amy Edmondson’s Spectrum of Reasons for Failure framework.

Continue reading

Learning from failures in knowledge exchange: how hard can it be?


Peter van der Graaf, Ien van de Goor and Amanda Drake Purington

This blog post is based on the Evidence & Policy article, ‘Learning from failures in knowledge exchange and turning them into successes’, which introduces the Special Issue: ‘Learning from Failures in Knowledge Exchange’.

We don’t like talking about failures, as it signals loss of time, resources and reputation, but failures present opportunities for learning in knowledge exchange. However, this requires a ‘failure culture’ in academia and policy, in which failures are no longer avoided but actively encouraged. To learn how to turn failures into successes, we need to share and publish our failures, have early engagement with all stakeholders in the knowledge exchange process, and make more use of boundary spanners.

There are plenty of papers celebrating successes in knowledge exchange, but not many researchers and policy makers talk openly about their failures. However, learning from failures is just as important as celebrating successes, if not more so. Allowing partners to reflect in a safe space on knowledge exchange practices and research projects gone wrong, in which communication broke down, partners did not engage or dropped out, and evidence was not taken up or ignored, will provide important lessons on how knowledge exchange practices and research can be improved.

At the 5th Fuse conference on knowledge exchange in public health, held in Newcastle, UK on 15-16 June 2022, we created such a space by bringing together over 100 academic researchers, policy makers, practitioners, and community members to share and reflect on their failures and how to turn them into success. Our special issue brings together selected papers from the conference and papers that were submitted in response to an open call afterwards. From 23 original submissions from 14 different countries (including the UK, USA, Canada, Norway, Switzerland, Kenya, Chile, South Korea and Portugal) and from a range of disciplines and areas of focus (Public Health, Primary Care, Oral Health, Sociology, Anthropology, Public Management, Policy-Making, and Community and Voluntary Sector), we invited four research papers and three practice papers for full submissions.

Continue reading