Research and knowledge at the heart of development: Transforming the field in the UK


John Young reflects on evidence-informed policy work in the UK today and the need for wider global discussions about what approaches work

My arrival at INASP coincided with a series of really interesting events which resonated strongly with INASP’s vision of “research and knowledge at the heart of development”. The first was a fascinating two-day meeting organized by Annette Boaz and Kathryn Oliver on “Transforming the field: Use of Research Evidence in Policy and Practice” at the Nuffield Foundation in London. As they say in their own blog on the event, “Our feeling was that the community of researchers working in this area in the UK had become somewhat siloed, and that more cross-disciplinary conversations were needed to help push the field forward”.

In his introductory presentation at the meeting, Huw Davies described the history, issues, achievements and challenges of evidence-informed policy (EIP). Other speakers illustrated both the challenges and some of the solutions. Sarah Hartley talked about what she called “The Illusion of Opening Up” – policy processes that appear to be engaging with a wider range of stakeholders through big consultations, but where the stakeholders are not representative and their opportunity to engage is often very limited. But she also talked about responsible research and innovation, which is being adopted by some of the UK Research Councils, including the EPSRC, and is being applied in UK universities. Jude Fransman spoke about the challenge of equity in research and policy processes and an initiative to establish fair and equitable research partnerships. Brian Head reflected on the political reality of evidence use, which is often highly selective, with evidence chosen to support policies rather than the other way round, and a tendency to favour technological rather than stakeholder knowledge. But he also highlighted how place-based approaches can bring all stakeholders together to find local solutions to local problems. Nancy Cartwright, a philosopher who said she works on “the internal stuff – how can people think better”, described an ESRC-funded project which “aims to construct a radically new picture of how to use social science to build better social policies” and has produced some interesting case studies. And Justin Parkhurst gave an interesting example, from the LINK project, of how the right kind of evidence, presented in the right way, can be very influential.

These initiatives suggest that systems to promote evidence use in the UK are having some positive effect. But there was a general feeling that, despite the massive investment in EIP since the white paper on “Modernising Government” produced during UK prime minister Tony Blair’s time in office, there are still many examples of a lack of evidence use, and a threat that it is all being undermined by the current anti-expert, pro-populist, post-truth mood – at least in the UK and USA.

Researchers, especially in the UK, struggle with university incentives that emphasize publications over engagement with business and society, and with the practical challenge of trying to engage with UK policy processes. There are debates about whether the study of evidence use should become a discipline in its own right, but to me that would seem to undermine the interdisciplinary approach that is necessary to understand it at all. There are also debates about whether the role of research uptake into policy should be professionalized, with a code of conduct, or should remain a diverse community of actors engaging at multiple levels in different sectors. The latter is what seems to be the case in most policy processes that I have studied, and intuitively it seems right to me. And finally, there is recognition that evidence of the value of investing in better evidence systems is essential to maintain commitment to EIP, and that there is currently very little empirical evidence available. However, nearly everyone at the event, like academics generally, was very squeamish about the idea of measuring the impact of their research.

Despite all of this, there are many examples of research informing policy and practice (from the UK’s REF2014 research assessment exercise), and many examples of approaches to embedding evidence use in policy processes in developing and emerging countries, including the establishment of the Department of Planning, Monitoring and Evaluation in South Africa and the Policy Analyst Association in Indonesia. People interested in promoting EIP in the UK could learn much from these, and engage more with people working on these issues globally – and of course, vice versa. What we need are more practical examples of replicable approaches worldwide, funding for a large-scale systematic assessment of these cases, and some rigorous studies of the economic and social benefit of research. Only by doing this can we find out systematically what does and doesn’t work in different contexts. That would contribute to better research uptake generally, and generate the evidence necessary to maintain research funders’ and universities’ commitment to investing in building a better evidence system.

John Young
