Context matters for organizational change—but how exactly?  

Emily Hayter, Vanesa Weyrauch and Leandro Echt reflect on working with agencies in Peru and Ghana to help understand how they use evidence in policy

Good use of knowledge and research is essential for well-informed policy decisions. We know there are many different contextual factors that affect how evidence is used in policy. Knowing how all these interrelated factors play out in any specific public agency can be challenging—but without a systematic way of identifying the entry points, it’s difficult for agencies to bring about lasting changes. 

In 2016, INASP and Politics & Ideas (P&I) produced the Context Matters Framework to identify the main factors affecting evidence use, and provide a way to systematically identify how they manifest in any given agency. The Framework draws on the extensive academic literature as well as on interviews with more than 50 policymakers in Africa, Asia and Latin America, to produce a holistic picture of the factors—formal and informal, internal and external—that shape evidence use.    

Last year, INASP and P&I embarked on a journey to pilot the framework as a participatory diagnostic process in specific government agencies. We wanted to see how the tool would work in practice to identify and address the issues facing an agency – and learn how our approach could be improved.   

It was really important to us to find the right agencies to partner with on the pilots. We were looking for agencies which were already interested in strengthening evidence use, and able to commit considerable time and energy to collaborate with us on testing this new tool. We needed agencies who could act as ‘thinking partners’ to design and implement the pilots with us—and who would give us frank feedback to help improve our tool and approach in the future.  

We invited applications from public agencies in our network, and received 22 applications from over 20 countries. We were delighted to find two agencies, operating in very different national contexts and with different mandates: 

  • The Secretariat of Public Administration (SPA) in Peru had developed a proposal with CIES, a national consortium of 48 universities and think tanks. The SPA is in charge of articulating, implementing and evaluating the National Policy for Public Management Modernization. CIES develops inter-ministerial dialogues and studies performed by think tanks and universities that are relevant for the public agenda in Peru. 
  • The Environmental Protection Agency (EPA) in Ghana also submitted a strong proposal. They had strong leadership to champion evidence use as well as a clear mandate to use evidence, and a committed team of expert technical staff. In their application, EPA told us about a number of new initiatives they had underway to improve systematic use of data and evidence, as well as some key remaining areas they thought the Framework pilot could help them understand and improve. 

What did we do? 

Our approach in Peru and then Ghana followed the same broad pathway. In each case we worked with a core team of four to six people with representatives from the government agency, P&I and INASP, and our new case studies outline the approach we took. 

“The facilitation was engaging and encouraged the full participation of staff and other stakeholders. It was more of a discussion than a presentation, which we really enjoyed.” – EPA Representative

Two key principles underpinned our approach  

First, it was essential to have not only the ‘buy-in’ and approval, but the active participation and involvement, of colleagues from the government agency. The agencies made a significant time commitment to the partnership. They led the process of selecting which external stakeholders to consult and securing their participation, and used their own transport and financial resources to facilitate those stakeholders’ involvement in the workshops. They coordinated internally to ensure that the right people were consulted at the right time; supported the workshop facilitation; gave detailed feedback on our data collection tools and reports; and ensured that the process fed into multiple, continually shifting, government conversations and processes. Thus, their role went far beyond simply ‘approving’ or ‘validating’ an externally led process.

“The reflection meetings and the functional feedback loop were very useful introductions, particularly for lessons sharing.” – EPA Representative

Second, we were committed to documenting our learning from these pilots to inform our adaptive approach and understand how we could use the framework in future. We kept detailed records of our impressions as the process developed, and held regular reflection sessions. We incorporated lessons from the first pilot in Peru into the second pilot in Ghana, and have been using our overall lessons to refine both the Framework itself, and our approach to delivering it, in future.   

So what did we learn (and what are we doing about it)? 

  • We’ve broadened our understanding of evidence use in different types of agencies. SPA’s transversal role increased the complexity of the diagnosis as it has such wide and varied interactions with other state agencies, and works largely with data about how to improve public management. EPA’s clearly defined mandate on environmental protection gives it much more of a sector focus, although it also plays an important cross-cutting role. It’s different from the other agencies INASP has worked with in Ghana in that it’s not a ministry but plays a regulatory role, and does not have the same internal research structures as Ghanaian ministries do. This builds on our emerging understanding of evidence use in different types of agencies, from the Department for Environmental Affairs in South Africa to our analysis of evidence use in African parliaments.   
  • Clearly articulating our approach is key so that everyone involved has a shared understanding from the outset. This is especially important since our model involves participation from so many different people. Piloting the Framework has helped us develop a common understanding of the practicalities of the tool and how it can be used. Our experience reminded us that when engaging in a collaboration like this, policymakers are not motivated by the outputs they will receive (a report or a plan), but by the expectation of concrete outcomes, such as strengthened staff capacity, better relationships with stakeholders, streamlined procedures, and reduced duplication in research. We have revised and updated the online version of the Framework with this in mind to hopefully make it clearer—let us know what you think!
  • Effective partnerships are crucial, and need to be thoughtfully nurtured. INASP, Politics & Ideas and the government agency collaborated in a three-way partnership in each country. In Ghana we supplemented this with workshop facilitation support from PACKS Africa, and in Peru CIES was involved as well. Looking back, we all felt that this approach was critical to the success of the project—but it does take time (especially from the government agency), patience, empathy and a sense of humour! We had various reflections about the respective roles of the different non-government partners, and would like to draw more fully on local partners’ time and expertise in future. For us at INASP, this is part of an ongoing reflective process about how to get partnerships right.
  • Be realistic about what can be achieved: we had a lot of reflections about the scope, depth and duration of our diagnostic, the amount and type of data we collected, and how we collected it. Six months is a long time for a government agency to dedicate to supporting such a process. But many of us felt we could have benefited from more time to explore the issues we raised in greater depth. We’ve come up with a tiered model to guide future engagement with partners who want to go through the process, which outlines how the process could work at different levels of time and depth, ranging from a week to a year.

“The framework has exposed the EPA to the critical factors, both internal and external, that influence its work within those dimensions, and has engendered a high commitment by management to promote the production and use of evidence at all levels within the organization.” – EPA Representative

What’s next? 

The framework offers a valuable and unique opportunity to apply what we know from the literature on evidence-informed policy making (EIPM) to a government agency which wants to enhance the way it uses evidence in practice. We’ve done a first round of updates to the Framework, and will continue working on other refinements to the content in the coming months. We’re also excited to be in discussions about various opportunities to use the tool in other ways and new contexts.

We’ll be sharing our insights from Ghana and Peru in more detail over the next few months, via an EPA-INASP presentation at the Africa Evidence Network conference next week, a webinar hosted by Politics & Ideas in October, and a roundtable in Oxford. If you’re keen to hear more, please contact us by email: ehayter@inasp.info or leandroect@politicsandideas.org

Emily Hayter
Emily Hayter is Programme Specialist, Evidence for Policy at INASP.
