An interview with Clara Richards


Clara Richards is a development professional who has worked for CIPPEC and the Overseas Development Institute. She is the current coordinator of the Evidence-Based Policy in Development Network. Earlier this year, Clara and Vanessa Weyrauch (from CIPPEC) carried out an evaluation of INASP’s programme to teach pedagogy skills to those who train African policy makers in the use of evidence. Following this evaluation, Clara participated in a similar training programme in Kuala Lumpur, this time for trainers of Asian policy makers. To complete the cycle, later this year she will co-facilitate a further workshop: pedagogy for trainers of policy makers in Latin America.

Since she has experienced this approach as an evaluator and a participant, and will soon do so as an implementer, we thought it would be interesting to find out Clara’s perspective on INASP’s approach.

Q1. Thanks very much, Clara, for agreeing to take part in this interview. Could you briefly describe the programme that you and Vanessa were asked to evaluate?

The programme we evaluated aimed to build the training abilities of a group of African trainers so that they would be better able to train policy makers on the use of research for policy. It consisted of three phases: the pre-workshop, workshop and post-workshop.

In the pre-workshop phase, participants were asked to write a reflective essay focussing on teaching and learning. The participants were then brought together for a five-day workshop to build training skills. Throughout the workshop, the examples and practical activities concerned relevant training topics. Following the workshop, participants were expected to deliver training to policy makers in the skills needed to access and use research evidence and, where possible, to engage in peer mentoring relationships with other participants.

The objective of the evaluation was to determine the nature and magnitude of the impact of this training programme, in order to help INASP and the Institute of Development Studies (IDS) understand whether this was a cost-effective approach to building the capacity of policy makers to access and use research.

Q2. Why did you decide to get involved in carrying out the evaluation? What about it interested you?

What interested me most was the programme’s approach. Training policy makers is key to helping them improve skills such as using evidence for policymaking, and to encouraging the development of more informed policies; trainers should therefore be experts in the topics they teach. However, there is also a great need to engage better with this (often challenging) audience. Although expertise in topics such as information literacy, writing skills and research communication is growing fast, trainers still feel they lack the skills to engage participants and to make their training interesting and appealing. In Africa especially, the learning approach is usually very “lecturer-centred”, creating a large gap between trainer and participants which hinders learning.

INASP and IDS have introduced a very different approach that bridges this distance and makes training much more effective. Based on constructivist theory, the training starts from the idea that people learn best when they ‘co-construct’ knowledge. The constructivist trainer facilitates the acquisition of new knowledge by acknowledging the wealth of experience in the room, encouraging participants to reflect on previous experiences, and using questioning to elicit the gaps in their knowledge. The trainer then co-constructs the ‘new’ knowledge by filling in those gaps and asking participants to consider how the new theory alters their perceptions of, or approaches to, their current life experiences. I think this approach should be used more often in capacity building activities in general. This evaluation was an excellent opportunity to see how it can be applied and the type of results it yields.

Q3. What would you say were the main lessons from the evaluation? Are there lessons which are of relevance to others working in this field?

I think the main lesson was to realise that although the immediate results were very positive (in the sense that participants reacted enthusiastically to the programme and felt they had acquired new skills and improved their training abilities), as evaluators we need to understand that these immediate reactions are not enough to assess long-term change. For example, although some participants said they acquired new skills during the workshop and (as the trainers noted) performed very well during the teaching sessions, they performed less well when observed after the workshop. This shows that knowing how to deliver good training may not be enough to effect behaviour change in real-world settings. There is a need to keep supporting participants through diverse channels (for example, an e-platform where they can share progress) and to monitor long-term progress (or regression) in order to address attitudinal and behavioural change.

Therefore an important aspect to consider within such programmes is the development of an explicit Theory of Change that clearly represents an overall vision of success, including its preconditions and the links between the different interventions, which could strengthen evaluation and learning. However, this should not be an overly structured, pre-programmed approach, which would be doomed to failure in the dynamic and complex environments where training usually takes place.

Q4. After carrying out the evaluation, you took part in a similar programme as a participant. Did your perspective of the programme change once you had experienced it for yourself?

No, not really. After I carried out the evaluation I was intrigued and enthusiastic about participating as a trainee, since all the African participants had been very excited about the training. It helped me to tie things together and fully understand why the interviewees’ reactions were so positive. Starting by acknowledging what participants already knew and brought to the workshop was a key element in making the learning process more interesting, appealing and worthwhile. Furthermore, I have participated in other training courses and delivered some myself, but in this one I learned many tactics for making the most of the time, using every second available while still keeping participants engaged, which is not easy in such an intense workshop.

Another outstanding aspect was that the training modelled what it taught: the way it was delivered demonstrated how we should deliver our own training. Consequently, it is not surprising that participants feel they have learned and acquired new skills. However, I am concerned about the sustainability of this knowledge and the achievement of behavioural change. One of the conclusions and recommendations we arrived at in the evaluation was the importance of having some kind of follow-up system for trainees: mentoring or continuous learning is needed to develop and embed this type of change. Nevertheless, I am aware that this is what people working in capacity building usually struggle with; unfortunately, resources are scarce!

Q5. What (if anything!) did you learn by participating in the programme? Do you think it will change how you will approach training in the future?

It is very common to believe that if someone has mastered a topic there is a high chance they will deliver training successfully. To some extent, I believed that before participating in the training. However, the way the programme was designed shows that INASP and IDS have thought about the many things an “expert” has to deal with while carrying out training, and I completely changed my opinion about expertise and training! Things that can go wrong include facing questions trainers don’t really know how to answer and dealing with difficult participants. These specific examples made us realise that no matter how much we know about a topic, the way we relate and communicate with our audience shapes the outcomes.

Another aspect I think was key was that when we train people it is fundamental first to acknowledge what participants bring to the table, give them a chance to say what they already know, and build the training up from there. If trainers working in capacity building can incorporate this in their programmes, I believe capacity development would improve considerably! I think we need to really invest in training our trainers in effective, learner-centred pedagogical approaches.

Q6. Now that you have evaluated one programme and participated in another, do you have any particular concerns or anxieties about delivering the same programme in Latin America? What do you imagine will be the biggest challenges?

The importance of working directly with the “demand side” is growing enormously in Latin America, but it is still a challenge. Until now, training has been delivered mainly to members of organisations that want to influence policy. Consequently, there has been significant change in the way researchers approach their investigations and public policies, and awareness of the use of evidence in policymaking has risen considerably. In the last few years, however, the “demand side” has been targeted as an audience that should also receive training for better policymaking.

Training on pedagogical skills therefore comes at a perfect time and will be very much appreciated, since dealing with policy makers in Latin America is a tussle and trainers need to improve their skills to engage this type of audience. Moreover, we know context is fundamental, so a constructivist approach will be key to implementing this training; our challenge is to recognise the previous knowledge participants bring and incorporate it into the workshop. I think (and I hope!) this kind of workshop will boost local training and make engagement with policy makers a more common practice in the region.

 
