WALKING THE TALK WITH PARTICIPATORY IMPACT ASSESSMENT LEARNING APPROACH (PIALA)
On the Oxfam Views & Voices blog, Adinda Van Hemelrijck reflects on how PIALA was adapted for an Oxfam impact evaluation in Myanmar.
In 2015, Oxfam commissioned an impact evaluation of its resilient livelihoods project in the Dry Zone in Myanmar (part of the Effectiveness Review series). This project worked to develop a village-level mechanism, called a Membership Organisation (MO), that could facilitate broad citizen participation in local development and community-led action and advocacy in order to build resilience. Such an inclusive governance mechanism didn’t exist before in that part of Myanmar.
The evaluation sought to assess the extent to which the MOs were functioning effectively (and thus impacting resilience) 18 months after project exit, and to learn about what influenced MO sustainability. The findings needed to be valid for the entire project, which required large enough samples to reach generalisable conclusions (scale), and to spark learning for future voice and governance work, which required more in-depth systemic inquiry (scope). Moreover, Oxfam wanted to ‘walk the talk’ of its Right To Be Heard Framework and uphold the principle of voice and participation in evaluation through the use of a participatory evaluation approach.
Adopting and adapting PIALA
To meet these objectives, we drew on PIALA (Participatory Impact Assessment and Learning Approach), an approach initially developed and piloted with the International Fund for Agricultural Development (IFAD).
PIALA combines five elements, making it possible to assess systemic impact at a larger scale and in contexts where classic counterfactuals don’t work well:
Systemic Theory of Change – To visualise causal claims and engage stakeholders in framing the evaluation and debating the evidence.
Multi-stage sampling of/in ‘open systems’ – To enable systemic inquiry across medium to large populations.
Semi-standard set of participatory mixed methods – To collect and link the data in the sampled ‘systems’ in a systematic and comparable way.
Participatory sense-making model – To engage stakeholders at local and aggregated levels in debating emerging evidence.
Configurational analysis method – To assess systemic change patterns and draw conclusions about the distribution and magnitude of impact.
As Jonathan Lain pointed out, if we want to know and learn about systemic impact, then the system, which is often too big for a classic counterfactual approach, should be our main level of analysis. However, if we focus on the lowest embedded part of the larger system, and have a medium-sized sample of these, it’s often possible to find natural counterfactuals within that sample for parts of the system of interest (for example, a finance mechanism in a value chain system). Configurational analysis can then be used to identify and compare these.
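To make the logic concrete, here is a minimal sketch (in Python) of how sampled ‘systems’ might be grouped by configurations of conditions and compared, so that configurations differing in a single condition act as natural counterfactuals for one another. The condition names, village codes and outcome scores are entirely hypothetical; this is only an illustration of the idea, not the analysis we ran in Myanmar.

```python
# Illustrative sketch only: a toy configurational comparison.
# Conditions, village codes and outcome scores are hypothetical.
from collections import defaultdict
from statistics import mean

# Each sampled 'system' (here: a village-level mechanism) is coded on a few
# binary conditions plus an outcome score derived from the fieldwork.
villages = [
    {"name": "V01", "active_leadership": 1, "govt_engagement": 1, "outcome": 0.8},
    {"name": "V02", "active_leadership": 1, "govt_engagement": 0, "outcome": 0.6},
    {"name": "V03", "active_leadership": 0, "govt_engagement": 1, "outcome": 0.4},
    {"name": "V04", "active_leadership": 0, "govt_engagement": 0, "outcome": 0.2},
    {"name": "V05", "active_leadership": 1, "govt_engagement": 0, "outcome": 0.5},
]

# Group villages by their configuration of conditions.
configs = defaultdict(list)
for v in villages:
    key = (v["active_leadership"], v["govt_engagement"])
    configs[key].append(v["outcome"])

# Configurations that differ in only one condition serve as 'natural
# counterfactuals' for each other: comparing their average outcomes hints
# at the contribution of that condition.
for key, outcomes in sorted(configs.items()):
    print(f"leadership={key[0]}, govt={key[1]}: "
          f"mean outcome {mean(outcomes):.2f} (n={len(outcomes)})")
```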
In Myanmar, the lowest ‘system’ was the MO and its interactions in and outside the community. We did a configurational analysis across a sample of 21 MO villages that proportionally represented the different levels and configurations of MO functioning 18 months after project exit (determined through a rapid survey of all 64 project villages). In total, nearly 1,030 people (44% women) from these 21 MO villages took part in the various participatory methods.
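For readers curious about the sampling mechanics, the sketch below illustrates one way of allocating a 21-village sample proportionally across levels of MO functioning identified in a rapid survey of all 64 project villages. The stratum sizes and village IDs are hypothetical, not the actual survey results, and the code is only indicative of the logic rather than the procedure we followed.

```python
# Illustrative sketch only: proportional allocation of a 21-village sample
# across strata of MO functioning from a rapid survey of 64 villages.
# The stratum counts below are hypothetical.
import math
import random

survey_strata = {"high_functioning": 20, "medium_functioning": 28, "low_functioning": 16}
total_villages = sum(survey_strata.values())  # 64
sample_size = 21

# Proportional allocation with largest-remainder rounding so the sample sums to 21.
raw = {k: sample_size * n / total_villages for k, n in survey_strata.items()}
alloc = {k: math.floor(v) for k, v in raw.items()}
remainder = sample_size - sum(alloc.values())
for k in sorted(raw, key=lambda k: raw[k] - alloc[k], reverse=True)[:remainder]:
    alloc[k] += 1

print(alloc)  # e.g. {'high_functioning': 7, 'medium_functioning': 9, 'low_functioning': 5}

# Random draw of villages within each stratum (village IDs are placeholders).
villages_by_stratum = {k: [f"{k[:4]}_{i}" for i in range(n)] for k, n in survey_strata.items()}
sample = {k: random.sample(villages_by_stratum[k], alloc[k]) for k in alloc}
```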
To build local capacity and enable villagers to collect the data, we kept the design fairly simple and provided detailed facilitation guidance for each of the methods and processes. The village researchers proved to be great listeners, sensitive to manifestations of power, and able to build natural trust and respect. We didn’t get the detailed data for fine-grained systemic analysis that we would normally obtain with professional researchers and more sophisticated designs, but the data were sufficient to reconstruct the causal chain and crosscheck the findings (thus mitigating biases and weaknesses) at each step along the chain in each of the 21 villages.
The result was robust evidence showing the value of the MO model and its unmistakable contributions to building resilience. It validated the project’s Theory of Change but also raised some important considerations around sustainability and point-of-exit, government responsiveness, collective power, and leadership and gender, which Jane Lonsdale discusses in her blog post ‘Local governance and resilience – what lasts after the project ends?’
Limitations
Because of the limited capacity of the village researchers and the limited availability of support staff and funds 18 months after project termination, we had to downscale the sense-making, a key element of PIALA. We skipped the local workshops in the villages (which normally take place during fieldwork) and instead organised a two-day project-level workshop engaging 40-50 people (half the usual number) in discussions about the impact and value of the MO. In short: many local people were left out of the debates!
Nevertheless, one-third of the workshop participants were MO villagers, and this was the first time that they had the opportunity to openly discuss their problems and ambitions with government, local NGOs and donors. This is quite important: even small moments of participation can add up to ripples of empowerment, particularly in contexts such as Myanmar where recent changes in government have the potential to create new windows of opportunity for public participation.
In general, PIALA could do a much better job of examining the impact on gender relations. In Myanmar, as in other PIALA studies, most participatory methods were employed in gender-specific groups. This gave us an idea of gender differences in effects on livelihoods, but didn’t reveal the more hidden and internalised gender dynamics affecting equal opportunities. Integrating gender analysis into the household surveys and participatory methods would help, but would also add to the cost.
To conclude…
Rigorous impact evaluations of systemic change are often more feasible than we think if we are open to looking for creative alternatives for building counterfactual or contrasting evidence. Moreover, it’s important to challenge ourselves to think of rigour in impact evaluation in more inclusive and empowering ways that may contribute to systemic learning by those who are the real protagonists of change. Facilitating people’s engagement as researchers and/or discussants is a powerful way to support this kind of learning. Doing this rigorously, at scale, and in a way that moves beyond data extraction adds to the challenges of present-day evaluation practice (see also our CDI practice paper).