About PIALA
PIALA (Participatory Impact Assessment & Learning Approach) is an approach for co-designing and implementing evaluations, MEL frameworks, and strategy and portfolio reviews that focus on achieving transformative change in complex environments. It is a systemic and participatory approach that supports collaborative learning with partners and stakeholders about how multiple interventions and influences interact and combine to generate system change and impact. The approach was initially developed and piloted with the International Fund for Agricultural Development (IFAD) in Vietnam and Ghana between 2011 and 2015.
Drawing on the realist, developmental and transformative/feminist research and evaluation traditions, PIALA offers a tested model for creatively combining different types of methods and data to rigorously assess non-linear contribution to impact with optimal validity and learning value for all stakeholders involved. Pioneering the application of Robert Chambers’ concept of ‘inclusive rigour’ (described in Chapter 4 of his book ‘Can We Know Better?’), it builds on the premise that rigour, validity and utility are enhanced when methods and processes are thoughtfully designed and combined to include multiple views, values, realities, types of knowledge, and ways of knowing. This inclusion leads to a deeper, collective understanding of complex change in more authentic and empowering ways.
Use of PIALA
PIALA has been found particularly useful for assessing and adaptively managing difficult-to-measure programmes or portfolios aiming for transformative system change in contexts characterised by high unpredictability and ‘causal density’ (i.e. many interactions and influences). In such contexts, traditional evaluation approaches that rely on with-without and before-after measurements are often unrealistic and/or insufficient.
Using PIALA helps to tackle three major challenges:
1. The methodological challenge, which is about ensuring ‘rigour’ in assessing causality in complex environments where isolated cause-effect relations do not hold. This is addressed by systematically cross-checking data and analysing emerging patterns of interaction and change against an adaptive Theory of System Change.
2. The utilisation challenge, which is about generating different types of evidence for different users and uses at different moments in time. This is addressed by making methods and processes adaptive to emerging pathways and learning needs, and responsive to local contexts and cultures.
3. The validity challenge, which is about avoiding bias in making value judgments of contribution to impact. This is addressed by including all stakeholder views and perspectives in the analysis, particularly those of change makers and beneficiaries.
“In Impact Evaluation, it may be mixed designs (rather than mixed methods) that are most useful (…) to answer the various questions posed by commissioners and other stakeholders.”
Elliot Stern (2015: 13). Impact Evaluation: A Guide for Commissioners and Managers. Bond.
PIALA-based ‘mixed design’
The PIALA framework helps combine and adapt methods and processes for integrated data gathering and analysis by drawing on:
Two design principles: Evaluating systemically, and enabling meaningful participation.
Five methodological elements: Theory of System Change, multi-stage sampling of/in ‘embedded systems’, participatory mixed-methods, Participatory Sensemaking, and Configurational Analysis.
Three quality standards: rigour (defined as thoughtfully combining methods and processes to overcome bias), inclusiveness (defined as including multiple stakeholder views, values, realities and types of knowledge), and feasibility (defined in terms of the resources and capabilities needed to engage).
Ten design sliders that facilitate trade-offs and design decisions in every new evaluation phase, based on a careful balancing of the three quality standards.
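To make the slider idea concrete, the sketch below models a design slider as a simple data structure whose position is reviewed against the three quality standards. It is an illustrative sketch only: the ten sliders are not enumerated in this document, so the slider names, poles and thresholds used here are hypothetical, not the actual PIALA set.

```python
from dataclasses import dataclass

# The three PIALA quality standards named in the list above.
QUALITY_STANDARDS = ("rigour", "inclusiveness", "feasibility")

@dataclass
class DesignSlider:
    name: str        # hypothetical slider name, for illustration only
    low_pole: str    # design option at one end of the slider
    high_pole: str   # design option at the other end
    position: float  # 0.0 = low pole, 1.0 = high pole, set anew each phase

def review_design(sliders, threshold=0.2):
    """Flag sliders pushed near an extreme, prompting a re-check of the
    trade-off against all three quality standards before the next phase."""
    for s in sliders:
        if s.position <= threshold or s.position >= 1 - threshold:
            print(f"Re-check '{s.name}' ({s.low_pole} <-> {s.high_pole}): "
                  f"position {s.position:.1f} is near an extreme; weigh it "
                  f"against {', '.join(QUALITY_STANDARDS)}.")

# Hypothetical examples of sliders, not the actual ten PIALA sliders.
review_design([
    DesignSlider("scope of participation", "key informants only",
                 "all stakeholder groups", position=0.9),
    DesignSlider("sampling coverage", "few purposive cases",
                 "full multi-stage sample", position=0.5),
])
```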
Typical methods that can be combined and embedded in a PIALA-based mixed design, depending on user requirements, include:
Methods that generate quantified qualitative data about changes along the impact pathways (e.g. signified perceptions and story fragments, relationship change maps, causal flow maps, benefit scorings and rankings), which can be subjected to quantitative analysis if collected at sufficient scale (a minimal aggregation sketch follows this list). Examples are: Rubrics-based Self-Assessments, Mixed-Surveys, Social Network & System Mapping, Constituent Voice, Participatory Statistics, and SenseMaker.
Methods that generate more in-depth systemic explanations of the observed changes through mixed and group-based inquiry and dialogue in a Multi-Case Study design. Examples are: Outcome Harvesting, Sustainable Return on Investment, PhotoVoice, and Participatory Sensemaking. Constituent Voice and Participatory Statistics also apply group-based dialogue and analysis tools that generate systemic explanations.
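As referenced in the first bullet above, the following is a minimal sketch of how quantified qualitative data (here, hypothetical benefit scorings from village focus groups on a 1-5 scale) could be aggregated into simple quantitative summaries once collected at scale. The data, scale and grouping are assumptions for illustration, not outputs of any of the named tools.

```python
from collections import defaultdict
from statistics import mean, median

# Hypothetical benefit scorings (1-5) given by village focus groups; not real data.
records = [
    ("Village A", "income", 4), ("Village A", "food security", 3),
    ("Village B", "income", 2), ("Village B", "food security", 4),
    ("Village C", "income", 5), ("Village C", "food security", 2),
]

# Pool the scores per benefit across villages.
by_benefit = defaultdict(list)
for village, benefit, score in records:
    by_benefit[benefit].append(score)

# Simple quantitative summaries that only become meaningful at sufficient scale.
for benefit, scores in by_benefit.items():
    print(f"{benefit}: n={len(scores)}, mean={mean(scores):.1f}, median={median(scores)}")
```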
In a PIALA-based mixed design, methods are chosen to complement and analytically build upon one another, while also overlapping to allow for systematic cross-checking. They probe for evidence and explanations of planned as well as emerging changes, accounting for both predictable and unpredictable influences and risks, while facilitating the discovery of unknown pathways.
Participatory methods and tools are used in interactive group settings (e.g. facilitated dialogues, focus-group discussions and cross-stakeholder sensemaking events) to foster collective ‘mapping’, ‘measuring’ and ‘valuing’. These are adapted to local contexts and cultures, using local forms of dialogue and knowledge while remaining cognisant of local politics and power dynamics. Contribution Tracing is used for within-case causal analysis, ensuring the evidence is robust. The evidence generated through this methodological synergy underpins the final Configurational Analysis for cross-case comparison and Participatory Sensemaking for cross-validation and collective valuing of contribution.
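To illustrate the cross-case step, the sketch below groups hypothetical cases by their configuration of binary conditions and checks whether each configuration consistently co-occurs with the outcome, in the spirit of a truth-table comparison. The cases, conditions and coding are invented for illustration; the Configurational Analysis in an actual PIALA evaluation works with richer, participatorily validated evidence.

```python
from collections import defaultdict

# Hypothetical cases: binary conditions (1 = present, 0 = absent) and observed outcome.
cases = {
    "District 1": ({"value_chain_link": 1, "group_formation": 1, "credit_access": 0}, 1),
    "District 2": ({"value_chain_link": 1, "group_formation": 0, "credit_access": 0}, 0),
    "District 3": ({"value_chain_link": 1, "group_formation": 1, "credit_access": 1}, 1),
    "District 4": ({"value_chain_link": 0, "group_formation": 1, "credit_access": 1}, 0),
}

# Group cases by their configuration of conditions (a minimal 'truth table').
table = defaultdict(list)
for case, (conditions, outcome) in cases.items():
    table[tuple(sorted(conditions.items()))].append((case, outcome))

# Report which configurations consistently co-occur with the outcome across cases.
for config, members in table.items():
    present = [name for name, value in config if value]
    outcomes = [outcome for _, outcome in members]
    consistent = "consistent" if len(set(outcomes)) == 1 else "contradictory"
    print(f"{' + '.join(present) or 'no conditions'}: "
          f"cases={[case for case, _ in members]}, outcomes={outcomes} ({consistent})")
```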