Emergency management evaluations: beyond the lessons-learned paradigm

Evaluations, although widely used, are often regarded as complex and confusing.

Simply put, an evaluation is the systematic and objective collection of information to support decision-making about the worth or value of an activity, whether a program, project or intervention. Evaluations may be formative (used to help shape an activity as it proceeds) or summative (conducted at the end of an activity to judge its worth) and are often classified as process, monitoring, outcome or impact evaluations. The sheer range and complexity of specific evaluation methods is probably one reason why evaluations are regarded as varied and confusing.

One approach to evaluation is ‘lessons learned’, which is the predominant method of evaluation in the Australian emergency management sector. Classically, this approach includes operational debriefs, after-action reviews and assurance activities within a quality-improvement philosophy. This sophisticated process of moving from ‘identifying’ lessons to ‘learning and translating’ them into practice, within a learning organisation with a culture of continuous improvement, provides the core principles of the Victorian EM-LEARN Framework1, launched in 2015.

Developing the lessons-learned approach has been a national initiative evolving over recent years. Since 2015, the Inspectors-General for Emergency Management in Victoria2 and Queensland3 have each established comprehensive assurance frameworks for emergency management. The Australian Institute for Disaster Resilience (AIDR) and the Australasian Fire and Emergency Service Authorities Council have conducted the annual Lessons Management Forum4 and the Australian Journal of Emergency Management (AJEM) devoted a special issue in 2018 to lessons-learned papers.5 AIDR has also devoted one of its National Handbooks6 and a specific collection7 to lessons management. However, there is little evidence of other evaluation methods being used in the emergency management sector.

One frequent, anecdotal criticism of the lessons-learned approach is that many reports remain confidential within emergency services organisations and are not available in the public domain to benefit others. Nationally, the lessons-learned paradigm is critiqued as being a case of ‘lessons not learned’. Iain S MacKenzie, then Queensland’s Inspector-General Emergency Management, noted in the 2018 AJEM issue on lessons management:

So, what confidence can we give our key internal and external stakeholders that we really do learn?
My observation is that many processes are overly focused on examining how emergencies were managed rather than considering a complete PPRR [prevention, preparedness, response and recovery] approach. Equally, they also often seem to look for deficiencies rather than actively discovering and sharing the very good practices that occur.
(Australian Journal of Emergency Management, vol. 33, no. 2, p. 4)

Expanding on these reflections, we frame a bigger challenge: does the lessons-learned approach identify whether the intervention or practice actually works? Is it a matter of ‘learning about what happened’, or of determining and adopting ‘what works’?

There are other evaluation strategies, methods and typologies better suited to answering this question and augmenting the lessons-learned approach.8 One method particularly relevant to this argument, because it takes us beyond the lessons-learned paradigm, is the ‘impact evaluation’9, sometimes seen as one form of outcome measurement. The OECD10 defines ‘impacts’ as ‘positive and negative, primary and secondary, long-term effects produced by a development intervention, directly or indirectly, intended or unintended’. An impact evaluation provides information about the impacts produced by an intervention, which might be a program, project, specific action or practice, or a policy. This helps to determine what works and what doesn’t, and why.
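To take a hypothetical example, an impact evaluation of a community bushfire preparedness program would ask not merely whether the program was delivered as planned (a process question) but whether the households that received it became measurably better prepared, or suffered smaller losses, than comparable households that did not.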

Fundamental to an impact evaluation is the requirement for a measure of attribution: can the effects observed be attributed to the intervention? These are complex and challenging evaluations requiring thorough planning and sound design. However, they are achievable in this domain and are being increasingly reported in the disaster-related literature. The Victorian Assurance Framework for Emergency Management2 includes these concepts. Key to the study design is the use of a control group, or the ‘counterfactual’: an estimate of what would have happened had there been no intervention. Continuing the hypothetical example above, if 40 per cent of households that received the program completed a bushfire survival plan, compared with 25 per cent of a similar comparison group, the impact attributable to the program would be 15 percentage points, not the full 40. It all sounds a little mystifying, but such evaluations are nonetheless achievable and their findings can be publicly disseminated.

A repository of evaluation studies, of all types, would facilitate public dissemination and benefit others in the sector. Bodies such as the International Federation of Red Cross and Red Crescent Societies, UNICEF and 3ie provide publicly accessible repositories of evaluation studies. We have previously recommended that Emergency Management Australia develop such a repository for Australian-funded emergency management projects, to be located within the AIDR Knowledge Hub.

In Australia, there are national and state-based grant schemes for emergency management projects, all of which require a ‘report to funder’ and many of which include a structured evaluation component. It is becoming standard practice to allocate 10 per cent of a project’s budget to evaluation. We suggest that it be a requirement of all funded projects that a summary of the final report to the funder, and of the evaluation, be submitted to the proposed publicly accessible Knowledge Hub evaluation repository.

In the humanitarian setting, all projects funded by the Department of Foreign Affairs and Trade (DFAT) through its aid program are required to include a project Monitoring, Evaluation and Learning Plan and a project evaluation report within the funding agreement.11 Failure to undertake such evaluation precludes an agency from future DFAT funding. Publicly accessible monitoring and evaluation reports in the domestic emergency management sector are noticeably lacking, depriving the sector of a rich knowledge resource. One notable exception is the AIDR Monitoring and Evaluation plan released in November 2020.12 This innovative, inclusive and comprehensive plan is publicly available on the AIDR website and serves as a contemporary exemplar and a guide for others to consider as they incorporate monitoring and evaluation plans into their projects.

As is demonstrated in the AIDR Monitoring and Evaluation plan, a key element is a ‘Theory of Change’, or logic model. Evaluations are often interpreted as end-of-project summative activities, but it is increasingly seen as beneficial to use the Theory of Change as a formative process that guides the project’s structure and activities at the beginning of, and throughout, the project.
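As a simple, hypothetical illustration, a logic model for a community preparedness program might link inputs (funding and trained facilitators) to activities (household workshops), outputs (the number of workshops delivered and households reached), outcomes (more households with survival plans) and, ultimately, impacts (reduced losses in future emergencies). An evaluation can then test each link in that chain, during the project as well as at its end.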

Identifying and disseminating what works is worth the effort for the community, with the expectation that, over time, outcomes will improve. It is up to the emergency management sector to respond to the challenge.

Endnotes 

  1. EM-LEARN Framework at: www.emv.vic.gov.au/how-we-help/reviews-and-lessons-management/lessons-management-framework-em-learn.
  2. Victorian Assurance Framework for Emergency Management at: www.igem.vic.gov.au/our-work/assurance-framework-for-emergency-management.
  3. Queensland Assurance Framework for Emergency Management at: www.igem.qld.gov.au/assurance-framework.
  4. 2019 Lessons Management Forum at: www.aidr.org.au/resources/2019-lessons-management-forum/.
  5. Australian Journal of Emergency Management, April 2018, at: https://knowledge.aidr.org.au/resources/ajem-april-2018.
  6. Lessons Management Handbook at: https://knowledge.aidr.org.au/resources/lessons-management-handbook/.
  7. AIDR lessons management at: https://knowledge.aidr.org.au/collections/lessons-management/.
  8. Science and Evaluation in Disasters at: https://wadem.org/wp-content/uploads/2020/04/WADEM-PS-Science-and-Evaluation-in-Disasters.pdf.
  9. Outline of Principles of Impact Evaluation at: www.oecd.org/dac/evaluation/dcdndep/37671602.pdf.
  10. Glossary of Key Terms in Evaluation and Results Based Management at www.oecd.org/dac/evaluation/2754804.pdf.
  11. Department of Foreign Affairs and Trade 2016. Monitoring, Evaluation and Learning Framework, at: www.dfat.gov.au/sites/default/files/ancp-monit-eval-and-learning-framework.pdf.
  12. Australian Institute for Disaster Resilience 2020, Monitoring and Evaluation plan, at: www.aidr.org.au/media/8324/aidr_monitoring-and-evaluation-plan_2020-10-15.pdf.