Monitoring and Evaluation in Disaster Recovery
As disaster and emergency managers, we always have to plan for the unexpected. In this regard, it is important to understand what a disaster is and what qualifies as a hazard in order to react accordingly. A low-magnitude earthquake in the East African region in 2019 was a natural hazard, but it was not a disaster: its intensity was very low and it had no impact on people or property. In contrast, Cyclone Idai in Zimbabwe and Mozambique, also a natural hazard, escalated into a disaster with heavy loss of life and property.
Disasters and emergencies happen when a hazard interacts with a vulnerable population in a way that disrupts lives and communities. We therefore evaluate disasters in terms of their intensity, location, scale, the extent to which they are human-made or ‘natural’, and the vulnerability of the affected population. Of key importance, after a disaster, the efficiency of the response is usually critical to the recovery of the affected community.
When the response is well coordinated and addresses the key needs of the community, focusing on rebuilding with locally available resources, the population bounces back quickly. Our task, as we focus on quick disaster recovery, is to find the most efficient way to handle a disaster and its aftermath. For this to be possible, Monitoring and Evaluation plays a key role in the process.
Monitoring and evaluation (M&E) provides two key things:
- guidance on future intervention activities;
- information on what an intervention is doing, how well it is performing, and whether it is achieving its aims and objectives.
M&E is an essential part of accountability to stakeholders and funding agencies. It is therefore necessary to plan for Monitoring and Evaluation at the beginning of the intervention development process. Monitoring is the regular collection of information about all project activities. It shows whether things are going according to plan and helps project managers identify and solve problems promptly.
Evaluation, on the other hand, seeks to determine whether a project is achieving what it set out to do and whether it is making its projected impact. If the set objectives are being achieved, the evaluation seeks to understand how and why the intervention has worked so well. If the project is unsuccessful, questions are raised as to what could have been done better or differently. An evaluation's main purpose is to keep track of key outcomes and impacts related to the different project components, assessing whether the objectives, aims, and goals are being achieved.
Monitoring is an ongoing process that we engage in throughout an intervention, while evaluations take place at specific times. It is common to start with baseline research toward the beginning of an intervention, to obtain a benchmark against which subsequent changes can be compared. Further evaluations are usually done at intervals during the intervention.
In disaster recovery, we need a Monitoring and Evaluation framework to ensure the programs being implemented are evaluated for effectiveness. Improving the quality of evaluations makes it possible to improve subsequent disaster recovery programs: the lessons we obtain from these evaluations are incorporated into program design and delivery.
Disaster recovery, then, is a set of activities deployed to achieve the desired recovery objective and outcome after a disaster has occurred. In most cases, these programs go above and beyond the usual services that government provides to the same community when it is not affected by disaster. For us, the main focus of these programs is returning the community to a point where it can continue to progress on its own.
Monitoring disaster recovery
We define disaster recovery as a continuous and interactive process through which programs are continually tailored as the affected community's needs evolve and the impact of the disaster changes in scope and intensity. Progress toward sustainability and resilience cannot, therefore, be captured retrospectively. We have to continuously monitor outcomes and how programs are delivering those outcomes. Regular and planned monitoring of disaster recovery outcomes helps ensure that:
- Programs are modified to cater to emerging needs
- Available resources are redirected to other areas of need as earlier targets are achieved
- An early warning system is set up to identify non-performing programs
- Progress toward successful recovery is communicated to the community and other relevant stakeholders
- All the groups involved in delivering recovery programs are accountable for their respective performance
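The early-warning idea above can be sketched in a few lines of code. This is a hypothetical illustration only, not a prescribed M&E tool; the program names, targets, and the 50% threshold are assumptions chosen for the example.

```python
# Hypothetical sketch: flag non-performing recovery programs from
# regularly monitored indicator values. All names and numbers are
# illustrative, not drawn from a real recovery operation.

def flag_non_performing(programs, threshold=0.5):
    """Return the names of programs whose progress toward their target
    falls below the given threshold (a simple early-warning check)."""
    flagged = []
    for name, data in programs.items():
        progress = data["achieved"] / data["target"]
        if progress < threshold:
            flagged.append(name)
    return flagged

# Indicator values as they might look partway through a recovery effort.
monitored = {
    "shelter-rebuild": {"target": 200, "achieved": 150},  # 75% complete
    "water-supply":    {"target": 40,  "achieved": 10},   # 25% complete
    "livelihoods":     {"target": 100, "achieved": 30},   # 30% complete
}

print(flag_non_performing(monitored))  # → ['water-supply', 'livelihoods']
```

In practice the threshold would vary by program and recovery phase; the point is simply that regular monitoring data makes this kind of automatic check possible.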
Monitoring should be followed up with Community Recovery Progress Reports (CRPR). These reports should be compiled in accordance with a timetable set out in the evaluation plan: at least annually, and on a more regular basis in the earlier phases. The CRPR should include sections that:
- report on key indicators significant for progress on a particular recovery program;
- provide appropriate qualitative assessments of recovery progress;
- review key activities performed in the reporting period;
- review key activities to be performed by the next reporting period and the expected outcomes;
- identify where expectations have not been met, discussing why and how best to approach such cases in future;
- report on ways the local community has been involved in the recovery process.
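The sections listed above can be captured as a simple report structure. The sketch below is hypothetical: the field names and example values are assumptions for illustration, and a real CRPR format would be set out in the evaluation plan itself.

```python
# Hypothetical sketch of a Community Recovery Progress Report (CRPR)
# as a data structure. Field names mirror the sections listed in the
# text; example values are invented for illustration.
from dataclasses import dataclass

@dataclass
class RecoveryProgressReport:
    period: str                    # reporting period covered
    key_indicators: dict           # indicator name -> current value
    qualitative_assessment: str    # narrative assessment of progress
    activities_completed: list     # key activities in this period
    activities_planned: list       # key activities for the next period
    unmet_expectations: list       # (expectation, why unmet, future approach)
    community_involvement: str     # how the local community took part

report = RecoveryProgressReport(
    period="2020-Q1",
    key_indicators={"households rehoused": 150, "schools reopened": 4},
    qualitative_assessment="Recovery on track; water supply lagging.",
    activities_completed=["debris clearance", "temporary shelter"],
    activities_planned=["borehole repair"],
    unmet_expectations=[("full water restoration", "pump parts delayed",
                         "pre-position spare parts")],
    community_involvement="Local committees co-run shelter allocation.",
)
print(report.period)  # → 2020-Q1
```

Structuring reports this way makes successive CRPRs directly comparable, which is what allows progress to be tracked against earlier reporting periods.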
Evaluation of Findings
We evaluate recovery programs to determine whether they contribute to the improvement of that recovery effort and of successive recovery efforts. In other words, while evaluations promote accountability (ensuring funds are not misappropriated and programs are delivered as planned), they are also important to ensure that learning takes place and is used to improve the overall program, both current and subsequent.
Evaluation findings should be presented in a way that suits the intended audience, particularly when that audience includes the affected community. A balance needs to be struck between making the findings accessible to affected communities and the broader public, and making them sufficiently comprehensive to inform decision-making.
An evaluation report itself should therefore include:
- a comparison of attained results with those of other similar recovery programs;
- any biases discovered that could limit the scope of the evaluation;
- alternative explanations of the results, with an assessment of how external factors contributed to the overall recovery program;
- positive and negative consequences discovered in the course of the evaluation process;
- a discussion of the extent to which the different data collection methods lead to similar results, and of any differences.
When Monitoring and Evaluation capabilities are involved in the implementation of a disaster recovery program, frameworks can be developed to build the skills needed for anticipated disasters. As emergency managers, we tend to limit our plans and exercises to our current capabilities, with the thought of scaling up when a disaster strikes. In all regards, this is a clear setup for failure. Given the dynamic nature of disasters, it is becoming increasingly important to plan for the maximum impact of any disaster.
Monitoring and Evaluation of disaster recovery processes helps build data models that predefine how capabilities should scale up, enabling responders to work with an expectation of the worst. By not limiting a disaster to our own capabilities, we, the recovery team, are forced to look at alternative means; we think differently. More importantly, it demands that we look for solutions that are not merely scaled-up versions of current systems and practices. Through M&E processes, we are able to change our way of thinking to accommodate changing threats. This is the ultimate objective, and this approach will save the most lives.