[Guest post by Rhoda Omenya, iHub Research. Rhoda led the Uchaguzi Monitoring and Evaluation project. Cross-posted from the iHub blog.]

iHub Research conducted a six-month assessment of the Kenya 2013 Uchaguzi deployment. Today, we release the final report detailing the process of deploying the ICT election-monitoring platform during the March 4th Kenya elections. An earlier brief was released in the week immediately following the Kenyan elections in March 2013. The deployment was a collaboration between Hivos, Ushahidi, and CRECO. The research project ran from January 2013 to June 2013 with the following objectives:
  • To document the Uchaguzi process leading up to the Kenya 2013 Elections in order for the technology platform to be replicable in other communities and countries;
  • To use evaluation methods and the documentation to understand how to make the Uchaguzi initiative more sustainable and scalable;
  • To develop a set of metrics to aid in this evaluation and analyze the replicability of the Uchaguzi process so as to provide targeted recommendations for scaling up the initiative.
The objectives were achieved through a thorough literature review of material relevant to this and past deployments, in-depth key informant interviews, a stakeholder analysis, and an after-action review with digital volunteers. Fieldwork with a small sample of citizen users was also conducted to understand the Uchaguzi KE deployment from citizens' perspectives. The resulting information was collated into a set of metrics that were aggregated into categories forming the basis of the evaluation. The evaluation revealed strengths and weaknesses of the deployment, which have been documented as lessons learned that we hope will be applied to make future deployments more effective.

Strengths of the KE 2013 deployment included:

  • The physical situation that provided for close collaboration between the local volunteers and partners;
  • The fact that the technology platform ran for the whole deployment without going down;
  • Wide-scale buy-in and participation from partners and volunteers, without whom the deployment would have been grounded; and
  • A wider geographic spread of reports from around the country than in previous deployments; notably, reports were received from the northern town of Mandera for the first time.
[Map of Uchaguzi reports received, March 11]

The greatest overarching challenges included:

  • Time management: apart from outreach to key partners, every other key aspect of the deployment was delayed, which inevitably affected its quality.
  • An inadequate data management strategy: how to clean, use, and store the data, especially after the deployment. The inability to track information sent to response partners also greatly weakened the feedback loop.
  • Insufficient training of volunteers.
  • Disjointed outreach efforts.
From this deployment, Uchaguzi's proof of concept has been clearly established; the fact that users and partners sincerely believed in the utility of the product highlights the value of such an ICT election-watch initiative. Nonetheless, it is highly recommended that Uchaguzi partners apply the identified lessons learned to their operations and future deployments; otherwise, the same recommendations will be repeated each time a deployment is held, as observed when comparing this report with the evaluation of the Uchaguzi 2010 deployment. With proper project, time, and resource management; management of citizen expectations; and value creation amongst partners, future deployments will improve in their efficiency and impact.

See the Reports

Download the Uchaguzi Monitoring And Evaluation full report. See the Uchaguzi Case Study from 2010.