Background

The violence that occurred in Jos, Plateau State, in 2010, which claimed many lives and destroyed property worth millions of Naira, prompted UNDP to initiate and manage a project aimed at strengthening the State’s capacity for early warning systems, conflict prevention, management and resolution in Jos. The project is funded by the Government of Norway.

Prior to UNDP’s intervention, Plateau State had been assailed by conflicts of varying levels and intensity, often fueled by the easy availability of illicit Small Arms and Light Weapons (SALW) and by elite manipulation. The conflicts are often linked to the age-long indigene-versus-settler syndrome between the Fulani and the natives, who have cohabited for over a century. The conflict has turned violent repeatedly since 2001, with a cumulative impact that has been catastrophic for people’s livelihoods. Numerous people have been killed, infrastructure has been destroyed and socioeconomic activities have been largely disrupted. As a result, attempts and programmes intended to develop the full potential of the State have been systematically undermined.

The underlying objective of the project, among others, is to ensure that livelihoods support is provided to those affected by conflict and violence, especially youths and widows in Jos. The programme focuses on livelihoods re-orientation and restoration by providing basic knowledge (including vocational training, agricultural training and life skills) to youths and women drawn from the eight conflict-troubled local government areas (LGAs) in Plateau State, cutting across all social divides, and by establishing early warning systems.

The aim of the evaluation is to assess progress in the implementation of the project in terms of what has worked, what has not and why, examine the challenges faced, and ensure accountability for the overall results and impact on people’s livelihoods. The lessons learned and recommendations from the evaluation will inform the design of similar, scaled-up development support in the future, particularly the UNDP response in the North East.

The UNDP Office in Nigeria is commissioning this evaluation under its Governance and Peace Building Unit to capture evaluative evidence of the relevance, effectiveness, efficiency and sustainability of the project in order to strengthen existing and future programmes.

The main objective of this evaluation is to assess the achievements and challenges of the livelihoods promotion and protection for victims of violent conflicts, and to draw lessons that can both improve sustainability and aid the overall enhancement of UNDP programming.

Specifically, the evaluation will assess:

  • The relevance and strategic positioning of UNDP and other partners’ support to Plateau State on livelihoods promotion, early warning systems and the protection of the victims of violent conflict;
  • The framework and strategies that UNDP has devised for its support on early warning, livelihoods promotion and protection in conflict areas, including partnership strategies, and whether they are well conceived for achieving the planned objectives;
  • The progress made towards achieving intended post-conflict outcomes, through the project and advisory services, including contributing factors and constraints that may have hindered more effective implementation;
  • The progress to date and what can be derived in terms of lessons learned for future livelihoods promotion, early warning systems and conflict prevention for victims of violent conflicts;
  • Recommendations for the design of any future early warning, livelihoods and conflict prevention support, based on lessons learned during project implementation.

Duties and Responsibilities

The evaluation will cover the period from the inception of the project in 2013 to date. The evaluation will be conducted during December 2015, with a view to consolidating the gains made while providing strategic direction and inputs to the preparation of the next phase of projects and programming.

Questions guiding the evaluation:

Relevance

  • Was the initial design of the project adequate to address the issues envisaged in its formulation and to provide the best possible support to the Plateau State Government?
  • Has the project remained relevant?

Effectiveness

  • Are the project outputs appropriate, sufficient, effective and sustainable for the desired outcome?

Output analysis

  • Are the project outputs relevant to the outcome?
  • What are the quantities, qualities and timeliness of the outputs? What factors impeded or facilitated their delivery?
  • Are the indicators appropriate to link the outputs to the outcome?
  • Have the outputs been delivered as planned?
  • Which aspects of the project have been most effective so far and which ones are least effective?
  • What key challenges have hampered the delivery of intended outputs?
  • How can the effectiveness of the project be strengthened for future interventions?

Efficiency

  • Was UNDP support to the project appropriate to achieving the desired objectives and intended results? If not, what were the key weaknesses?
  • Has there been an economical use of financial and human resources?
  • Were the results delivered in a reasonable proportion to the operational and other costs?
  • Could a different type of intervention lead to similar results at a lower cost and how could this be incorporated in future programme designs?
  • Did the monitoring and evaluation systems that UNDP had in place help ensure that the project was managed efficiently and effectively?

Effectiveness of UNDP’s contribution and partnerships

  • What evidence is there that UNDP support has contributed towards improvement in the State government’s capacity, including institutional strengthening?
  • Has UNDP worked effectively with other UN Agencies and other international and national partners to deliver the project objectives?
  • How effective has UNDP been in partnering with civil society and the private sector to promote and implement the project?
  • Has UNDP utilized innovative techniques and best practices in its implementation?
  • Is UNDP perceived by stakeholders as a strong advocate for supporting victims of violent conflicts, particularly women, widows and youths, in Plateau State and Nigeria at large?
  • Taking into account the technical capacity and institutional arrangements of the UNDP Country Office, is it well suited to support and implement the project?
  • What contributing factors and impediments have enhanced or impeded UNDP performance in this area?

Sustainability

  • Will the outputs delivered through the project be sustained by State capacities after the project ends? If not, why not?
  • Will adequate funding be available to sustain the project’s functionality over the short, medium and long term?
  • Has the project generated the buy-in and credibility needed for sustained impact?

Resources, partnerships, and management analysis

  • Were project partners, stakeholders and/or beneficiaries involved in the design of the intervention?
  • If yes, what was the nature and extent of their participation? If not, why not?
  • Was the structure and management of the project appropriate to achieving the desired objectives and intended results of the project? If not, what were the key weaknesses?
  • Has the intervention developed the necessary State capacities (both human and institutional) for sustainability?

The evaluation should also include an assessment of the extent to which project design, implementation and monitoring have taken the following cross-cutting issues into consideration:

Human rights

  • To what extent have poor, indigenous and tribal peoples, women and other disadvantaged and marginalized groups benefitted from UNDP’s work in support of livelihoods promotion and conflict prevention?

Gender Equality

  • To what extent has gender been addressed in the design, implementation and monitoring of the project?

Methodology:

The evaluation will be carried out by an external evaluator, and will engage a wide array of stakeholders and beneficiaries, including State and local government officials, donors, civil society organizations, private sector representatives and community members. 

The evaluation is expected to take a “theory of change” (ToC) approach to determining causal links between the interventions that UNDP has supported and observed progress in the State. The evaluator will develop a logic model of how the project is expected to lead to improved support and protection of victims of violent conflicts, particularly women, widows and youths, in Plateau State. The evaluator is expected to construct a theory of change based on the stated objectives and anticipated results.

The following steps in data collection are anticipated:

Desk Review

A desk review should be carried out of the key strategies and documents underpinning the project. This includes reviewing the UNDAF and pertinent country programme documents, as well as project reports to be provided by UNDP.

Field Data Collection

Following the desk review, the evaluator will build on the documented evidence through an agreed set of field and interview methodologies, including:

  • Interviews with key partners and stakeholders;
  • Discussions with UNDP and partner institutions’ senior management;
  • Interviews with relevant project staff;
  • Interviews with Plateau Government officials at State, LGA and community levels;
  • Interviews with other relevant stakeholders including civil society organizations;
  • Field visits to project sites and partner institutions;
  • Survey questionnaires;
  • Participatory observation, focus group discussions where appropriate, and rapid appraisal techniques.

Expected outputs and deliverables

The following deliverables will be expected from the evaluator:

  • Inception Report, detailing the evaluation scope and methodology, including data collection methods and the overall approach. The Inception Report should also contain a detailed work plan with timelines for agreed milestones;
  • The Draft Evaluation Report which will be shared with Plateau State Government, UNDP and funding partners for comments and input; and
  • The Final Evaluation Report, incorporating comments from stakeholders.

One week after contract signing, the evaluator will produce an inception report containing the proposed theory of change for UNDP’s work on livelihoods promotion and conflict prevention in Plateau State, Nigeria. The inception report should include an evaluation matrix presenting the evaluation questions, data sources, data collection and analysis tools, and methods to be used. It should detail the specific timing for evaluation activities and deliverables, and propose the specific site visits and stakeholders to be interviewed. The inception report will be discussed and agreed with UNDP and other stakeholders before the evaluator proceeds with site visits, so that appropriate protocols are established with the respective stakeholders.

The draft evaluation report will be shared with stakeholders, and presented in a validation meeting that the UNDP will organise. Feedback received from these sessions should be taken into account when preparing the final report. The evaluator will produce an ‘audit trail’ indicating whether and how each comment received was addressed in revisions to the final report. 

The suggested table of contents of the evaluation final report is as follows:

  • Title;
  • Table of Contents;
  • List of Acronyms and Abbreviations;
  • Executive Summary;
  • Introduction;
  • Evaluation Scope and Methodology;
  • Evaluation Approach and Methods;
  • Data Analysis;
  • Findings and Conclusions;
  • Lessons Learned;
  • Recommendations; and
  • Annexes

Audience

The evaluation findings are intended mainly for the Plateau State Government, UNDP and funding partners.

Evaluation consultant: required competencies

The desired consultant will work under the overall supervision of the UNDP Deputy Country Director (Programme) and the direct supervision of the Governance and Peacebuilding Team Leader. The consultant will have overall responsibility for the quality and timely submission of the final evaluation report. Specifically, s/he will perform the following tasks:

  • Develop the inception report, detailing the evaluation scope, methodology and approach;
  • Conduct the project evaluation in accordance with the proposed objective and scope of the evaluation and UNDP evaluation guidelines;
  • Liaise with UNDP on travel and interview schedules;
  • Draft and present the draft and final evaluation reports;
  • Finalize the evaluation report and submit it to UNDP and partners.

Evaluation Ethics

The evaluation must be carried out in accordance with the principles outlined in the UNEG ‘Ethical Guidelines for Evaluation’, and the evaluator must sign the Ethical Code of Conduct for UNDP Evaluations. In particular, evaluators must be free and clear of perceived conflicts of interest. To this end, interested consultants will not be considered if they were directly and substantively involved, as an employee or consultant, in the formulation of UNDP strategies related to the project under review. The UNDP code of conduct and an agreement form will be signed by the consultant.

Implementation Arrangements

The UNDP Nigeria Country Office will select the consultant through a competitive process and will be responsible for the management of the evaluator. UNDP has designated a focal point for the evaluation and additional staff to assist in facilitating the process (e.g., providing relevant documentation, arranging introductory letters for visits/interviews with key informants). The Country Office will take responsibility for the approval of the final evaluation report. The consultant will take responsibility for setting up meetings and conducting the evaluation, subject to advanced approval of the methodology submitted in the inception report. The UNDP Country Office will develop a management response to the evaluation within two weeks of report finalization.

While the Country Office will provide some logistical support during the evaluation, for instance assisting in setting up interviews with senior government officials, it will be the evaluator’s responsibility to arrange and finance travel to and from the relevant project sites and to arrange most interviews. Planned travel and associated costs will be included in the Inception Report and agreed with the Country Office.

Timeline and schedule:

The evaluation assignment will commence in December 2015. The duration of the assignment is a maximum of 25 working days, including the writing of the final report.

Cost and financing:

The anticipated costs of the evaluation will be financed by UNDP. The budget should provide a breakdown of the resources required for:

  • Consultant’s professional fees;
  • Local travel costs;
  • Draft Report presentation workshop.

Documents for Review

UNDP Corporate Policy Documents:

  • Handbook on Monitoring and Evaluation for results;
  • UNDP Guidelines for Outcome Evaluators;
  • UNDP Result-Based Management: Technical Note;
  • UNDG RBM Handbook.

UN/UNDP Nigeria Country Office Documents:

  • Livelihoods promotion and conflict prevention programme document;
  • Project Annual Work Plans, Back to Office Mission Reports and Progress Reports;
  • Nigeria United Nations Development Assistance Framework (UNDAF) 2014 – 2017;
  • UNDAF 2010 – 2013;
  • UNDAF 2010 – 2013 Evaluation;
  • Country Programme Document for Nigeria 2014 – 2017.

Evaluation Report Template and Quality Standards

Title and opening pages — should provide the following basic information:

  • Name of the evaluation intervention;
  • Time frame of the evaluation and date of the report;
  • Countries of the evaluation intervention;
  • Names and organizations of evaluators;
  • Name of the organization commissioning the evaluation;
  • Acknowledgements

Table of contents — should always include boxes, figures, tables and annexes with page references.

List of acronyms and abbreviations

Executive summary — A stand-alone section of two to three pages that should:

  • Briefly describe the intervention (the project(s), programme(s), policies or other interventions) that was evaluated;
  • Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses;
  • Describe key aspects of the evaluation approach and methods;
  • Summarize the principal findings, conclusions and recommendations.

Introduction — Should:

  • Explain why the evaluation was conducted (the purpose), why the intervention is being evaluated at this point in time, and why it addressed the questions it did;
  • Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why, and how they are expected to use the evaluation results;
  • Identify the intervention (the project(s), programme(s), policies or other interventions) that was evaluated — see the description of the intervention below;
  • Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report’s intended users.

Description of the intervention — provides the basis for report users to understand the logic of the intervention, assess the merits of the evaluation methodology and understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation.

The description should:

  • Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address;
  • Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy;
  • Link the intervention to national priorities, UNDAF priorities, corporate multiyear funding frameworks or strategic plan goals, or other programme or country specific plans and goals;
  • Identify the phase in the implementation of the intervention and any significant changes (e.g., plans, strategies, logical frameworks) that have occurred over time, and explain the implications of those changes for the evaluation;
  • Identify and describe the key partners involved in the implementation and their roles;
  • Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component;
  • Indicate the total resources, including human resources and budgets;
  • Describe the context of the social, political, economic and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes; and
  • Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation scope and objectives — the report should provide a clear explanation of the evaluation’s scope, primary objectives and main questions.

  • Evaluation scope — the report should define the parameters of the evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed;
  • Evaluation objectives — the report should spell out the types of decisions evaluation users will make, the issues they will need to consider in making those decisions, and what the evaluation will need to achieve to contribute to those decisions;
  • Evaluation criteria — the report should define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the evaluation;
  • Evaluation questions — Evaluation questions define the information that the evaluation will generate. The report should detail the main evaluation questions addressed by the evaluation and explain how the answers to these questions address the information needs of users.

Evaluation approach and methods — The evaluation report should describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation questions and achieved the evaluation purposes. The description should help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations. The description on methodology should include discussion of each of the following:

  • Data sources — the sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions;
  • Sample and sampling frame — If a sample was used: the sample size and characteristics; the sample selection criteria (e.g., single women, under 45); the process for selecting the sample (e.g., random, purposive); if applicable, how comparison and treatment groups were assigned; and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results;
  • Data collection procedures and instruments — Methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source and evidence of their reliability and validity;
  • Performance standards — the standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales);
  • Stakeholder engagement — Stakeholders’ engagement in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results;
  • Ethical considerations — The measures taken to protect the rights and confidentiality of informants (see the UNEG ‘Ethical Guidelines for Evaluators’ at www.unevaluation.org for more information);
  • Background information on evaluators — The composition of the evaluation team, the background and skills of team members and the appropriateness of the technical skill mix, gender balance and geographical representation for the evaluation;
  • Major limitations of the methodology — Major limitations of the methodology should be identified and openly discussed as to their implications for evaluation, as well as steps taken to mitigate those limitations.

Data analysis — the report should describe the procedures used to analyze the data collected to answer the evaluation questions. It should detail the various steps and stages of analysis that were carried out, including the steps to confirm the accuracy of data and the results. The report also should discuss the appropriateness of the analysis to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.

Findings and conclusions — the report should present the evaluation findings based on the analysis and conclusions drawn from the findings.

  • Findings — should be presented as statements of fact that are based on analysis of the data. They should be structured around the evaluation criteria and questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results should be explained, as well as factors affecting the achievement of intended results. Assumptions or risks in the project or programme design that subsequently affected implementation should be discussed.
  • Conclusions — should be comprehensive and balanced, and highlight the strengths, weaknesses and outcomes of the intervention. They should be well substantiated by the evidence and logically connected to evaluation findings. They should respond to key evaluation questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision making of intended users.

Recommendations — the report should provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. They should address sustainability of the initiative and comment on the adequacy of the project exit strategy, if applicable.

Lessons learned — As appropriate, the report should include a discussion of lessons learned from the evaluation, that is, new knowledge gained from the particular circumstances (intervention, context, outcomes, even evaluation methods) that is applicable to similar contexts. Lessons should be concise and based on specific evidence presented in the report.

Report annexes — Suggested annexes should include the following to provide the report user with supplemental background and methodological details that enhance the credibility of the report:

  • Terms of Reference for the evaluation;
  • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.), as appropriate;
  • List of individuals or groups interviewed or consulted and sites visited;
  • List of supporting documents reviewed;
  • Project or programme results map or results framework;
  • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators;
  • Code of conduct signed by evaluator/consultant.

Competencies

  • Demonstrated analytical, communication and technical report writing skills;
  • Strong working knowledge of the UN and its mandate in Nigeria, and more specifically the work of UNDP in support of government and civil society in Nigeria.

Required Skills and Experience

Education:

  • University degree in public administration, political science, conflict prevention & peacebuilding, economics, development planning, business administration, law or other relevant qualifications.

Experience:

  • Extensive experience in conducting evaluations, with a strong working knowledge on institutional capacity building/development and state building;
  • Extensive knowledge of results-based management (RBM) evaluation, and of participatory monitoring and evaluation methodologies and approaches;
  • Minimum of 7 years of professional expertise in national development cooperation, livelihoods promotion and conflict prevention support programming issues, programme/project evaluation, impact assessment, development of programming/strategies, gender equality and social services;
  • At least 5 years of experience in conducting evaluations of government and international aid organizations, preferably with direct experience with civil service capacity building;
  • Good professional knowledge of the Nigerian governance context.

Language:

  • Fluent in English.