Objectives of the assignment:
A mid-term evaluation will be conducted to measure project progress qualitatively and quantitatively. It will assess whether and how the capacity development (CD) support provided to different institutions, organizations and individuals has resulted in the expected and needed impact at the enabling-environment, organizational and individual levels, and whether and how the project is developing the public administration reform (PAR) management and coordination capacity of the Independent Administrative Reform and Civil Service Commission (IARCSC). The evaluation also aims to assess the relevance of the project and its design. The results of the mid-term evaluation will be incorporated into the policies, strategies and plans of the project, and will inform future UNDP interventions in capacity building and national institution building.
Scope of Assignment:
The project, initiated in January 2010, is set to be completed by December 2013. The scope of the mid-term evaluation covers the following objectives:
- To examine to what extent the project has achieved the intended outputs, and in what specific areas the project excelled or failed in progress toward intended outputs.
- To indicate whether or not intended project impacts and outcomes are being met or, for specific outcomes, whether satisfactory progress has been made.
- To analyse underlying factors that are influencing project impacts and outputs.
- To identify and analyse barriers and constraints that have delayed implementation, including challenges emanating from internal and external sources.
- To identify a list of 'lessons learned' and make recommendations for corrections, including in relation to i) the specific CD services provided by the project (are they appropriate for the client, and do they respond to the demands and felt needs of the client?); and ii) the sourcing of capacity (what is the general experience, and are steps being taken to source capacity from institutions and networks beyond those considered under the project?).
- To state whether or not targets are being achieved and whether current and planned outputs can be sustained, including determination of measures needed to ensure continued sustainability of results in the future.
- To identify and extract the lessons learned and best practices that can be considered in the planning and design of future support activities for government, and to make recommendations on the future direction and areas of focus for the next phase of the project.
- To identify and recommend opportunities and entry points to synergize with other UNDP projects’ CD interventions.
Mid-term Evaluation questions include:
- Are the intended outputs being achieved?
- To what extent have NIBP outputs and assistance contributed to the relevant outcomes?
- What internal or external factors have contributed to achieving or not achieving the intended results?
- Are the project design and intended outputs still consistent with national priorities and in synergy with similar interventions by other donors?
- What factors contributed to effectiveness or ineffectiveness?
- What additional recommendations would improve service delivery within the respective ministries and departments?
- What should be the UNDP approach in similar interventions in the future?
- What roles should NIBP play to establish effective synergies between other UNDP projects’ CD interventions?
An Evaluation Inception Report is required, but a final decision about the specific design and methods for the evaluation will emerge from consultations among the programme unit, the evaluators, and key stakeholders about what is appropriate and feasible to meet the evaluation purpose and objectives and answer the evaluation questions, given limitations of budget, time and extant data.
Key stakeholders are:
- Independent Administrative Reform and Civil Service Commission
- General Director of Programs’ Design and Management
Ministries in which NIBP is engaged at the national level include:
- Ministry of Agriculture, Irrigation & Livestock;
- Ministry of Education;
- Ministry of Labor, Social Affairs, Martyrs and Disabled;
- Central Statistic Organization
- Ministry of Transport and Civil Aviation;
- Deputy Ministry of Youth Affairs
Subnational partnerships are occurring with the Ministry of Agriculture, Irrigation & Livestock in the provinces of Bamyan, Herat, and Mazar-e-Sharif, and with the Ministry of Labor, Social Affairs, Martyrs and Disabled in the province of Herat. NIBP's subnational presence with the Independent Administrative Reform and Civil Service Commission is in Jalalabad and Mazar-e-Sharif.
Evaluation team composition and required competencies:
The evaluation team will be composed of one international and one national consultant not involved with the formulation, appraisal, approval, implementation or daily management of the project. The team will be selected by the NIBP/UNDP Evaluation Support Team, which will provide logistical support during the evaluation.
The consultants are expected to be highly qualified in capacity development and governance. They shall have a minimum of a Master's degree from an accredited and recognized university in international development, political science, public administration, public policy or governance, and at least 7 years' experience in capacity development, institution building and/or governance. At least one team member should preferably be an evaluation specialist experienced in the specific evaluation methodologies that will be employed for this evaluation. The evaluation team should also possess a broad knowledge and understanding of the major economic and social development issues and problems in Afghanistan. Background in, or familiarity with, conflict and post-conflict situations may also be required, both for the conduct of the exercise itself and for understanding the particular context of the evaluation.
Evaluations in UNDP will be conducted in accordance with the principles outlined in the UNEG ‘Ethical Guidelines for Evaluation.’ Evaluations should be carried out in a participatory and ethical manner and the welfare of the stakeholders should be given due respect and consideration (human rights, dignity and fairness). Evaluations must be gender and culturally sensitive and respect the confidentiality, protection of source and dignity of those interviewed.
Evaluation procedures should be conducted in a realistic, diplomatic, cost-conscious and cost-effective manner; must be accurate and well-documented and deploy transparent methods that provide valid and reliable information. Evaluation team members should have an opportunity to disassociate themselves from particular judgments and recommendations. Any unresolved differences of opinion within the team should be acknowledged in the report.
Evaluations should be conducted in a complete and balanced manner so that the different perspectives are addressed and analyzed. Key findings must be substantiated through triangulation. Any conflict of interest should be addressed openly and honestly so that it does not undermine the evaluation outcome. Evaluators should discuss, in a contextually appropriate way, those values, assumptions, theories, methods, results, and analyses that significantly affect the interpretation of the evaluative findings. These statements apply to all aspects of the evaluation, from its initial conceptualization to the eventual use of findings.
The rights and well-being of individuals should not be affected negatively in planning and carrying out an evaluation. This needs to be communicated to all persons involved in an evaluation, and its foreseeable consequences for the evaluation discussed.
The mission is expected to take a total of four weeks: a two-week field mission to conduct the evaluation in Kabul and in provinces deemed integral to the evaluation (subject to flight availability, security risk management and any resulting travel restrictions), one week to produce the draft report, and one week to finalize the report.
Logistical support, security clearance, and travel arrangements will be made by the evaluation support team. Office space and necessary equipment will be provided in the NIBP office in UNOCA on Jalalabad Road.
Time-frame for the evaluation process:
The evaluation will take place in 31 working days over a period of three months, including 4 days for international travel.
- 4 days Desk Review and Inception Report/Work Plan
- 4 international travel days to and from Kabul
- 15 days (maximum) in-country beginning on September 1st, 2012.
- a. includes in-country travel time to provinces which may take upwards of a day one-way
- b. includes days where security risks may inhibit or restrict travel
- 5 working days on the first draft of the Mid-Term Evaluation Report (due September 24th)
- 3 working days on the finalization of the Report (due October 11th)
Activity | Number of days | Dates for the Activity | By Whom
Desk Review | 3 days | 21-24 August | Evaluation Team
Inception Report and Work Plan designed and submitted to NIBP/UNDP Evaluation Support Team | 1 day | 25 August | Evaluation Team
Approval of Work Plan | - | 27 August | NIBP/UNDP Evaluation Support Team
Notification of Stakeholders of the Evaluation Schedule | - | 28 August | NIBP/UNDP Evaluation Support Team
Organizing the logistics and travel for the Evaluation Team | - | Ongoing | NIBP/UNDP Evaluation Support Team
Travel Days | 2 days | 31 Aug - 1 Sep | International Consultant
Receiving the International Consultant | - | 1 September | NIBP/UNDP Evaluation Support Team
Providing a Security Briefing for the Evaluation Team | - | 1 September | NIBP/UNDP Evaluation Support Team
In-country evaluation mission | 15 days | 2-16 September | Evaluation Team
Travel Days | 2 days | 17-18 September | International Consultant
Preparing the draft report | 5 days | 19-23 September | Evaluation Team
Submit Draft Report | - | 24 September | Evaluation Team
Stakeholder meeting and review of the draft report (for quality assurance) | - | 24 Sep - 6 October | NIBP/UNDP Evaluation Support Team
Incorporating comments and finalizing the evaluation report | 3 days | 8-10 October | Evaluation Team
Submit Final Report | - | 11 October | Evaluation Team
In addition, the Evaluators may be expected to support UNDP efforts in knowledge sharing and dissemination.
- Consultants are requested to submit a proposal to conduct the Mid-term Evaluation of NIBP. In-country flights will be covered by the project.
The evaluation team will be accountable for producing the following:
- Evaluation inception report—an inception report should be prepared by the evaluators before going into the full-fledged evaluation exercise. It should detail the evaluators’ understanding of what is being evaluated and why, showing how each evaluation question will be answered by way of: proposed methods; proposed sources of data; and data collection procedures. The inception report should include a proposed schedule of tasks, activities and deliverables, designating a team member with the lead responsibility for each task or product. The inception report provides the programme unit and the evaluators with an opportunity to verify that they share the same understanding about the evaluation and clarify any misunderstanding at the outset. See Evaluation Matrix below.
- Draft evaluation report — the programme unit and key stakeholders in the evaluation should review the draft evaluation report to ensure that the evaluation meets the required quality criteria.
- Final evaluation report: recommendations on future support to the counterparts, strategic partners and stakeholders for enhancing institutional capacity development at the national and sub-national levels to enable institutions to deliver services. To provide effective support to the line ministries covered, the areas of focus in the current phase should be evaluated, and areas of future support, with modified emphasis on the above-mentioned components, should be highlighted.
- Evaluation brief and other knowledge products, or participation in knowledge-sharing events, if relevant.
Table A. Sample Evaluation Matrix
Relevant evaluation criteria | Key Questions | Specific Sub-Questions | Data Sources | Data Collection Methods/Tools | Indicator / Success Standard | Methods of Data Analysis