Background

In line with the commitments given to the donors and partner agencies in the approved project document between UNDP and the Government of Bangladesh, the independent final evaluation of CDMP-II will assess the level of progress made towards the achievement of project impact, outcomes and outputs. In addition to meeting these formal requirements, it is best practice to undertake a thorough review to identify what to continue, what to improve and what should inform the design of new, similar projects.

The Final Evaluation (FE) will focus on the relevance, effectiveness, efficiency, results and sustainability of project implementation, as well as lessons learned about project design, implementation and management.

Duties and Responsibilities

Objective:

To lead the Final Evaluation (FE) of the Comprehensive Disaster Management Programme 2010-2015 (CDMP-II), with a specific focus on the impacts of the programme and management related issues.

For more detail regarding the Final Evaluation, please see the Final Evaluation Terms of Reference (ToR) given below.

Scope of Work:

  • Take responsibility for the evaluation questions and sub-questions falling under his/her area of expertise and responsibilities;
  • Manage data collection and data analysis falling under his/her area of expertise and responsibilities;
  • Assist with focus group discussions at all levels, as relevant to his/her area of expertise and responsibilities;
  • Conduct interviews at all levels, as relevant to his/her area of expertise and responsibilities;
  • Participate in the briefing and debriefing sessions;
  • Be responsible for report writing covering relevant areas of responsibility;
  • Assist the International Team Leader as directed.

Expected outputs / Deliverables:

The Knowledge Management/Communication Expert will be responsible for delivering:

  • Evaluation inception report - the inception report will detail the evaluators’ understanding of what is being assessed and why, showing how the review objectives will be met by way of: appraisal methods and techniques; sources of data (and an assessment of their quality); and data collection procedures. The inception report will include a proposed schedule of tasks, activities and deliverables, designating a team member with the lead responsibility for each task or product;
  • Draft evaluation report - the draft evaluation report will be reviewed by CDMP II project management, UNDP and MoDMR/DDM to ensure that the evaluation meets the required quality (as outlined in Annex I: CDMP II Final Evaluation Terms of Reference);
  • Final evaluation report - a comprehensive analytical evaluation report in English. The length of the report shall not exceed 40 pages in total (not including annexes);
  • Presentation of initial findings and de-brief - the evaluation team will present initial findings at a debriefing at the conclusion of the FE mission to relevant officials of the Government of Bangladesh, UNDP, project management and donor representatives.

Supervision and Performance Review:

The International Consultant (Knowledge Management and Communication Expert) will work under the direct supervision of the International Team Leader for the Final Evaluation of CDMP-II. The Consultant will deliver his/her outputs to the Assistant Country Director, Climate Change, Environment and Disaster (CCED), UNDP Bangladesh.

Competencies

Functional competencies

Professionalism:

  • Flexibility to make ad-hoc changes as and when the need arises;
  • Ability to perform under stress;
  • Willingness to keep flexible working hours.

Teamwork:

  • Ability to establish and maintain effective working relations as a team member, in a multi-cultural, multi-ethnic environment with sensitivity and respect;

Communication:

  • Excellent interpersonal and communication skills.

Corporate competencies

  • Displays sensitivity and adaptability with regard to culture, gender, religion, race, nationality and age;
  • Highest standards of integrity, discretion and loyalty.

Required Skills and Experience

Education:

  • At minimum, a postgraduate degree with a focus on Public Relations, Communications, Journalism, Business Administration, Social Sciences or another relevant discipline.

Experience:

  • Minimum of 7 years of relevant professional experience, especially in developing countries;
  • Sound knowledge and understanding of development, disaster management and risk reduction;
  • Sound understanding of knowledge management and experience with developing and implementing Practice Areas;
  • Superior ability to deliver strategic insight and analysis on communications and media strategies relevant to disaster management institutions;
  • Thorough understanding of key elements of results-based programme management;
  • Strong capacity in data collection and analysis, as well as report writing;
  • Demonstrated ability to work in a multicultural team;
  • Experience or knowledge of disaster management in South Asia and/or regional experience in the area of disaster management would be considered an advantage;
  • Sound knowledge and understanding of gender issues.

Language:

  • Fluency in English.

Evaluation of Candidates:

Individual consultants will be evaluated based on the following methodology:

Cumulative analysis: Candidates will be evaluated using the Cumulative Analysis method. Under this weighted scoring method, the award of the contract will be made to the individual consultant whose offer has been evaluated and determined as:

  • Responsive/compliant/acceptable; and
  • Having received the highest score out of a pre-determined set of weighted technical and financial criteria specific to the solicitation.

Only candidates obtaining a minimum of 70% of the marks (i.e., 49 out of 70 marks) in the technical evaluation will be considered for the financial evaluation.

Technical Evaluation Criteria (Total 70 marks):

  • Professional experience in the review of comprehensive development projects involving knowledge management, and experience with developing and implementing Practice Areas - 25 marks;
  • Understanding of results-based programme management and strong capacity for data collection and analysis, as well as report writing - 20 marks;
  • Experience or knowledge of disaster management in Bangladesh and/or regional experience focusing on knowledge management and communication - 15 marks;
  • Working experience with UN/UNDP, familiarity with UN/UNDP operations and relevant UN/UNDP policies - 10 marks.

Financial Evaluation (Total 30 marks):

Financial proposals from all technically qualified candidates will be scored out of 30 marks based on the formula provided below. The maximum marks (30) will be assigned to the lowest-priced financial proposal. All other proposals will receive points according to the following formula:

  • p = y (µ/z)

Where,

p = points for the financial proposal being evaluated;
y = maximum number of points for the financial proposal;
µ = price of the lowest priced proposal;
z = price of the proposal being evaluated.
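
For illustration only, the sketch below (in Python) applies the scoring scheme described above. The 49-mark technical threshold, the formula p = y (µ/z) and the 70/30 weighting are taken from this ToR; the candidate names, scores and prices are hypothetical.

    # Minimal sketch of the cumulative analysis scoring described in this ToR.
    # The formula p = y * (mu / z) and the 70/30 weighting come from the ToR;
    # the candidates, scores and prices below are hypothetical.

    Y_FINANCIAL = 30          # maximum points for the financial proposal (y)
    TECHNICAL_PASS_MARK = 49  # 70% of the 70 technical marks

    # Hypothetical candidates: name -> (technical score, proposed price in USD)
    candidates = {"A": (60, 20000), "B": (55, 16000), "C": (49, 25000)}

    # mu = price of the lowest priced proposal
    mu = min(price for _, price in candidates.values())

    for name, (technical, z) in candidates.items():
        if technical < TECHNICAL_PASS_MARK:
            continue  # not considered for the financial evaluation
        p = Y_FINANCIAL * (mu / z)  # p = y (mu / z)
        total = technical + p       # combined score out of 100
        print(f"{name}: technical={technical}, financial={p:.1f}, total={total:.1f}")

With these invented figures, candidate B submits the lowest price and therefore receives the full 30 financial marks, for a combined total of 85.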

Terms of Reference (ToR) for Final Evaluation of Comprehensive Disaster Management Programme 2010-2015 (CDMP- II):

Background and Context:

The second phase of the Comprehensive Disaster Management Programme 2010-2015 (CDMP II) aims to further reduce Bangladesh's vulnerability to adverse natural and anthropogenic hazards and extreme events, including the potentially devastating impacts of climate change. It is working towards this through risk management and mainstreaming. CDMP II is a natural expansion and a logical scaling up of the Project’s first phase (2004-2009), which laid the foundations for institutionalizing risk reduction approaches and frameworks developed through pilot testing.

CDMP-II aims to further institutionalize the adoption of risk reduction approaches, not only in its host ministry, the Ministry of Disaster Management and Relief (formerly the Ministry of Food and Disaster Management), but more broadly across key ministries and agencies.

The project has six outcomes:

The development of strong, well-managed and professional institutions in Bangladesh that are able to implement a comprehensive range of risk reduction programmes and interventions:

  • Primarily working with: Ministry of Disaster Management and Relief – Department of Disaster Management and 41 academic and training institutions;

Reduced risk to rural populations through structural and non-structural interventions, empowerment of rural communities and improved awareness of, and planning for, natural hazard events, including the likely impacts of climate change:

  • Primarily working with: District authorities, Upazila and Union Disaster Management Committees, NGOs and communities;

Reduced risk to urban populations through structural and non-structural interventions, improved awareness of natural hazard events and the piloting of urban community risk reduction methodologies that target the extreme poor:

  • Primarily working with: Municipal authorities, lifeline authorities, first-responding institutions, Disaster Management Committees, NGOs and communities;

Improved overall effectiveness and timeliness of disaster preparedness and response in Bangladesh by strengthening management capacity and coordination, as well as networking facilities at all levels:

  • Primarily working with: Ministry of Disaster Management and Relief – Department of Disaster Management, Fire Service and Civil Defense, Disaster Management Committees, CPP and NGOs;

Improved and broadened disaster-proofing of development funding in eleven government ministries by generating increased awareness of hazard risks and providing technical information, advisory services and resources to stimulate positive changes in planning and investment decisions over the long-term:

  • Primarily working with: National Curriculum and Textbook Board, Flood Forecasting and Warning Center, Bangladesh Meteorological Department, Department of Agricultural Extension, Department of Livestock Services, Department of Fisheries, Department of Environment, Fire Service and Civil Defense, Geological Survey of Bangladesh, Department of Public Health Engineering, Department of Women Affairs, Directorate General of Health Services and the Ministry of Land;

Community-level adaptation to disaster risks from a changing climate is effectively managed:

  • Primarily working with: Planning Commission and Ministry of Environment and Forest.

CDMP-II seeks to extend and expand the risk reduction achievements of CDMP I through a multi-hazard approach to disasters, including climate change risk management, to create and nurture the crucial paradigm shift in disaster management, away from relief and rehabilitation and towards a more holistic approach to reducing risks and vulnerabilities. The programme aims to generate participatory community assessments, planning and implementation of risk reduction projects in more than 2,000 unions across the country.

CDMP-II represents the Government of Bangladesh’s efforts to integrate disaster risk reduction and climate change adaptation measures across the national development agenda. This USD 76 million project is supported by six development partners (DFID, EU, Norway, Sida, AusAID and UNDP) and is nationally implemented by the Ministry of Disaster Management and Relief.

The project was planned for a five-year implementation period between 2010 and 2014, but has been granted a one-year no-cost extension and will now end in December 2015. The extension was granted to enable (a) full use of remaining resources and (b) enhanced sustainability of key results and a better handover to the Ministry of Disaster Management and Relief.

Evaluation Purpose:

In line with the commitments given to the donors and partner agencies in the approved project document between UNDP and the Government of Bangladesh, the independent final evaluation of the project will assess the level of progress made towards the achievement of project impact, outcomes and outputs. In addition to meeting these formal requirements, it is best practice to undertake a thorough review to identify what to continue, what to improve and what should inform the design of new, similar projects.

The Final Evaluation (FE) will focus on the relevance, effectiveness, efficiency, results and sustainability of project implementation, as well as lessons learned about project design, implementation and management.

The evaluation is scheduled for early 2015, prior to the project end date. The results of the evaluation are intended to help identify the need for any further technical assistance to support the strengthening of resilience in Bangladesh. The evaluation should also help to understand the relative merits of different implementation options and thematic approaches.

Evaluation Scope and Objectives:

The FE will cover all aspects of the Project’s work, both at the policy and institutional level and in the field, since the inception of CDMP Phase II. Specifically, the FE will cover:

  • Relevance – the extent to which the project is contributing to local and national development priorities and organizational policies viz. disaster risk reduction, building resilience and supporting climate change adaptation;
  • Effectiveness – the extent to which progress towards project objectives has been achieved, or how likely it is to be achieved over the remaining implementation period;
  • Efficiency – the extent to which best value has been obtained (results delivered versus resources expended), including the efficiency of financial management and procurement systems;
  • Results and outcomes – the positive and negative, and foreseen and unforeseen, changes and effects driven by project-supported interventions. Results include direct project outputs, short- to medium-term outcomes, and longer-term impact including national benefits, replication effects and other local effects;
  • Sustainability – the likely ability of project-supported interventions to continue to deliver benefits for an extended period of time after completion, in the face of continuing climate pressures. Projects need to be environmentally, as well as financially and socially, sustainable.

The FE will assess how the Project has engaged with the government in terms of policy-level impacts and how these translate into improved implementation and service delivery at the local government level. Similarly, the FE will assess the Project’s work with civil society and vulnerable segments of society such as socio-economically marginalized groups, women and people with disabilities. It will also explore how the Project has promoted community participation, ownership, sustainability and empowerment.

In addition, the FE must examine whether and how the Project strengthened the application of a rights-based approach and mainstreaming of gender in development efforts.

Evaluation Questions:

The FE team, through an Inception Report, will set out review questions within the agreed methodology framework. The questions, broadly, will cover:

Strategic orientation

  • Assess the effectiveness of implementation strategies for the different programme outcome areas;
  • Evaluate the relevance of CDMP II in the context of current national priorities in the disaster management sector, in line with the HFA and the National Plan for Disaster Management (2010 – 2015);
  • Examine the level of integration or comprehensiveness of the programme outcome areas;
  • Assess the trade-offs between investment in policy change and investment in training.

Implementation performance

  • Assess progress against specified outputs, distinguishing the Project’s contribution from other, unrelated factors;
  • Assess the perception of communities, beneficiaries and key stakeholders on the direct and indirect benefits derived from CDMP II to date;
  • Assess the outcome-wise allocation of funds and their efficient utilization for project activities with focus on value for money for the results generated;
  • Assess specific information on what has been achieved in the field with funds for small schemes;
  • Assess the financial management and procurement procedures of the programme, how they are used to achieve value for money, and how allegations of corruption and misuse of funds were addressed;
  • Analyze the organizational and institutional factors that influenced CDMP II’s performance;
  • Assess CDMP II’s monitoring and review strategy: how the NGOs and the Local Authorities (LAs) were selected, how monitoring was carried out, and how it improved the process and the results/impacts;
  • Assess the technical support available, internally and externally, in support of the Project’s implementation. Analyze the composition of the CDMP II team (whether it is optimal or not) as well as its effectiveness in carrying out project activities collectively as a comprehensive team;
  • Assess the extent to which systems, processes and capacities are being embedded into government with a view to eventual handover of the programme;
  • Examine the relationship between CDMP-II and MoDMR/DDM and other institutional partners;
  • Assess the effectiveness of small-scale interventions versus large-scale interventions (e.g., DRH);
  • Assess whether UNDP’s partnership strategy has been appropriate and effective;
  • Identify the factors that contributed to its effectiveness or ineffectiveness.

Lessons learned

  • Identify innovative approaches and their adoption for replication;
  • Identify approaches that failed to achieve the desired results and document the reasons for this failure;
  • Identify best practices.

Sustainability

  • Assess the sustainability of results in light of the current policy and programmatic thrust of the Government of Bangladesh;
  • Review ongoing activities and the likelihood that they will be sustained after the project has concluded;
  • Assess the degree of ownership at national, district, upazila and union levels.

Partnership strategy

  • Assess the effectiveness of GoB as the implementing agency, including the role and capacity of local government bodies;
  • Assess relevance, quality and results of partnerships with various I/NGOs, institutions and agencies.

Knowledge Management and Communication

  • Assess how far knowledge management and networking aspects have been pursued;
  • Review the effectiveness of the communication and visibility strategy;
  • Assess to what extent awareness levels of communities and other stakeholders on disaster risk reduction and climate change adaptation have been increased.

The final formulation of the review questions will be agreed in consultation with the review team. Additional considerations may be added at the discretion of UNDP.

Methodology:

The FE methodology will cover three areas:

  • Review of key documents;
  • Interactions/ interviews with key stakeholders including GoB officials, CDMP II staff, donors, implementing partners, beneficiaries and other relevant informants;
  • Field based observations, including visits to implementing Ministries, Departments, Disaster Management Committees (DMCs) and beneficiary communities.

In both Dhaka and the field, all partners will be given the opportunity to speak with the FE team in the absence of CDMP II staff. In the field, community members will be given the same opportunity. The confidentiality of respondents should be respected throughout; in the field, this implies that interviews take place in the absence of implementing NGO staff.

Evaluation Products (deliverables):

The FE team will be accountable for producing:

Inception report

The inception report will detail the reviewers’ understanding of what is being reviewed and why, showing how the review objectives will be met by way of: appraisal methods and techniques; sources of data (and an assessment of their quality); and data collection procedures. The inception report will include a proposed schedule of tasks, activities and deliverables, designating a team member with the lead responsibility for each task or product.

Draft report

The draft FE report will be submitted to UNDP, the CDMP II programme management team and donors for comment and to ensure that the evaluation meets the required quality.

Final report

A comprehensive analytical report in English. The length of the report shall not exceed 30 pages in total (not including annexes).

Presentation of initial findings and de-brief

The FE team will present initial findings at a debriefing at the conclusion of the FE mission to relevant officials of the Government of Bangladesh, project management and donor representatives.

Evaluation Ethics:

The UNEG ‘Ethical Guidelines for Evaluation’ can be consulted as a reference document in drafting the review principles.

Evaluation Team Composition and Required Competencies:

The FE team will comprise a mix of international and national experts:

  • Team Leader (disaster risk management focus - international);
  • Community Risk Management Expert (community-level mitigation / adaptation focus - international);
  • Knowledge Management/Communication Expert (behavioral change focus – international);
  • Disaster Preparedness Expert (early warning / ICT focus - national);
  • Institutional Development/Governance Expert (DRM policy/programme focus – national).

Implementation Arrangements:

CDMP-II will designate a focal point to support meetings and logistical arrangements.

The overall focal point for the assessment will be the UNDP Assistant Country Director (Climate Change, Environment and Disaster), who will ensure consultation with programme donors and GoB focal points.

Time-Frame for the Evaluation Process:

Desk review - finalizing the FE design and methods and preparing a detailed inception report (7 working days):

  • Deliverable - Mission briefing and Inception report.

In-country review mission - interviews, field visits and preparation of the draft report (14 days):

  • Deliverable - Draft report and Mission debrief.

Presentation of initial findings and debrief - presentation of initial findings to UNDP, government partners, donors and project staff (2 days):

  • Deliverable - Presentation.

Review of draft report (for quality assurance) – incorporating comments and finalizing the review report (7 days):

  • Deliverable - Final Report.

UNDP Report Template & Quality Standards:

http://stone.undp.org/undpweb/eo/evalnet/Handbook2/documents/english/pme-handbook.pdf.

This review report template is intended to serve as a guide for preparing meaningful, useful and credible review reports that meet quality standards. It does not prescribe a definitive section-by-section format that all review reports should follow. Rather, it suggests the content that should be included in a quality review/evaluation report. The descriptions that follow are derived from the UNEG ‘Standards for Evaluation in the UN System’ and ‘Ethical Guidelines for Evaluation’.

The FE report should be complete and logically organized. It should be written clearly and be understandable to the intended audience. In a country context, the report should be translated into local languages whenever possible (see Chapter 8 for more information). The report should also include the following:

Title and opening pages - Should provide the following basic information:

  • Name of the final evaluation intervention;
  • Time-frame of the FE and date of the report;
  • Countries of the review intervention;
  • Names and organizations of evaluators;
  • Name of the organization commissioning the review;
  • Acknowledgements.

Table of contents - Should always include boxes, figures, tables and annexes with page references.

List of acronyms and abbreviations

Executive summary - A stand-alone section of two to three pages that should:

  • Briefly describe the intervention of the evaluation (the project(s), programme(s), policies or other intervention) that was evaluated;
  • Explain the purpose and objectives of the evaluation, including the audience for the evaluation and the intended uses;
  • Describe key aspects of the evaluation approach and methods;
  • Summarize principal findings, conclusions and recommendations.

Introduction - Should:

  • Explain why the evaluation was conducted (the purpose), why the intervention is being evaluated at this point in time, and why it addressed the questions it did;
  • Identify the primary audience or users of the evaluation, what they wanted to learn from the evaluation and why, and how they are expected to use the final evaluation results;
  • Identify the intervention of the evaluation (the project(s), programme(s), policies, or other intervention - see the upcoming section on the description of the intervention);
  • Acquaint the reader with the structure and contents of the report and how the information contained in the report will meet the purposes of the evaluation and satisfy the information needs of the report’s intended users.

Description of the intervention - Provides the basis for report users to understand the logic and assess the merits of the final evaluation methodology, and to understand the applicability of the evaluation results. The description needs to provide sufficient detail for the report user to derive meaning from the evaluation. The description should:

  • Describe what is being evaluated, who seeks to benefit, and the problem or issue it seeks to address;
  • Explain the expected results map or results framework, implementation strategies, and the key assumptions underlying the strategy;
  • Link the intervention to national priorities, UNDAF priorities, HFA and NPDM, corporate multi-year funding frameworks or strategic plan goals, or other programme or country specific plans and goals;
  • Examine how the mid-term evaluation recommendations were incorporated to improve/ modify the implementation process and programme design;
  • Identify the phase in the implementation of the intervention and any significant changes (e.g., plans, strategies, logical frameworks) that have occurred over time, and explain the implications of those changes for the review;
  • Identify and describe the key partners involved in the implementation and their roles;
  • Describe the scale of the intervention, such as the number of components (e.g., phases of a project) and the size of the target population for each component;
  • Indicate the total resources, including human resources and budgets;
  • Describe the context of the social, political, economic and institutional factors, and the geographical landscape within which the intervention operates and explain the effects (challenges and opportunities) those factors present for its implementation and outcomes;
  • Point out design weaknesses (e.g., intervention logic) or other implementation constraints (e.g., resource limitations).

Evaluation scope and objectives - The report should provide a clear explanation of the evaluation’s scope, primary objectives and main questions.

  • Final evaluation scope - The report should define the parameters of the final evaluation, for example, the time period, the segments of the target population included, the geographic area included, and which components, outputs or outcomes were and were not assessed.
  • Evaluation objectives - The report should spell out the types of decisions review users will make.
  • Evaluation criteria - The report should define the evaluation criteria or performance standards used. The report should explain the rationale for selecting the particular criteria used in the final evaluation.
  • Evaluation questions - Evaluation questions define the information that the evaluation will generate. The report should detail the main evaluation questions addressed by the review and explain how the answers to these questions address the information needs of users.

Evaluation approach and methods - The evaluation report should describe in detail the selected methodological approaches, methods and analysis; the rationale for their selection; and how, within the constraints of time and money, the approaches and methods employed yielded data that helped answer the evaluation/ review questions and achieved the final evaluation purposes. The description should help the report users judge the merits of the methods used in the evaluation and the credibility of the findings, conclusions and recommendations.

The description on methodology should include discussion of each of the following:

  • Data sources - The sources of information (documents reviewed and stakeholders), the rationale for their selection and how the information obtained addressed the evaluation questions;
  • Sample and sampling frame - If a sample was used: the sample size and characteristics; the sample selection criteria (e.g., single women, under 45); the process for selecting the sample (e.g., random, purposive); if applicable, how comparison and treatment groups were assigned; and the extent to which the sample is representative of the entire target population, including discussion of the limitations of the sample for generalizing results;
  • Data collection procedures and instruments - Methods or procedures used to collect data, including discussion of data collection instruments (e.g., interview protocols), their appropriateness for the data source, and evidence of their reliability and validity;
  • Performance standards - The standard or measure that will be used to evaluate performance relative to the evaluation questions (e.g., national or regional indicators, rating scales);
  • Stakeholder participation - Stakeholders’ participation in the evaluation and how the level of involvement contributed to the credibility of the evaluation and the results;
  • Ethical considerations - The measures taken to protect the rights and confidentiality of informants (see UNEG ‘Ethical Guidelines for Evaluators’ for more information);
  • Background information on evaluators - The composition of the evaluation team, the background and skills of team members, and the appropriateness of the technical skill mix, gender balance and geographical representation for the evaluation;
  • Major limitations of the methodology - Major limitations of the methodology should be identified and openly discussed as to their implications for evaluation, as well as steps taken to mitigate those limitations.

Data Analysis - The report should describe the procedures used to analyse the data collected to answer the evaluation questions. It should detail the various steps and stages of analysis that were carried out, including the steps to confirm the accuracy of data and the results. The report also should discuss the appropriateness of the analyses to the evaluation questions. Potential weaknesses in the data analysis and gaps or limitations of the data should be discussed, including their possible influence on the way findings may be interpreted and conclusions drawn.

Findings and Conclusions - The report should present the evaluation findings based on the analysis and conclusions drawn from the findings.

  • Findings - Should be presented as statements of fact that are based on analysis of the data. They should be structured around the evaluation questions so that report users can readily make the connection between what was asked and what was found. Variances between planned and actual results should be explained, as well as factors affecting the achievement of intended results. Assumptions or risks in the project or programme design that subsequently affected implementation should be discussed;
  • Conclusions - Should be comprehensive and balanced, and highlight the strengths, weaknesses and outcomes of the intervention. They should be well substantiated by the evidence and logically connected to evaluation findings. They should respond to key review questions and provide insights into the identification of and/or solutions to important problems or issues pertinent to the decision-making of intended users.

Recommendations - The report should provide practical, feasible recommendations directed to the intended users of the report about what actions to take or decisions to make. The recommendations should be specifically supported by the evidence and linked to the findings and conclusions around key questions addressed by the evaluation. They should address the sustainability of the initiative and its ownership. Recommendations should also provide specific advice for future or similar projects or programming. A management response template should be included in the evaluation.

Lessons Learnt - As appropriate, the report should include discussion of lessons learnt from the evaluation, that is, new knowledge gained from the particular circumstance (intervention, context, outcomes, even review methods) that is applicable to a similar context. Lessons should be concise and based on specific evidence presented in the report.

Report Annexes - Suggested annexes should include the following to provide the report user with supplemental background and methodological details that enhance the credibility of the report:

  • ToR for the evaluation;
  • Additional methodology-related documentation, such as the evaluation matrix and data collection instruments (questionnaires, interview guides, observation protocols, etc.) as appropriate;
  • List of individuals or groups interviewed or consulted and sites visited;
  • List of supporting documents reviewed;
  • Project or programme results map or results framework;
  • Summary tables of findings, such as tables displaying progress towards outputs, targets, and goals relative to established indicators;
  • Short biographies of the evaluators and justification of team composition;
  • Code of conduct signed by evaluators;
  • UNEG, ‘Standards for Evaluation in the UN System’, 2005, available at: http://www.unevaluation.org/unegstandards;
  • UNEG, ‘Ethical Guidelines for Evaluation’, June 2008, available at http://www.uneval.org/search/index.jsp?q=ethical+guidelines;
  • The review criteria most commonly applied to UNDP reviews are the OECD-DAC (Development Assistance Committee) criteria of relevance, efficiency, effectiveness, and sustainability;
  • All aspects of the described methodology need to receive full treatment in the report. Some of the more detailed technical information may be contained in annexes to the report. See Chapter 8 for more guidance on methodology;
  • A summary matrix displaying, for each of the evaluation questions, the data sources, the data collection tools or methods for each data source, and the standard or measure by which each question was evaluated is a good illustrative tool to simplify the logic of the methodology for the report reader.
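
For illustration, one row of such a matrix might look as follows (the question and entries are invented examples, not drawn from CDMP II):

    Evaluation question: To what extent have community awareness levels on disaster risk reduction increased?
    Data sources: community members, Disaster Management Committee records, CDMP II progress reports.
    Data collection tools/methods: household survey, focus group discussions.
    Standard/measure: change against baseline awareness indicators.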