Background

Identifying, validating and documenting lessons learned has always been a necessary part of organizational learning for UNDP, and learning constructively from past mistakes and experiences is a critical part of Managing for Development Results (MfDR) and the UNDP accountability framework. These processes help inform quality programming and allow the organization to test the relevance of its theories of change at the project, Country Programme and Corporate (Strategic Plan) levels. In 2014, UNDP’s Executive Board encouraged the organization to “further analyze the data behind the results using information provided by relevant analytical tools of the organization, including, inter alia, evaluations and audits, to understand what drives or hinders performance and progress, and adjust programmes accordingly”.

Among UNDP’s various analytical tools, the Results Oriented Annual Report (ROAR), based on the principles of results-based management and intended to help units measure and monitor their programmes and the difference they are making for sustained, transformational change, has always been a critical exercise for Country Programme level lesson learning and annual stock-taking. Likewise, evaluations are a critical source of lesson learning at the Project, Country Programme and Corporate levels, and the organization has placed considerable emphasis over the past years on increasing the quality and utility of evaluations at the Country level.

Duties and Responsibilities

Under the overall supervision of the Development Impact Group Chief, the consultant will undertake a qualitative analysis of UNDP’s ROAR and evaluations from 2014 and 2015, in order to produce an analysis of lessons learned at the Country Office level in the first 2 years of UNDP’s Strategic Plan (2014-2015).

The key tasks for the consultancy are listed below:

  • Undertake a qualitative analysis of Country Office and Regional ROARs for 2014 and 2015 and relevant evaluations published in 2014 and 2015 in order to identify lessons learned on the way Country Offices are achieving Development Results (Country Programme results and IRRF results). Primarily, the study will cover the issue of what worked well or did not work well for UNDP over the first 2 years of the Strategic Plan (2014-2015) from a substantive and institutional point of view. The analysis should help answer the questions “what has worked”, “what hasn’t worked” and “why”, both from a development result and an organizational point of view, and clarify what the organization should do differently to achieve better results. The lessons have to be produced (distilled or extracted) from the various experiences reported by Country Offices in the ROAR (in 2014 and 2015), or from evaluations (2014 and 2015), to help increase knowledge and understanding of UNDP’s development and institutional approaches, choices, and capacities;
  • Produce a lessons learned report that will (1) inform the 2014-2017 Strategic Plan Midterm Review (MTR) and (2) be of use to Regional Bureaux and Country Offices to learn from UNDP’s evidence base over the past 2 years;
  • Provide key recommendations on the adequacy of the ROAR as a lesson generating tool for UNDP, and potential solutions to categorize lessons learned for the organization.

Timeline

The consultancy will be sequenced into 2 phases, in order to look at 2 different data sets:

  • Phase 1: November-December 2015 (30 working days, including 10 days based in NY): 2014 data for the ROAR, and decentralized and independent evaluations for 2014 (already available);
  • Phase 2: January-March 2016 (40 working days, including 10 days based in NY): 2015 ROAR data and evaluation data (available after January 15th, 2016).

Key deliverables will include the following:

Deliverable 1 - Inception report for the 2014 data set with the following elements (15 pages):

  • Description of the review methodology;
  • Adjusted review questions;
  • Clear work plan for completion of the review;

Indicative delivery date: November 25th, 2015.

Deliverable 2 - 2014 lessons learned analytical report

  • (Up to 30 pages + annexes as needed) with key lessons learned, including issues, best practices, areas for improvement, conclusions and recommendations.

Indicative delivery date: December 21st, 2015.

Deliverable 3 - 2015 lessons learned analytical report

  • (Up to 30 pages + annexes as needed) with key lessons learned, including issues, best practices, areas of improvement, conclusions and recommendations.

Indicative delivery date: February 24th, 2016.

Deliverable 4 - A final analytical report

  • (Up to 30 pages, including an Executive Summary of 1-2 pages) that will present the key lessons learned from the first 2 years of the Strategic Plan, with a specific highlight on the entry questions (cf. Scope section).

Indicative delivery date: March 5th, 2016.

Deliverable 5 – Two recommendations briefs:

  • Conclusions on the ROAR as a reporting tool for lessons learned and recommendations for improvement (2 pages)
  • Concise recommendations on how to best categorize lessons learned for UNDP, which will potentially inform the ongoing development of UNDP-wide lessons learned database (2 pages).

Indicative delivery date: March 5th, 2016.

Methodology & Scope

This consultancy will be undertaken in parallel with other consultancies commissioned for the Mid-term Review of the Strategic Plan, and the consultant is expected to consult on a regular basis with the MTR team. The review is intended to impartially and systematically provide clear findings and conclusions from the analysis of lessons learned reported by Country Offices and Regional Bureaux through 2 sources of data:

  • ROARs 2014 and 2015;
  • Independent and decentralized evaluations published in 2014 and 2015.

The consultant will start with a separate synthesis of lessons from the ROARs and from the evaluations as a first step, and then undertake correlational analysis where feasible.

The consultancy will look at lessons learned from the perspective of which factors accelerated or impeded the achievement of development results at the Country level, and across the organization. The findings should be drawn from the analysis of Development and Institutional factors driving overall development performance.

Lessons learned include capturing best practices/good practices for success in managing for development results, as well as reflections on failures in doing the same, and recognition of causation/associated factors. The lessons captured should be those that lend themselves to concrete actions that triggered or will trigger a change in future practice. The lessons learned identified should be the outcome of a learning process, which involves reflecting upon the experience. Neither an accumulation of “facts” or “findings” nor an accumulation of evaluations will, by itself, yield lessons.

The data will be analyzed by asking the following questions:

The initial unit of analysis will be the Country Office:

On the basis of Country offices’ self-assessment of performance:

  • What worked well for Country Programmes which delivered expected results? What factors supported this success?
  • What did not work well for Country Programmes which did not deliver results as expected, and what specific challenges arose (anticipated or unanticipated) that impeded good performance?
  • What could have/needs to be done differently or better?
  • What changes need to be made in order to achieve better results in the future?

Then based on feasibility / credibility of data, the unit of analysis will be the SP outcomes:

  • Can trends or patterns by SP outcome/region/country typology/utility be discerned?
  • What did not work well to achieve the SP outcomes and outputs?

Sources of data:

ROAR as a source of data for lessons learned:

Drawing from the 2014 EB decision, the ROAR went through a major redesign in 2014, aimed at shifting towards a more reflective exercise for Country Offices (COs). The ROAR was revised to become an improved lesson generating tool which would consider evidence around the factors facilitating and hindering progress and, from that analysis, draw lessons which would inform future actions. One of the key changes in 2014 was therefore to enable Country Offices to draw key lessons from their performance self-assessment (good and/or weak performance), and to reflect on and learn from not only development results but also organizational performance results. The revised ROAR was designed so that the findings from this self-assessment feed directly into the COs’ Integrated Work Plans of the following year, thus becoming a lesson learning tool for improved programming.

The data captured through the ROAR is mixed in nature; the consultant will therefore have to refer to mixed sources of self-reported and independent data:

  • Independent data (evaluations and externally-produced indicators from national systems and/or international data sources);
  • Quantitative and qualitative data deriving from corporate planning frameworks such as CPDs at the Country level and the IRRF at the Corporate level;
  • Other types of data from additional corporate tools such as the Integrated Work Plan and Atlas (gender marker and financial data linking resources to results);
  • Self-reported data.

Evaluations as a source of data for lessons learned:

Evaluations are a critical source of data for lesson learning for the organization. In order to draw lessons from the Country level, this consultancy will focus on UNDP’s Independent and Decentralized Evaluation data sets:

  • On independent evaluations, the consultant will distill lessons from all 2014 and 2015 ADRs, thematic and global evaluations, and also use these as a triangulation source for specific SP outcomes as relevant;
  • On decentralized evaluations, lessons learned will be distilled from project- and outcome-level evaluations from 2014 and 2015.

Categorization of lessons learned

While looking at lessons learned, the consultant will have to ensure that only certain types of lessons learned are counted for UNDP, and the following criteria will be applied to narrow down the analysis. The consultant may propose new or revised criteria based on the findings of the analysis:

  • Cross-section distillation of lessons per SP outcome and by region (for example: a simple analysis of which lessons were generated on gender or on environment, say with GEF projects);
  • Thematic reflection for cross-cutting themes (gender, SS, innovation), using the tagging done for the AR 2014;
  • Trends or patterns by SP outcome/region/country typology/utility;
  • Reflections that test a given theory of change for a theme and/or provide an evidential basis for improving and adapting it (the theories of change of the Strategic Plan outcomes will be used as entry points);

  • Reflections that can be correlated with a specific utility, such as:

  • Strengthened dialogue with partners to inform more responsive new programming;
  • Partnership Strategy Amended, and /or new partnerships built;
  • More gender-responsive and human rights based programming;
  • Strengthened results based management systems including improved use of indicators and evidence;
  • Prioritized and more focused programme;
  • Strengthened programme synergies;
  • Strengthened programmatic approach and long-term and exit strategies to capacity development;
  • Leveraging and up-scaling UNDP’s interventions to programmatic areas of comparative advantage.

  • Reflections that capture both the positive and negative externalities (“intended” or “unintended” results) of a given theme (as it applies to its respective theory of change). An example of a positive externality is the strengthening/mobilizing of a women’s movement as a consequence of work in the governance arena on elections.

Based on findings, the consultant will be expected to refine the analytical framework and propose an aggregate focus for the lessons learned.

MTR focus

The conclusions and findings should inform the 2016 SP MTR. The analysis of performance in 2014-2015 will be informed by this analysis of lessons learned, which will help answer the following questions:

  • What factors contribute to high performance?
  • What factors may cause low performance?
  • Where does UNDP’s comparative advantage lie among the factors of good performance and success?

Institutional Arrangements

  • The assessment is partly home based, partly based in the BPPS office in New York City. It will include a combination of face-to-face meetings, and analytical work;
  • The Consultant will report directly to, and seek approval of outputs from, the Development Impact Group Chief, in close collaboration with the Executive Office’s team;
  • The Consultant will work closely with the members of the Performance Monitoring & Analytics team and the Executive Office team as needed. S/he will also work closely with other consultants involved in MTR-related analytical work done in parallel;
  • If needed, s/he will also liaise with regional bureaux and UNDP country offices;
  • The Consultant will be given access to relevant information necessary for execution of the tasks under this assignment, including the raw data sets in Excel format for all sources of information mentioned above. S/he will have access to the ROAR online system and the Corporate Planning System, along with lessons learned reports, annual results reports, PowerPoint presentations developed for communication purposes, and any other documentation;
  • The Consultant will be responsible for providing her/his own laptop;
  • Payments will be made upon submission of a detailed time sheet and certification of payment form, and approval and confirmation by the Chief of the Development Impact Group, BPPS on days worked and outputs delivered.

Travel

  • No travel is expected for the consultant. Should there be a need for travel, all mission-related travel expenses will be covered by the project travel funds and arranged for by UNDP in accordance with UNDP entitlements and travel policy. Costs for official mission travel should not be included in the financial proposal;
  • Travel and expenses to join the duty station in New York should be included in the financial proposal, if applicable. Please refer to the section below on application procedures.

Competencies

Corporate Competencies

  • Ability to work effectively as part of a team;
  • Demonstrates integrity by modeling the UN’s values and ethical standards;
  • Promotes the vision, mission, and strategic goals of UNDP, and partner organizations;
  • Ability to work in a multicultural environment;
  • Sound judgment and discretion;

Functional Competencies

  • Excellent command of methods for content analysis of large bodies of qualitative information;
  • Outstanding knowledge of qualitative data analysis methods;
  • Demonstrates excellent ability to prioritize and focus research questions to produce relevant and useful recommendations;
  • Objectivity and ability to analyze large multi-country datasets in a short period;
  • Ability to handle highly complex datasets and drive forward multiple workstreams;
  • Excellent written and verbal communication skills, particularly to summarize complex findings and convey complex achievements in clear results language;
  • Produces timely, quality outputs;
  • Strong skills in data presentation;
  • Excellent understanding of international development issues and knowledge of the UN system;
  • Fully acquainted with UNDP’s (or a related organization) business model, RBM practices, and planning/reporting frameworks;
  • Solid understanding of strategic planning and corporate results reporting both at the macro and meso levels;
  • Advanced Microsoft Excel skills.

Management and Leadership

  • Demonstrates written communication skills;
  • Focuses on impact and results for the client and responds positively to feedback;
  • Consistently approaches work with energy and a positive, constructive attitude;
  • Remains calm, in control, and composed even under pressure;
  • Demonstrates openness to change and ability to manage complexities.

Required Skills and Experience

Education:

  • Master’s Degree or equivalent in Social Sciences, Economics, International Development or a related field;
  • Practical training in survey and research analysis, or applied statistical analysis.

Professional Experience:

  • A minimum of 10 years of relevant professional work experience in areas such as qualitative research or evaluation research, with knowledge of innovative methods in social science analytics;
  • Proven experience in conducting development data reviews of both qualitative and quantitative datasets, and a proven record delivering professional results;
  • Experience in social science analysis, specifically qualitative data analysis, is an asset;
  • Previous experience determining and analyzing lessons learned and conducting similar qualitative analysis work;
  • Proven experience working with the UN system, with a track record of written deliverables;
  • Experience working collaboratively in small teams with tight deadlines;
  • Experience living and working in developing countries, particularly in the environment and/or international development fields, is a strong asset.

Languages:

  • Excellent oral and written communication and presentation skills in English are required;
  • A good working knowledge of French and Spanish is required (the data come in all three languages).

Scope of Price Proposal and Schedule of Payments:

The price offer should indicate a total lump sum amount, based on an all-inclusive daily professional fee in US dollars. Payments will be linked to deliverables. The contract price is fixed regardless of changes in the cost of components.

Application procedures:

The application is a two-step process (occurring in parallel). Failure to comply with the submission process may result in disqualification of the application:

Step 1: Submission of technical proposal:

Technical Criteria weight: 70%.

Interested individuals must submit the following documents/information in UNDP Job Shop to demonstrate their qualifications (please note that only 1 (one) file can be uploaded, therefore please include all documents in one file):

  • CV or Personal History Form (P11), indicating all past experience from similar projects, as well as the contact details (email and telephone number) of the candidate and at least three (3) professional references (the template can be downloaded from this link: http://sas.undp.org/Documents/P11_Personal_history_form.doc);
  • Technical Proposal (maximum 5 pages): a brief description and justification of the methodological approach proposed to conduct the analytical tasks required for this assignment; a demonstration of how the applicant’s qualifications and experience will enable him/her to successfully deliver against the requirements of this assignment within the required timeframe; and an indicative work plan including timings;
  • Two relevant examples of similar analytical work published (applicants may submit a web link or a copy of the report together with their application to UNDP job-shop, ensuring it enables clear identification of the relevant piece of work).

Step 2. Submission of Financial Proposal

Financial Criteria weight: 30%.

Applicants are instructed to submit their financial proposals in US Dollars for this consultancy to bpps.procurement@undp.org, using the financial proposal template available here: http://procurement-notices.undp.org/view_file.cfm?doc_id=45780. The proposals should be sent via email with the following subject heading: “Financial Proposal for Lessons Learned Analysis Consultant" by the deadline for this vacancy. Proposals to be received after the deadline may be rejected. In order to assist the requesting unit in the comparison of financial proposals, the financial proposal should be all-inclusive and include a breakdown. The term ‘all-inclusive’ implies that all costs (professional fees, non-mission travel related expenses, communications, utilities, consumables, insurance, etc.) that could possibly be incurred by the Contractor are already factored into the financial proposal.

Criteria for Selection of the Best Offer:

Only those candidates who meet the minimum level of education and relevant years of experience requirements will be considered for the technical evaluation. The technical evaluation will include a desk review and may also include interviews with shortlisted candidates.

Method: the cumulative analysis method will be used to evaluate proposals. Under this combined scoring method, the qualifications and methodology are weighted at a maximum of 70% and combined with the price offer, which is weighted at a maximum of 30%.

When using this weighted scoring method, the award of the contract will be made to the individual consultant whose offer has been evaluated and determined as:

  • Responsive/compliant/acceptable; and
  • Having received the highest score out of a pre-determined set of weighted technical and financial criteria specific to the solicitation.

Only candidates meeting the language requirements and obtaining a minimum of 49 points (70%) on the technical part will be considered for the Financial Evaluation.

Criteria for technical evaluation [70 points]:

  • Suitability of the proposed work plan and input days to deliver the required deliverables according to the requested timeline [20 pts];
  • Suitability of the proposed methodological approach [20 pts];
  • Previous experience determining and analyzing lessons learned and conducting similar qualitative analysis work on complex/multi-country datasets and large bodies of qualitative information [10 points];
  • A minimum of 10 years of relevant professional work experience in areas such as qualitative research or evaluation research, with knowledge of innovative methods in social science analytics [10 points];
  • Outstanding knowledge of qualitative data analysis methods [10 points].

Criteria for financial evaluation (30 points maximum)

The following formula will be used to evaluate financial proposal:

p = y (µ/z), where:

  • p = points for the financial proposal being evaluated;
  • y = maximum number of points for the financial proposal;
  • µ = price of the lowest priced proposal;
  • z = price of the proposal being evaluated.
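For illustration, the financial formula and the 70/30 combination can be sketched in a few lines of Python; the prices and the technical score below are hypothetical and do not come from this solicitation.

```python
# Sketch of the evaluation arithmetic described above, using hypothetical
# figures: the prices and the technical score are illustrative only.

def financial_score(y, lowest_price, price):
    """p = y * (mu / z): y is the maximum number of financial points,
    mu the price of the lowest priced proposal, and z the price of the
    proposal being evaluated."""
    return y * (lowest_price / price)

# Suppose the lowest-priced offer is USD 50,000 and the proposal being
# evaluated is USD 60,000, with y = 30 points available:
p = financial_score(30, 50_000, 60_000)          # 25.0 points
# The lowest-priced proposal itself receives the full 30 points:
assert financial_score(30, 50_000, 50_000) == 30

# Combined score: technical points (max 70, minimum 49 to qualify for the
# financial evaluation) plus financial points (max 30).
technical = 56                                   # hypothetical technical score
combined = technical + p                         # ~81.0 out of 100
```

As the formula shows, every proposal other than the lowest-priced one receives a proportionally reduced share of the 30 financial points.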

Important note:

Please make sure that you have provided all the requested materials. UNDP reserves the right to disqualify any incomplete submission.

UNDP is committed to achieving workforce diversity in terms of gender, nationality and culture. Individuals from minority groups, indigenous groups and persons with disabilities are equally encouraged to apply. All applications will be treated with the strictest confidence.