Background
The United Nations Development Programme (UNDP) Istanbul International Center for Private Sector in Development (IICPSD) has a mandate to leverage the role of the private sector in development. The IICPSD was established in Istanbul, Turkey, on the basis of the Partnership Framework Agreement signed between the Government of the Republic of Turkey and the United Nations Development Programme in March 2011. The center is one of the six global thematic centers of UNDP, advocating and facilitating the contribution of the private sector to human development and inclusive growth.
The SDG AI Lab is a joint initiative of UNDP Bureau of Policy and Programme Support (BPPS) teams and is hosted under UNDP IICPSD. The Lab has a mission to harness the potential of frontier technologies such as Artificial Intelligence (AI), Machine Learning (ML), and Geographic Information Systems (GIS) for sustainable development. The SDG AI Lab provides research, development, and advisory services in the areas of frontier technologies and sustainable development. The Lab also supports UNDP’s internal capacity-strengthening efforts to meet the increasing demand for digital solutions. To bridge the talent gap in the use of frontier technologies in development contexts, the Lab mobilizes volunteer data scientists, connecting UNDP teams with highly skilled data scientists to address development challenges with digital solutions.
Under the guidance and direct supervision of the IICPSD’s Technical Specialist, the Data Science Analyst will develop and implement ML/NLP-based digital solutions. The Analyst will also manage the recruitment and engagement of online volunteer data scientists in the Lab’s ML/NLP projects using Scrum methodology and Agile project management practices.
Duties and Responsibilities
Summary of key functions:
- Conduct research on the application of Machine Learning (ML) and Natural Language Processing (NLP) approaches to development issues.
- Develop Machine Learning (ML) and Natural Language Processing (NLP) solutions to tackle issues in the field of economic and international development.
- Support the team in developing and implementing ML and NLP solutions.
- Collaborate with partners in identifying their needs and translating them into technical requirements.
- Contribute to Data Visualization workstream projects.
- Support the SDG AI Lab Volunteer Developers community through effective engagement and management of online volunteers.
Manage development and implementation of ML/NLP projects for SDGs:
- Work closely with project stakeholders to identify their needs and preferences.
- Conduct literature review to identify suitable technical solutions and best practices.
- Contribute to the development of Concept Notes, Terms of References and other project documentation.
- Manage and document the complete ML/NLP project lifecycle from conception to delivery.
- Apply advanced natural language processing techniques to build, maintain, and improve analytics, and organize large datasets to extract actionable insights from them.
- Train and evaluate natural language processing models.
- Provide analytical briefs summarizing project findings and methodologies in a manner accessible to the general public.
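The training and evaluation duties above can be illustrated with a minimal, hypothetical sketch using scikit-learn, one of the libraries named in the requirements below. The texts and labels here are invented purely for demonstration and do not represent any UNDP dataset.

```python
# Illustrative sketch only: train and evaluate a small text classifier.
# All data below is made up for demonstration purposes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

texts = [
    "expand access to clean water in rural areas",
    "microfinance loans for small business owners",
    "solar panels reduce household energy poverty",
    "vocational training improves youth employment",
    "irrigation systems raise smallholder crop yields",
    "digital payments widen financial inclusion",
]
# Toy labels: 0 = infrastructure theme, 1 = economic-inclusion theme
labels = [0, 1, 0, 1, 0, 1]

# Hold out a small evaluation split
X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.34, random_state=42, stratify=labels
)

# Pipeline: TF-IDF features feeding a logistic regression classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on the held-out split
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Held-out accuracy: {accuracy:.2f}")
```

In practice, the duty would involve far larger corpora and richer models (e.g. transformer-based ones), but the fit/predict/score loop shown here is the core workflow being described.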
Provide advisory and technical support to Data Visualization workstream:
- Use data visualization tools such as Power BI and Tableau to support the development of digital products such as data dashboards and reports.
- Organize and visualize raw data, aggregated data and other types of data.
- Work closely with project stakeholders to consult on their needs and the data available.
Contribute to the Lab’s Volunteer Data Scientists Initiative by advancing volunteer engagement in SDG-related projects and managing volunteer performance:
- Support recruitment of online volunteer data scientists.
- Establish and manage volunteer teams.
- Oversee project sprint planning and implementation as a Scrum Master.
- Make substantive contributions to the SDG AI Lab Volunteer Community.
- Organize trainings for the Volunteer Developer Community on relevant topics.
Competencies
Core:
- Achieve Results: LEVEL 1: Plans and monitors own work, pays attention to details, delivers quality work by deadline
- Think Innovatively: LEVEL 1: Open to creative ideas/known risks, is pragmatic problem solver, makes improvements
- Learn Continuously: LEVEL 1: Open minded and curious, shares knowledge, learns from mistakes, asks for feedback
- Adapt with Agility: LEVEL 1: Adapts to change, constructively handles ambiguity/uncertainty, is flexible
- Act with Determination: LEVEL 1: Shows drive and motivation, able to deliver calmly in face of adversity, confident
- Engage and Partner: LEVEL 1: Demonstrates compassion/understanding towards others, forms positive relationships
- Enable Diversity and Inclusion: LEVEL 1: Appreciate/respect differences, aware of unconscious bias, confront discrimination
Cross-Functional & Technical competencies
Required Skills and Experience
- Master’s or higher degree in Computer Engineering, Computer Science, Statistics, Econometrics, Mathematics, Economics, or a related field, OR a Bachelor’s degree in one of these fields, is required.
- Minimum 0-1 year of experience in research and development in the field of data science, data analysis, data engineering, machine learning, or other related areas (applicable to candidates holding a Master’s degree) OR a minimum of 2 years of such experience (applicable to candidates holding a Bachelor’s degree).
- Proven knowledge and expertise in the development and implementation of machine learning projects, specifically with Scikit-learn, TensorFlow, or PyTorch.
- Experience in the development and implementation of natural language processing projects, specifically with NLTK, spaCy, BERT, GPT-3, or RoBERTa.
- Hands-on experience in data science and data visualization projects using Power BI/Tableau.
- Demonstrated knowledge of Agile project management and Scrum methodology.
- Strong proficiency in programming languages, preferably Python or R.
- Comprehensive knowledge of SQL and data modelling is required.
- Experience in web development (e.g. HTML/CSS, React, JavaScript, Django) is an asset.
- Familiarity with AWS technologies or Google Cloud is an asset.
- Experience with Big Data (Hadoop, Spark, Hive) is an asset.
- Participation in Data4Good projects is an asset.
- Experience in recruitment and management of remote teams is an asset.
- Experience with UNDP is an asset.