Geraldina Villalobos Quezada PhD
Email address: **********************@*****.***
Skype: gvillalobosquezada
LinkedIn:
https://www.linkedin.com/in/geraldina-villalobos-quezada-b617946b/
Education:
● Doctor of Philosophy (PhD) in Educational Leadership with specialization in Evaluation, Measurement, and Research (Monitoring & Evaluation), Western Michigan University, Kalamazoo, Michigan, USA, December 2005.
Dissertation: Performance Evaluation Models for Strategic Decision-Making in the Health and Social Service Sectors: Towards a Hybrid Model. A review of six case studies on the Balanced Scorecard and the CIPP Evaluation Model.
*ProQuest Most-Accessed Dissertations and Theses June 2016
● Master of Arts (MA) in Public Health Education, Western Michigan University, Kalamazoo, Michigan, USA, April 2001.
Professional Certifications/Awards:

ACHIEVEMENTS:
● More than 20 years of professional experience in evaluation, monitoring, research, and related systems and capacity strengthening, working with the UN system, the public sector, private consultancies, and non-profit organizations.
● Expert in international development, specializing in Monitoring, Evaluation, Reporting, and Learning (MERLA) and applying M&E concepts, principles, and policies. Supported program staff, national government partners, and other organizations to design and implement complex projects and initiatives and to support evaluations.
• Relevant experience working with UNICEF LAC in the design, effective conduct, and uptake of evaluations, and expertise in developing, leading, managing, and quality-assuring evaluations.
• Provided technical guidance and quality assurance for evaluation activities in UNICEF Country Offices in Latin America and the Caribbean and, working in close collaboration with Country Office evaluation staff, contributed to the design, effective conduct, and uptake of high-quality evaluations.
• Co-managed Multi-Country Evaluations and Strategic Country Programme Evaluations.
• Supported evaluation capacity building of UNICEF Country Office staff and national partners, as well as the institutionalization of the evaluation function in UNICEF Country Offices through prioritization, planning, and monitoring.
• Conducted quality assurance and provided methodological advice on evaluations.
• Contributed to UNICEF evaluation efforts and to knowledge dissemination and utilization, including communication of UNICEF LAC evaluation findings (e.g., evaluation briefs, infographics, and synthesis reports of lessons learned and recommendations).
● Experience in managing, designing, and/or implementing evaluative exercises for UNICEF and USAID.
● Experience in international development work at CDC, UNICEF, and USAID; data analytics and data visualization skills (Tableau and Qualtrics software); database and AI applications; and experience with M&E plans and platforms. Provided technical assistance on M&E and on data collection and reporting for key performance indicators (KPIs).
● Experience in quality improvement (including the GEROS quality assurance system) for Regional and Country Programme evaluations, as well as project management and research.
● Supported country programme managers in all aspects of programme implementation to ensure timely and accurate completion of deliverables.
● Experience managing and supporting administrative aspects of programs, including subawards, tracking program expenditures, and monitoring program budgets, schedules, and timelines.
● Strong experience with the analysis, development, and evaluation of migration and refugee policies at the local, national, and international levels.
● Experience applying a Gender-Based Analysis (GBA) lens. Contributed to the analysis and review of evaluations from a gender perspective, reviewing projects to determine:
-Whether and how gender equality is covered by the specific CDC programs and projects.
-Whether and how CDC programs and projects incorporate gender-responsive and equity-focused evaluative evidence and, if not, what the possible barriers are.
-Whether perspectives from marginalized voices are included in the CDC Program Evaluations.
-How evaluative evidence on gender can be strengthened.
● Experience in developing and implementing results-based management tools, including logic models, logframes, and performance measurement tools.
● Strong experience writing evaluation reports, including data summaries and recommendations for future program improvement.
● Knowledge Management Expert. More than 10 years of experience supporting the integration of practices that enable knowledge capture and transfer, collaboration, organizational learning, and strategic learning.
● Training/Capacity Building Specialist. Supported training and capacity-building initiatives, responsible for supporting capacity-building teams in instructional design, delivery, and capacity-building trainings.
● Experience with training and capacity building in MEAL, and with the development and implementation of systems for data collection, analysis, and use.
● Technical content experience in Peacebuilding and Democracy, Health Systems Strengthening, HIV, Nutrition and Gender, Child Protection and Migration, Women's Empowerment, Emergency Preparedness and Response, COVID-19 Responses, and Energy and Climate Change.
● Strong supervisory and project management experience. Led and supervised research and M&E teams and working groups. Developed project evaluations, including preparation and implementation of evaluation plans from start to finish, data collection, and analysis of large databases. Assisted managers with strategies, plans, and indicators; data collection and information management tools and training; quality assurance mechanisms; scopes of work and implementation of work plans to meet identified needs; and requests for proposals, proposal screenings, and contracts for work. Experience managing research and evaluation projects as Project Director for public health and international development projects, including evaluations of child-focused interventions in immigration and child protection systems, food security, nutrition, education, health, and emergencies, as well as a Real-Time Assessment of the response to COVID-19.
● Experience working in Africa, Latin America and the Caribbean.
● Project Management Professional (PMP) certified.
● Kaplan-Norton Balanced Scorecard Certification, Building Dashboards: The New Science, The Palladium Group, San Diego, CA, October 2004.
● Certified in Leadership Educators Program, Harvard University, Boston, MA, May 2001.
Key Qualifications and Experience:
I have provided consulting services for various for-profit and non-profit organizations, including:
UNICEF
USAID
NASA Office of Education
Centers for Disease Control and Prevention (CDC)
W.K. Kellogg Foundation.
Western Michigan University
KPMG Peat Marwick
Professional Activities:
Outcome Harvesting Senior Consultant (2024-2025), 40 hrs per week
Provided technical and managerial oversight and capacity building on the implementation of Outcome Harvesting methodology.
Played a key role in designing, implementing, and managing outcome harvesting processes for projects and programs. Guided the identification, collection, and analysis of outcomes and impacts to assess the effectiveness of interventions and inform evidence-based decision-making. Used evidence from outcome harvesting to strengthen the organization's ability to measure, report on, and improve organizational performance and effectiveness.
Collaborated in the design of a Systems Harvesting Practitioner’s Guide and Implementation Toolkit, including a blended approach using Systems Practice, Outcome Mapping, and Outcome Harvesting.
Key Tasks and Responsibilities included:
• Led the design of the outcome harvest by mapping current outcome harvesting processes and identifying key gaps in the process.
• Built capacity within the organization to ensure effective implementation of outcome harvesting processes.
• Provided technical guidance and training to Country Office (CO) teams and stakeholders on outcome harvesting methodologies, including the use of tools and techniques.
• Developed outcome harvesting data collection strategies, tools, outcomes, and guidelines to facilitate the outcome harvesting process.
• Coordinated the collection of qualitative and quantitative data using outcome harvesting techniques.
• Analyzed country data to identify outcomes, impacts, and key lessons learned.
• Collaborated with monitoring and evaluation teams to integrate outcome harvesting into broader monitoring and evaluation frameworks.
• Prepared high-quality reports, presentations, and other communication materials to effectively communicate outcome harvesting findings to diverse audiences.
• Substantiated outcomes through third-party sources.
• Contributed to knowledge-sharing initiatives and ensured that lessons learned from outcome harvesting were disseminated within the organization and to external stakeholders.
• Facilitated learning sessions with CO staff to review outcomes harvested, reflect on preliminary findings, and develop preliminary conclusions about the influence of interventions on changes in Resilience Capacities and Areas of Flourishing.
• Identified information needs, respondents, and questions focusing on change and influence.
• Delivered a catalog of outcomes harvested and a brief of preliminary findings and conclusions.
UNICEF
As a Senior Evaluation Consultant at UNICEF (2020-2023), 40 hrs per week
Supported UNICEF in different evaluation efforts and activities, including:
Management of evaluations
a. Supported the Country Offices and national partners to develop and implement a plan of evaluation activities.
b. Supported Country Programme Offices (COs) to scope and draft evaluation TORs.
c. Supported the management and/or backstopping of specific regional or country level evaluations.
d. Provided guidance to the country office to ensure that evaluation findings and lessons learned are incorporated in planning, and reporting documents.
e. Reviewed outputs and outcomes across the country and regional programmes, and provided guidance and support on evaluation performance indicators and on adaptive measures when these indicators were not on track.
-Assessed sustainability of country and regional programmes based on conditions to scale.
-Captured MEL approaches that were employed, what worked/didn’t work and why, recommendations and lessons learned for future programming.
-Provided recommendations on ensuring sustainability, including lesson learned to inform future programming.
•Provided in-country technical M&E support to programs as needed
• Supported programs in determining how to use their data for program improvement and decision-making, and encouraged the use of data for decision-making.
Evaluation Conduct and Use
Supported the Country Offices in ensuring that UNICEF-supported evaluations were designed and implemented to meet established UN quality standards, resulting in useful and high-quality evaluation reports.
Assisted Country Offices and their partners to formulate Terms of Reference and evaluation designs of high quality. Supported the recruitment of qualified consultant evaluation teams following UNICEF procedures and guidance.
a.Supported the conduct of evaluative exercises, including:
• Planning
• Preparation
• Data collection and analysis
• Reporting
b. Monitored the implementation of Multi-Country, Country, and Thematic evaluations, provided oversight of the deliverables, and managed quality assurance and the inputs from the Steering Committee and Reference Group into the quality assurance process.
Quality assurance
Provided technical support for the quality assurance of regional and country-level evaluation deliverables, to ensure that evaluation teams delivered high-quality reports and that evaluations complied with UNICEF and UNEG standards and guidelines.
a. Conducted quality reviews of evaluation documents and quality assurance of key evaluative deliverables, including terms of reference, inception reports, and draft evaluation reports.
Evaluation dissemination and use
Supported evaluation efforts at the Country Programme level, working with CO senior management to ensure that evaluation findings were taken up and communicated to key stakeholders and the wider public.
a. Led the development of evaluation-related communication documents (evaluation briefs, case studies, newsletters, etc.)
b. Provided support to ensure that evaluation management responses fully address the recommendations.
c. Provided support for the identification and dissemination of good practices, innovations and knowledge on evaluation related matters.
d. Provided support for the identification of additional opportunities for evaluation evidence use.
Capacity development of RO/CO staff
a. Provided support for evaluation capacity development of UNICEF CO staff by improving evaluation planning and management processes at the country level.
Provided management, technical and strategic oversight, and quality assurance for UNICEF LAC country and regional evaluation projects, and developed TORs for the following Multi-Country, Regional, Country Programme, and Thematic Evaluations:
Supported Country Offices in the management and implementation of the Children on the Move (COM) Multi-Country Evaluation 2017-2021 (Evaluación de las intervenciones de UNICEF en materia de migración en México, Guatemala, Honduras y El Salvador, Informe Multipaís).
Supported the evaluation of UNICEF's Response to the Venezuela Outflow in Brazil, Colombia, Ecuador and Peru between 2019-2021 (UNICEF Venezuela).
Supported UNICEF's Real-Time Assessment (RTA) of the ongoing response to COVID-19 at country level in the Latin America and Caribbean (LAC) region, 2020-2021.
• Assisted Country Offices and their partners to formulate high-quality Terms of Reference and evaluation designs; examples include:
(1) ToR and Inception Report, PLANEA (Argentina, 2022).
(2) Country Programme Development (CPD) Final Evaluation (CPD Argentina, 2016-2021); TOR, Cash Plus Pilot Programme Assessment (Belize, 2022).
(3) Evaluación del Modelo Demostrativo Municipal de Desarrollo Integral de la Primera Infancia (DIPI, 2018-2021).
(4) TOR, Urban WASH Evaluation/Evaluación del Programa del Modelo de Saneamiento Descentralizado Urbano (PSSDU) implementado en UNICEF Bolivia 2018-2022.
(5) UNICEF Brazil Country Programme Evaluation (CPD Brazil, 2016-2021).
(6) TOR, Estudio de Caso de la Contribución de UNICEF al Subsistema Chile Crece Contigo (UNICEF Chile, 2018-2021).
(7) Children on the Move (COM) Multi-Country Evaluation 2017-2021/Evaluación de las intervenciones de UNICEF en materia de migración en México, Guatemala, Honduras y El Salvador, Informe Multipaís; Inception Report, Evaluación del Programa de País (CPD) UNICEF Costa Rica 2018-2022.
(8) TOR and Final Evaluation Report, Evaluación del Marco de Asistencia de las Naciones Unidas para el Desarrollo 2018-2022 en la República Dominicana.
(9) Inception Report, CPD Evaluación del Programa de País 2019-2022, UNICEF Ecuador.
(10) TOR, Evaluación formativa del proyecto “Entornos Protectores de Aprendizaje y Crianza” (El Salvador, PLANE 2019-2023).
(11) Evaluation of the Implementation of the Child Friendly Schools/Effective Schools Framework in Three Eastern Caribbean Countries (UNICEF ECA, 2021).
(12) Evaluación Formativa del Programa Amor (PAMOR) para los más Chiquitos MINSA (UNICEF Nicaragua, 2015-2019).
(13) Evaluación Formativa Centrada en la Generación de Cambios del Programa de las Consejerías de las Comunidades Educativas (UNICEF Nicaragua, 2015-2020).
(14) TOR, Draft, and Final Evaluation, UNICEF's Response to the Venezuela Outflow in Brazil, Colombia, Ecuador and Peru between 2019-2021 (UNICEF Venezuela).
(15) TOR, Evaluación formativa de medio término del programa “Apoyo a la estrategia nacional para la protección de los derechos humanos de las niñas y los niños” (UNICEF Guatemala, 2015-2021).
(16) Draft and Final Evaluation, An Evaluation of UNICEF-Supported Child Advocacy Centres in Guyana (UNICEF Guyana, 2021).
(17) UNICEF Haiti Country Programme Evaluation (UNICEF Haiti CPD, 2017-2021).
(18) Evaluation finale du projet « eau, assainissement et hygiène au bénéfice des enfants des départements de l’Artibonite et du Centre » Haiti 2014-2019.
(19) Draft Evaluation Report, Evaluación de la Implementación del Programa de Inclusión Social de UNICEF en Apoyo a la Implementación del SIGADENAH (UNICEF Honduras, 2017-2021).
(20) Inception Report, Evaluation of the Child Protection System in Jamaica (UNICEF Jamaica, 2021).
(21) TOR, Draft, and Final Evaluation Report, Irie Classroom Toolbox Implementation Evaluation 2019-2022 (UNICEF Jamaica, 2022); Draft and Final Evaluation Report, CPD Peru, Evaluación Programa País 2017-2021 (UNICEF Perú).
(22) Mid-Term Evaluation “De Adolescentes, Educación y Derechos para las y los Adolescentes Marginados en el Perú” (UNICEF Peru, 2021).
(23) ToR. A Real-Time Assessment (RTA) of UNICEF’s contributions to COVID-19 vaccine roll out and immunization programme strengthening at the regional and national levels in Latin America and the Caribbean region 2021-2022.
(25) ToR. A Real-Time Assessment (RTA) of the UNICEF ongoing response to COVID-19 at country level – Latin America and Caribbean (LAC) region 2020-2021.
(25) Final Evaluation Report, Formative Evaluation of the From Classroom to Differentiated Education Programme: Improving Enrolment, Learning and Transition from Frontal to Differential Learning 2018-2022.
(26) Final Evaluation Report. Evaluación de la “Tarjeta Alimentar” (TA), componente central del Plan Argentina contra el hambre (PACH), 2021.
(27) ToR. CPD UNICEF Nicaragua 2019-2023 (UNICEF Nicaragua).
(28) ToR, Draft, and Final Evaluation Reports. Evaluation of Child Advocacy Centres, 2015-2021.
(29) Informe Borrador, Modelo de intervención para sanciones no privativas de libertad para adolescentes en Uruguay: evaluación de diseño e implementación (UNICEF Uruguay).
(30) Valoración en Tiempo Real de la Respuesta de UNICEF a COVID-19 en la República Bolivariana de Venezuela (UNICEF Venezuela, 2021).
(31) Evaluation of the Child Protection System in Jamaica, 2022.
As a Monitoring, Evaluation, and Learning (MEL) Specialist at USAID (2017-2019), 40 hrs per week
Provided services to USAID portfolio in areas such as peacebuilding, resilience, environment, food security, nutrition, water security, sanitation, hygiene, climate, infrastructure, and energy. Provided support and tools to capture, share, and access knowledge in support of adaptive management across the entire portfolio and its efforts to protect and accelerate development progress in partner countries by addressing issues such as agriculture-led growth, energy, climate change, biodiversity conservation, pollution, sustainable urbanization, resilience, nutrition, and water security.
Provided technical and managerial oversight and capacity building of counterparts in the monitoring and evaluation of the effectiveness of energy programs and environmental services.
Supported the design and evaluation of project-supported interventions that promote private sector investment in renewable energy and nature-based solutions.
Conducted program document review with the technical expert and participated in the activities and deliverables required to successfully implement the evaluation activities as outlined in the approved work plan.
My core functions as an Evaluation Specialist included:
a. Supported the preparation of high-quality reports and other program documents – including drafting, commenting, and revising reports, evaluation, workplan, and learning / process documents.
b. Participated in several two-day launch workshops in Mexico.
c. Developed a complete understanding of what is being done through each mechanism and the results achieved beyond the reports generated by standard performance indicators (“results assessment”).
d. Prepared a work plan, in collaboration with team members, and undertook a results assessment.
e. Identified key M&E questions for the assessment of projects and developed questionnaires and protocols.
f. Developed protocols for recording results of document reviews, interviews and site visits.
g. Conducted program document review and prepared report on findings, in coordination with technical expert.
h. Reviewed all project documents in depth; mapped activities and beneficiaries; and identified reported results, key design issues, and key M&E issues.
i. Conducted interviews and site visits, in coordination with the technical expert and USAID/ Technical Officer, with five implementing partners of the current USAID programs.
j. Developed metrics and measurement tools: proposed, validated, and finalized key indicators on adaptive management, beyond the standard indicators in current use, to pilot with select USAID projects to measure results.
k. Drafted the assessment report in close collaboration with the Project Director, including preparing and leading in-person and virtual presentations and debriefs to stakeholders as requested.
Collaborative Learning Designer:
Promoted organizational learning by managing systems for capturing and sharing results, lessons learned, and best practices.
Facilitated knowledge exchange across USAID teams and partners, enhancing program quality and effectiveness.
Promoted a culture of learning, reflection, adaptation, and evidence-based decision-making throughout the organization.
Provided CLA coordination support by planning and staging organizational reflection and learning opportunities for external stakeholders and technical teams, to foster internal discussion, reflection, learning, and adaptation. Guided implementing partners' teams in incorporating collaborating, learning, and adapting (CLA) approaches and practices to support the purpose of energy activities and to set them up for successful adaptive management throughout implementation, so that these teams could learn from and manage activities adaptively, adopt CLA practices in their work, and achieve better outcomes, by asking:
• How did this activity contribute to its Project Purpose? What kind of collaboration with other activities under the project was required?
• With what other projects or donor activities was it important for this activity to coordinate and align?
• Which elements of the CLA Framework were most important to emphasize in this activity to ensure its success?
• How will the monitoring, evaluation, and learning (MEL) requirements better support collaboration and adaptive management?
Coordinated documentation and learning for USAID core case studies on adaptive management, and supported an Adaptive Management community of practice.
Senior Evaluation Consultant to NASA Office of Education (2016-2017), 20 hrs per week
Served as Lead Evaluator on NASA project 21CTS, a FILMMS project; activities included:
Developing pre-post tests, evaluating lesson plans, reviewing results from focus groups, and writing the evaluation report.
Conducted the NASA Network of States Year 1 and Year 2 evaluation. Collaborated with NASA Office of Education Team to conduct an evaluation of the NASA Network of States project led by the National Aeronautics and Space Administration (NASA).
Collaborated in the development of a detailed evaluation plan based on the high-level scope developed by NASA and an existing internship logic model, covering: evaluation questions and design; description of the specific program activities and anticipated outcomes based on existing research evidence and the logic model; sampling strategy; strategy for engaging stakeholders; data collection methods, instruments, and analysis methods; approach to informed consent and protection of human subjects (as appropriate); design issues and risk mitigation strategy; data collection schedule and overall project timeline; and reporting, including a proposed table of contents for each major report deliverable.
Collaborated with NASA on developing the Network of States evaluation and performance indicators.
Collaborated on communicating evaluation results and identified needs and suggestions for improvement in the activities through different presentations to NASA project staff.
Collaborated in the data collection and qualitative analysis, as well as the development and writing of the Final Evaluation Report for Year 1 of the NASA Network of States collaborative partnership activity project.
Conducted a meta-evaluation of the MUREP Aerospace Academy (MAA). The purpose of the evaluation was to synthesize MAA site progress in achieving MUREP goals and objectives; identify common measures/metrics, instruments, and data sources that each individual grantee was using; and make recommendations for providing technical assistance to build MAA site evaluation capacity, to better support future synthesis evaluations of the MAA program. The review of site evaluation plans identified areas of technical assistance and support and informed the development of a TA Plan.
As a Senior Evaluator at The Centers for Disease Control and Prevention (CDC) (2010-2015), 40 hrs per week
I was a member of The Community Guide Branch (CGB), with the Division of Public Health Information Dissemination (DPHID) in the Center of Surveillance, Epidemiology, and Laboratory Services (CSELS) at the Centers for Disease Control and Prevention (CDC) in Atlanta.
Collaborated with The Community Guide at CDC as project director to conduct an evaluation of the relevance, effectiveness, and impact of The Community Guide's activities and products.
Collaborated with CSELS to conduct systematic reviews. Assisted in conducting systematic reviews of scientific studies on different public health topics in order to identify all relevant studies, assess their quality, and summarize the evidence; this included identifying, screening, appraising, and analyzing the literature on the effectiveness or cost-effectiveness of various public health interventions.
Communicated the results of these reviews in diverse formats to diverse audiences, including CPSTF members and federal and non-federal partners of the CPSTF.
CDC, Office of Public Health Preparedness and Response (OPHPR), Research Scientist. Served as Project Lead and provided guidance on monitoring and evaluation activities to the OPHPR Director and program staff at CDC. Supported the Preparedness and Emergency Response Research Centers (PERRCs) Evaluation in 2011-2013 and the Translation of PERRC Research into Policy and Practice Tools.
Specific tasks included:
1. Provided Strategic Results Support to CDC, OPHPR
• Contributed to developing substantive/technical methodology and tools for monitoring and evaluation, in concert with the CDC Evaluation Unit, for the development of monitoring and evaluation policies, standards, and guidance.
• Supported senior management to ensure that evaluation findings were useful, and assisted with uptake and communication of the findings to key stakeholders and the wider public.
• Contributed to providing technical support to CDC Program Director and senior managers, to ensure their use of performance and impact measurement methods and indicators;
• Contributed to the implementation of results-based monitoring and evidence- based programming plans; and
2. Capacity Development
• Contributed to the development and adaptation of tools and guidance as well as training materials and manuals in monitoring and evaluation
• Contributed to building the capacity of CDC staff in the adoption and use of RBM techniques and tools;
3. Evidence and Knowledge Development and Dissemination
• Contributed to the development and dissemination of evaluation and policy briefs and to the analysis, synthesis, and dissemination of state-of-the-art knowledge and new approaches to monitoring and evaluation practice
• Contributed to maintaining a roster of evaluators and supported CDC country offices in the formation of evaluation teams
• Supported the identification and dissemination of good practices, innovations, and knowledge on evaluation-related matters
• Contributed to monitoring, analyzing, and evaluating country evaluation reports, synthesizing evaluation findings, and identifying and disseminating lessons learned in monitoring and evaluation to improve effectiveness and efficiency
4. Quality Assurance on Evaluation
• Contributed to ensuring that evaluations meet high professional standards in line with CDC Evaluation Policy, Evaluation Norms & Standards;
• Contributed to monitoring and evaluation processes and quality assessments in all areas of work under the CDC Division's responsibility;
Held overall accountability and responsibility for PERRC research project awards and for the implementation, management, and quality control activities of each research project, and assisted with the financial performance of PERRC Research Centers at CDC, including billability, burn rates, and metrics.
Provided oversight of the technical soundness of PERRC deliverables and ensured the quality of the final output of each research project under my direct management.
Provided expert advice to the CDC OPHPR Extramural Research Director, senior program staff, and PERRC projects on M&E planning and methodology for the PERRC mid-project review.