Quality Assurance Framework for Implementing Capacity Building Programs

Submitted by jyuan@worldbank.org on Tue, 01/10/2017 - 11:08
Quality Assurance Framework for Implementing Capacity Building Programs
(Work in Progress- August 2014)
© 2014 Regional Centers for Learning on Evaluation and Results (CLEAR)
1818 H St., NW
Washington, DC 20433
http://www.theclearinitiative.org
The Regional Centers for Learning on Evaluation and Results
The Regional Centers for Learning on Evaluation and Results (CLEAR) is a global initiative that aims to help developing countries strengthen their capacity for monitoring and evaluation (M&E) and performance management (PM).
CLEAR responds to increasing government and civil society demands for practical and applied M&E and PM capacity-building services by supporting competitively selected regional institutions to host Centers that offer demand-driven and cost-effective services specific to each region. CLEAR works with six regional centers located across Africa, East Asia, Latin America, and South Asia. CLEAR is supported by nine donor partners and its Secretariat is based at the Independent Evaluation Group (IEG) of the World Bank.
CLEAR also aims to support and contribute to cross-country collaborations, to learn from experience, disseminate best practices and lessons learned, and to identify and disseminate practical M&E and PM knowledge.
ACKNOWLEDGEMENTS
These Quality Assurance Framework guidelines were drafted on the basis of an extensive review of the literature on capacity development and related quality assurance frameworks. A CLEAR task force provided relevant information and guided the work, including: El Hadji Gueye (Coordinator, CLEAR Francophone Africa Center at the Centre Africain d’Etudes Superieures en Gestion), Diva Dhar (Associate Director of Evaluation Training, CLEAR South Asia Center at the Abdul Latif Jameel Poverty Action Lab at the Institute for Financial Management and Research), Min Zhao (Deputy Director at the CLEAR Asia Pacific Finance and Development Center), Stephen Porter (Director of the CLEAR Anglophone Africa Center at the University of the Witwatersrand), Cristina Galíndez (Executive Coordinator, CLEAR Center for Spanish-speaking Latin America), Kellie Plummer (Evaluation Manager, Australian Government, Department of Foreign Affairs and Trade), Elizabeth Robin (Head of Evaluation Capacity and the Quality Group at the U.K. Department for International Development), and Nidhi Khattri (Head, CLEAR Secretariat).
The guidelines were drafted by Nidhi Khattri and Cristina Galíndez, with support from Arianne Wessal. The background literature review and information collection was conducted by Margarita Clara Castelli.
The purpose of this Quality Assurance Framework is to assist the CLEAR centers in ensuring that the capacity building services they provide meet high quality standards.
TABLE OF CONTENTS
ACKNOWLEDGEMENTS
GLOSSARY
INTRODUCTION
QUALITY ASSURANCE CONCEPTUAL FRAMEWORK AND PROCESS
A. QA Framework
B. Components of the QA Framework
Table 1: QA GUIDING QUESTIONS AND RESPONSIBILITIES, BY CATEGORIES OF CD
Table 1-A: QUALITY ASSURANCE PROCESS FOR PROGRAMMATIC CAPACITY DEVELOPMENT
Table 1-B: QUALITY ASSURANCE PROCESS FOR LEARNING PROGRAMS
Table 1-C: QUALITY ASSURANCE PROCESS FOR PUBLICATIONS AND KNOWLEDGE PRODUCTS
C. CLEAR Forms
Bibliography
ANNEX 1 – QUALITY ASSURANCE TOOLS AND FRAMEWORKS OF MULTILATERAL AND BILATERAL DEVELOPMENT AGENCIES
OECD
GIZ’s Quality Management Framework
AusAID and ADB’s Quality Assurance Framework
Annex 1-A: Quality Assurance of Trainings
Annex 1-B: Quality Assurance of Advisory Services
Annex 1-C: Quality Assurance of Evaluation Processes and Reports
AusAID’s Performance Assessment Framework (PAF)
DfID’s Quality Assurance (QA) Guidelines
The European Commission’s (EC) quality guidelines for their evaluation processes and reports
ANNEX 2 – CLEAR CAPACITY DEVELOPMENT ACTIVITIES
ANNEX 3 – TERMS OF REFERENCE FOR CLEAR REGIONAL ADVISORY COMMITTEES
GLOSSARY
AFDC Asia-Pacific Finance and Development Center
AusAID Australian Agency for International Development
ADB Asian Development Bank
AfDB African Development Bank
CD Capacity Development
CESAG Centre Africain d’Etudes Supérieures en Gestion
CIDE Centro de Investigación y Docencia Económicas
CLEAR Centers for Learning on Evaluation and Results
DfID Department for International Development
EFQM European Foundation for Quality Management
GIZ German International Cooperation
GTZ German Technical Cooperation
IADB Inter-American Development Bank
IEG Independent Evaluation Group of the World Bank
ISO International Organization for Standardization
J-PAL Abdul Latif Jameel Poverty Action Lab
ODE Office of Development Effectiveness
OECD Organization for Economic Cooperation and Development
PAF Performance Assessment Framework
QA Quality Assurance
RADAR Results Approach Deployment Assessment Review
SIDA Swedish International Development Cooperation Agency
UNDP United Nations Development Program
WB The World Bank
WBI The World Bank Institute
INTRODUCTION
The CLEAR (Centers for Learning on Evaluation and Results) initiative aims to strengthen countries’ capacities in monitoring and evaluation (M&E) and performance management (PM) to achieve development results. CLEAR collaborates with selected academic/training institutions that house regional centers. The regional centers develop and provide a number of capacity building and knowledge services on M&E and PM, including: (i) learning programs (e.g., training, workshops, conferences); (ii) technical assistance or programmatic capacity development (e.g., advisory services and training combined); and (iii) knowledge products (e.g., research, publications, etc.).
The purpose of this Quality Assurance (QA) Framework is to enable the development and implementation of high quality Capacity Development (CD) services and products by institutions participating in CLEAR and others interested in ensuring the quality of their services.
The QA framework guides the conceptualization, development, implementation, monitoring, and evaluation of CD services and products to achieve their desired objectives and intended results. Section II outlines the QA framework and process in detail.
Key Definitions
Capacity Development. For the purposes of this framework, CD is defined as an intentional change process aimed at enhancing the ability of social actors/constituents to achieve their desired objectives through the acquisition of new or improved skills, knowledge, information, or understanding. The change process can be instigated through various modalities, including, for example, training, peer-learning, technical assistance, and so forth (World Bank 2005; UNDP 2014). The main categories addressed in this document are:
1. Programmatic capacity development (a bundle of coherent CD services/products [e.g., advisory services combined with a series of workshops] for one client or a group of clients and aimed at meeting a specific objective);
2. Learning programs (e.g., face-to-face workshops, e-learning, conferences, training events, knowledge-exchange fora, etc.); and
3. Knowledge products and publications (these include papers/publications for general public, customized papers, knowledge products, advisory notes for specific stakeholders, evaluations, research, etc.)
Quality Assurance. QA is defined as a set of systematically planned activities that monitor different stages of a CD service or a product to ensure that quality standards are met. It comprises review, measurement, comparison with a standard, feedback loops, and ongoing assessments in CD development and implementation. Quality assurance is different from the concept of quality control, which focuses on the quality of the end service or product, whereas QA examines and seeks to improve the processes used to create the end service or product (ESS 2010; OECD 2010; Merriam-Webster 2014).
The QA Framework
The QA conceptual framework is divided into four phases to correspond to the four stages of CD development and implementation: Plan, Do, Check, Act. These four stages are more commonly associated in QA literature with the following four quality assessment phases: Quality at Entry (Plan); Quality at Implementation (Do); Quality at Exit (Check); and the Feedback loop between Entry and Exit (Act).
The development and implementation of CD and the QA assessment phases are guided by six common international quality standards and their sub-dimensions identified in the QA literature1, namely: relevance, quality of content and design, effectiveness, efficiency, impact, and sustainability.
The framework is illustrated (Figure 1) and explained in detail in Section II.
1 These definitions are a summary of definitions used by OECD, AusAID, DfID, European Union, German International Cooperation (GIZ) and the World Bank.

QUALITY ASSURANCE CONCEPTUAL FRAMEWORK AND PROCESS
A. QA Framework
The QA framework outlined below is based on a review of the literature on QA and capacity development and the experience of the CLEAR centers’ own QA procedures2. Some of the QA frameworks referred to are summarized in Annex 1: Quality Assurance Tools and Frameworks of Multilateral and Bilateral Development Agencies.
The QA framework comprises six quality standards (relevance, quality of content, effectiveness, efficiency, impact and sustainability)3 at four CD development and delivery stages (Plan, Do, Check and Act). The stages are inter-related and guided by the overall objectives of the CD services and products.
The entire process itself is embedded in the implementing organization’s overall strategy, which delineates the types of CD services/products the organization will develop and provide, based on its long-term vision and expected results. See Figure 1.
The following section summarizes how the QA stages and quality standards are related.
B. Components of the QA Framework
The QA framework incorporates the six common international quality standards, which are:
1. Relevance: The extent to which the program components/activities are consistent with the broader objectives the program seeks to achieve. A key question to ask would be: Are we doing the right thing?
2. Quality of Content: The extent to which: a) the proposed content of the program reflects state-of-the-art knowledge and practice; b) the design and sequencing of the various components of the program are logical and optimally arranged to achieve impact. Some of the questions to ask include: Is the content accurate? Is the knowledge presented state of the art, and does it reflect the latest thinking? Is the content consistent with local needs and context, and easy to understand?
3. Effectiveness: The extent to which the objectives are likely to be achieved. A key question to ask would be: Are we achieving our objectives and how do we know?
4. Efficiency: The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program. A key question to ask would be: Are our actions cost-effective?
2 Background for developing this QA framework was drawn from literature, questionnaires administered to the CLEAR centers, and from documentation illustrating quality standards applied by the centers. The background is available as a separate document.
3 Definitions for each of the quality standards are also provided in the attached tables.
5. Impact: The extent to which the program will have a sustained/durable impact. A key question to ask would be: Are we contributing to the achievement of overarching development objectives?
6. Sustainability: The extent to which the benefits of a program/activity are likely to continue after the program/activity (and funding) have ended. A question to ask would be: Is the capacity embedded in the change processes on an ongoing basis?
These definitions are a summary of definitions used by the Organization for Economic Cooperation and Development (OECD), the German International Cooperation (GIZ), the Asian Development Bank (ADB), the Australian Agency for International Development (AusAID), the United Nations Development Program (UNDP), and the World Bank (OECD DAC 2000; GTZ 2007; ADB 2008; AusAID 2012a; UNDP 2002; WB 2004 and 2012b).
These standards are applied to varying degrees during the four stages of CD development and implementation:
(i) PLAN: The conceptualization and development of a service or product;
(ii) DO: The implementation of a service or product;
(iii) CHECK: The monitoring and evaluation of a service or product; and
(iv) ACT: The integration of lessons learned and M&E recommendations in successive CD planning and design cycles (EFQM 2012).
These four stages are associated in QA literature with the following four quality assessment phases:
(i) Quality at Entry (Plan): assessment conducted at the initial design of a CD service or product. It examines how well the CD problem, needs, and objectives are identified, whether the most appropriate CD approach was selected, and how the CD will be implemented, monitored, and evaluated.
(ii) Quality at Implementation (Do): assessment conducted during the actual implementation of a CD service. It reviews whether the CD intervention is on the right track and whether the implementation process has encountered any difficulties.
(iii) Quality at Exit (Check): assessment conducted following the completion of a CD service or product. It assesses what has been achieved in terms of CD outputs and outcomes and what CD lessons can be identified.
(iv) The Feedback loop between Quality at Entry and Quality at Exit (Act): assessment conducted during the design of the following CD cycle. It seeks to reflect and implement past CD lessons learned in the following QA cycle.
Figure 1 illustrates how the CD planning/implementation stages and the QA assessment phases are inter-related.
Figure 1: Quality Assessment (QA) Framework and Process
[Figure: The Quality Assurance process cycle. The organization’s strategy (vision/mission, CD objectives, prioritization, results chain) feeds a Plan–Do–Check–Act cycle. Quality at entry (Plan) assesses relevance, quality of content and design, efficiency, effectiveness, and impact. Implementation (Do) of capacity development and knowledge services and products (trainings, workshops, reports/publications, advisory services, seminars) is assessed for efficiency and effectiveness, supported by a customer satisfaction survey. Quality at exit (Check) assesses efficiency, effectiveness, sustainability, and impact. A feedback process (Act) closes the loop back to planning.]
The figure shows that the efficiency and effectiveness standards are systematically examined during all quality assurance assessment stages. Other quality standards such as impact and sustainability are more commonly reviewed during Quality at Entry and at Exit.
Using the framework above, the attached tables delineate the QA processes for three main categories of CD:
1. Programmatic capacity development (a bundle of coherent CD services/products [e.g., advisory services combined with a series of workshops] for one client or a group of clients and aimed at meeting a specific objective);
2. Learning programs (these include face-to-face workshops, e-learning, conferences, training events, knowledge-exchange fora, etc.); and
3. Knowledge products and publications (these include products/publications for general public, knowledge products/advisory notes for specific stakeholders, evaluations, etc.)
A further delineation of the different types of Capacity Development activities is provided in ANNEX 2 – CLEAR Capacity Development Activities.4
The tables below incorporate the six quality standards and provide guiding questions about how they should be assessed. They also highlight how the QA process can be implemented (who is responsible and the process to be followed). The tables make reference to the corresponding CLEAR form(s) which can be found in section C for each of the quality assessment phases.
4 These are also commonly referred to as Building Blocks of CLEAR’s Capacity Development Strategy. http://www.theclearinitiative.org/ECD-CLEAR-ChangeAgents-ICOs-CD-Activities_4October%202013.pdf
Based on a review of the literature, the following elements must also be considered in implementing this QA framework:
• Systematic and comprehensive needs assessments to determine the appropriate modality and content of CD products/services
• The identification of clear links between desired CD outcomes, objectives of CD clients, and the overall strategy of CD provider.
• Continuous feedback and inputs of CD clients and providers during CD processes.
• Continuous adaptation of the CD services and products, as CD lessons are identified.
• Systematic integration of previous CD lessons learned in planning and design of CD services and products.
The QA process should be embedded in the overall vision/mission and strategy of the organization. In the case of CLEAR, the development of this strategy (Form 1), encompassing the overall CD approach (ANNEX 2 – CLEAR CAPACITY DEVELOPMENT ACTIVITIES) and objectives, is led by the Center head, in consultation with Center staff and stakeholders. The Centers’ Regional Advisory Committees (ANNEX 3 – ToRs for the CLEAR Regional Advisory Committees) play a key role in advising the Center and providing broad directions to, and endorsement of, the strategy and annual work programs.
Table 1: QA GUIDING QUESTIONS AND RESPONSIBILITIES, BY CATEGORIES OF CD
Table 1-A: QUALITY ASSURANCE PROCESS FOR PROGRAMMATIC CAPACITY DEVELOPMENT
A programmatic capacity development (CD) package combines a variety of CD activities to achieve a specific objective for a given group of stakeholders (or clients). It may include a set of learning programs, advisory services, diagnostic studies, knowledge materials, and so forth. Each activity of the program (e.g., learning programs, diagnostic studies, etc.) should also undergo its own separate QA process, as it is developed and implemented in the programmatic approach.
An example of a programmatic approach is helping an agency establish its evaluation program. The programmatic approach could include a package of activities including advice to the head of the agency about options for the program, training for the staff in M&E methods, and manuals on how to implement the agency’s policies on evaluation. The overarching objective of this “package” would be to help the agency take a results-oriented approach to its programs.
PLAN: QUALITY AT ENTRY
Concept Stage
Quality Standard
Guiding Questions
Responsibility and Process
RELEVANCE
The extent to which the program components/activities are consistent with the broader objectives the program seeks to achieve and with the overall strategy of the Center.
Why is this particular program being conducted by the Center?
Is it consistent with the overall strategy of the Center’s approach to capacity development?
Is it consistent with the overall capacity development objectives of the Center?
Is it relevant for the clients/stakeholders who are the intended beneficiaries of the program?
The assessment of relevance should be conducted by the Center head with inputs from key stakeholders (such as the organization demanding the program, the Center’s RAC, and key advisors).
The review could also be incorporated into the overall strategy/programming of the Center.
Is the timing of the program appropriate?
Is it being demanded by key stakeholders/constituents?
Are the different elements/components of the program relevant for the overall objectives of the program?
Form 2: Project/Activity Planning Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
SUSTAINABILITY
The extent to which the program will directly or indirectly contribute to the participants’ needs/objectives.
IMPACT
The extent to which it will have a sustained/durable impact.
How does the program contribute to the stakeholders’ objectives and strategy? (Has a needs-assessment been completed?)
Is the program likely to have sustainable impact?
Are all elements of the program likely to contribute to impact?
Does the program enhance CLEAR’s overall goals and impact?
Development Stage
Quality Standard
Guiding Questions
Responsibility and Process
Quality of Content and Design
The extent to which: a) the proposed content of the program reflects state-of-art knowledge and practice; b) the design and sequencing of the various components of the program are logical and optimally arranged to achieve impact.
Does the further-specified program reflect state-of-the-art knowledge and practice?
Are the content elements proposed comprehensive and aligned with the objectives of the program?
Do the modalities for delivering the program reflect the best alternatives for achieving the objectives of the program?
Does the program plan include sufficient time and the right type of expertise for its implementation?
Does the design reflect lessons learned and recommendations from previous programs?
If the program has been specifically requested by an organization/group:
Were clients demanding the program consulted regarding their specific needs and objectives?
Is the overall design and sequencing of the various components consistent with the needs and objectives of the clients?
Notes:
If the program is demand-driven and intended for a specific set of audiences, then the content and design must also reflect the needs and objectives of the organization/group that requested the program.
This part of the review process will be most in-depth and intense at the beginning of the program.
Subsequent reviews could be lighter, focusing on specific issues of content and design and guided by feedback from implementation and end-of-program review (see Sections Quality of Implementation and Quality at Exit).
A review of the quality and content of the program should be coordinated by the lead task manager at the Center. The review should be conducted by a panel of experts that includes recognized content experts and professionals knowledgeable about the design and delivery of similar programs.
If the program is demanded by a specific organization, the review panel should include a representative of the organization.
Once the program has been implemented the first time, subsequent reviews can be lighter, depending on information gathered during the implementation and completion stages of the program (see Sections Quality of Implementation and Quality at Exit).
EFFICIENCY
The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program.
Which key resources are required to implement the desired program and change process?
Are sufficient resources (time, human and material) allocated given program objectives?
Center head and task team leader, with program and resource experts. (Some of this review is likely to overlap with the review of content and design.)
Are resources being used efficiently?
EFFECTIVENESS
The extent to which the objectives are likely to be achieved.
Is the local/job/professional context conducive to achieving the overall objectives of the program?
Is there a plan in place to measure the effectiveness of the program?
How will desired outputs and outcomes be monitored and evaluated?
Center task-team leader with a specialist in M&E. (This review could overlap, and be conducted in conjunction with, the review of content and design.) The M&E plan should also be included in the overall M&E plan of the Center.
DO: QUALITY AT IMPLEMENTATION [at mid-term]
Quality Standard
Guiding Questions
Responsibility and Process
QUALITY OF CONTENT AND DESIGN/RELEVANCE/EFFICIENCY
The extent to which the quality of content and design are conducive to the program’s capacity development objectives.
Review conducted for each component as it is implemented [see the QA standards for learning programs and knowledge products/publications]
Consolidated review conducted at mid-point in the program (or earlier, if it is seen to be important).
Does the information from the quality of implementation and quality at exit reviews show that:
- The quality of content and design is high?
- The program remains relevant?
- The program is being delivered efficiently?
What changes in program content and design are warranted as a result of the mid-term review?
Center task team leader, with the implementation team, by collecting and collating information on ongoing feedback collected on the specific components of the program as well as the team’s implementation notes. [See the QA standards for learning programs and knowledge products/publications.]
CHECK: QUALITY AT EXIT [after all program components have been implemented]
Quality Standard
Guiding Questions
Responsibility and Process
RELEVANCE
The extent to which the program components/activities are consistent with the broader objectives the program seeks to achieve and with the overall strategy of the Center.
Was the program relevant to the needs and objectives of its target beneficiaries?
Was it relevant for the Center’s own strategy?
Implementation task team leader administers an end-of-program evaluation form to collect information from the sponsoring/target organization(s).
This information should be combined with implementation notes to develop a full quality-at-exit report. The report should include observations on:
- The contextual factors that contributed to or hindered the success of the program
- Lessons for future programs
If no specific organization sponsored the program, quality-at-exit information from each component/activity should be compiled, together with a self-assessment by the team, and included in an end-of-program quality-at-exit report.
Form 3: Project/Activity Review Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
Form 7: Technical Assistance and Advisory Services Evaluation Form
EFFECTIVENESS
The extent to which the objectives are likely to be achieved.
IMPACT
The extent to which it will have a sustained/durable impact.
Does feedback from the participants/organizations involved in the program indicate that the program achieved its objectives?
Does the sponsoring organization’s feedback indicate that the program was useful for its overall strategy/objectives?
Are the results likely to be sustained?
EFFICIENCY
The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program.
Was the program delivered efficiently? Were resources – human, financial, and time – sufficient? Were they utilized efficiently?
ACT: IMPLEMENTATION OF LESSONS LEARNED
Quality Standard
Guiding Questions
Responsibility and Process
Feedback Loop between Quality-at-entry and Quality-at-exit
Are findings from quality at entry, implementation, and quality at exit being integrated into planning for the next phase?
Center Head and team implementing the same or similar programs
Table 1-B: QUALITY ASSURANCE PROCESS FOR LEARNING PROGRAMS
These could include face-to-face workshops, e-learning, conferences, training events, knowledge-exchange fora, etc.
PLAN: QUALITY AT ENTRY
Concept Stage
Quality Standard
Guiding Questions
Responsibility and Process
RELEVANCE
The extent to which the learning program’s objectives and anticipated outcomes are consistent with the needs and objectives of the participants, and with the strategy of the Center.
Why is this particular learning program being provided?
Is it relevant for the clients/stakeholders who are the intended beneficiaries of the program?
Are the program objectives consistent with the Center’s strategy (e.g., for stoking demand, contributing to a supportive enabling environment for evaluation)?
Is the program consistent with the Center’s approach to capacity development?
The assessment of the relevance should be conducted by Center Head and advisers to the Center. The review could also be incorporated into the overall strategy/programming of the Center.
Form 2: Project/Activity Planning Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
IMPACT
The extent to which it will have a sustained/durable impact.
SUSTAINABILITY
The extent to which the learning program will directly or indirectly contribute to the participants’ needs/objectives
How does the learning program seek to contribute to the participants’ objectives and strategy?
Is the learning program likely to have sustainable impact?
Does the learning activity enhance CLEAR’s overall goals and impact?
Development Stage
Quality Standard
Guiding Questions
Responsibility and Process
QUALITY OF CONTENT AND DESIGN
The extent to which: a) the content of the learning program is up-to-date, reflecting state-of-the-art knowledge and practice; and b) consistent with pedagogical approaches appropriate for the intended participants.
Notes:
If the learning program is demand-driven and intended for a specific set of audiences, then the content and design must also reflect the needs and objectives of the organization/group that requested the program.
This part of the review process will be most in-depth and intense the first time the program is designed and implemented. Subsequent reviews could be lighter to focus on specific issues of content and design, not all, and be guided by feedback from participants and implementation review (see Sections Quality of
Does the content reflect state-of-the art knowledge and practice?
Is the content comprehensive and aligned with the objectives of the learning program?
Does the design of the program take into consideration pedagogical approaches consistent with lessons regarding adult learners (e.g., hands-on materials, group work, examples of practical cases, etc.)?
Does the design include methods for targeting and selecting participants appropriate for the program?
Are the materials for the program well-organized, clear, and easily accessible?
Are the flow, pacing, and sequencing of the program well-organized?
Does the design reflect lessons learned and recommendations from previous programs?
A review of the quality and content of the program should be coordinated by lead task manager at the Center. The review should be conducted by a panel of experts that includes: recognized content experts, experts in pedagogy, and professionals knowledgeable about the design and delivery of learning programs.
If the program is demanded by a specific organization, the review panel should include a representative of the organization.
After the program has been piloted, the subsequent reviews could be lighter, depending on information gathered during the implementation and completion stages of the program (see Sections Quality of Implementation and Quality at Exit.)
Form 4: Participant Registration Form
There are questions in this form that can 12
Implementation and Quality at Exit).
If the Program has been specifically requested by an organization/group:
Were clients demanding the program consulted regarding their specific needs and objectives?
Are the knowledge and skills targeted consistent with the capacity needs identified by the client?
Is the overall design consistent with the needs and objectives of the clients?
help with the planning and development of the activity.
EFFICIENCY
The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program.
What key resources are required to implement the program?
Are sufficient resources (time, human and material) allocated, given program objectives?
Are resources being used efficiently?
Center head and task team leader, with program and resource experts. (Some of this review is likely to overlap with the review of content and design.)
EFFECTIVENESS
The extent to which the objectives are likely to be achieved.
Is the local context conducive to achieving the outputs, outcomes, and overall objectives of the program?
Is there a plan in place to measure the effectiveness of the program?
How will expected or desired outputs and outcomes be monitored and evaluated?
Center task-team leader with a specialist in M&E. (This review could overlap, and be conducted in conjunction with, the review of content and design.) The M&E plan should also be included in the overall M&E plan of the Center.
DO: QUALITY AT IMPLEMENTATION
Quality Standard
Guiding Questions
Responsibility and Process
QUALITY OF CONTENT AND DESIGN
The extent to which the quality of content and design are conducive to participants’ learning from the program.
Are the participants learning from the program? What is their feedback about content? About the organization of the program? About the facilitators? About the facilities?
(Ongoing monitoring during implementation)
Center task team leader, with the implementation team, by collecting and collating ongoing feedback provided by participants on the quality of content and implementation, including the relevance of content to their professional needs and their jobs. These implementation notes should be documented and integrated with the quality-at-exit review and report (see Section Quality at Exit, below.)
RELEVANCE
The extent to which the program is relevant to the participants and their organizations’ needs.
Is the program relevant to the participants’ professional needs and to their jobs?
Is the program relevant for the needs and objectives of the sponsoring organization?
EFFICIENCY
The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program.
Are resources allocated – human, financial, time – sufficient for the implementation of the program? Are they being used efficiently?
Which additional resources are required to complete the implementation of the program? Why?
Ongoing observations regarding implementation by the implementation team.
CHECK: QUALITY AT EXIT
Quality Standard
Guiding Questions
Responsibility and Process
RELEVANCE
The extent to which the program’s objectives and anticipated outcomes are consistent with the needs and objectives of the participants.
Was the program relevant to the participants’ needs and the sponsoring organizations’ objectives and strategy?
Implementation task team leader administers an end-of-program feedback form to collect information from the participants on the relevance and effectiveness of the program.
This information should be combined with the implementation notes to compile a full quality-at-exit report. The report should include observations on:
- Factors that contributed to or hindered the success of the program
- Lessons that can be drawn for the future
Form 5: Participant Activity Feedback Form
Form 6: Individual Instructor Evaluation Form
Form 8: Participant Evaluation Form (Tracer)
EFFECTIVENESS
The extent to which the objectives are likely to be achieved.
IMPACT
The extent to which it will have a sustained/durable impact.
Does participant feedback indicate that they:
• Learned from participating in the program?
• Plan to use the information/skills/knowledge they gained?
Does the sponsoring organization’s feedback indicate that the program was useful for its overall strategy/objectives?
EFFICIENCY
The extent to which resources devoted to the program (human, financial, and time) are commensurate with the objectives of the program.
Does participant feedback indicate that they are satisfied with the quality of:
• the materials
• the facilities
• the support during implementation
ACT: IMPLEMENTATION OF LESSONS LEARNED
Quality Standard
Guiding Questions
Responsibility and Process
Feedback Loop between quality at entry and quality at exit
Are findings from quality at entry, implementation, and quality at exit being integrated into planning for the next phase?
Center Head and team implementing the same or similar learning programs
Table 1-C: QUALITY ASSURANCE PROCESS FOR PUBLICATIONS AND KNOWLEDGE PRODUCTS
These could include products/publications for general public, knowledge products/advisory notes for specific stakeholders, evaluations, etc.
PLAN: QUALITY AT ENTRY
Concept Stage
Quality Standard
Guiding Questions
Responsibility and Process
RELEVANCE
The extent to which the content of the publication/knowledge product is consistent with the strategy of the Center.
The extent to which the intended audience is consistent with the strategy of the Center.
Note:
If this publication/knowledge product is demanded by a particular organization, the extent to which the topic of the product and the approach to its development are relevant to the needs/objectives of the organization.
Why is this particular publication/knowledge product being developed by the Center? What is its purpose?
Is it consistent with the overall strategy and objectives of the Center’s approach to capacity development?
If the knowledge product is demand driven:
Is the content relevant for the clients/stakeholders who are likely to benefit from the publication?
Is the approach to its development relevant to the capacity needs/objectives of the organization?
The assessment of relevance should be conducted by the Center Head, with inputs from Center staff and advisers.
The review could also be incorporated into the overall strategy/programming of the Center, and need not be conducted separately.
If the product/publication is demand-driven by a particular organization, the review should be conducted with a representative of the organization.
Form 2: Project/Activity Planning Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
Draft/Final Stage
Quality Standard
Guiding Questions
Responsibility and Process
QUALITY OF CONTENT
The extent to which the content of the product or publication adds to or enhances knowledge in the area it addresses.
Does the content reflect and add to current knowledge in the field?
Does the content address the intended purpose of the publication/product adequately?
If the product has been specifically requested by an organization/group:
Does the publication/knowledge product address the knowledge gaps identified by the client organization?
Task team leader organizes a peer-review process, with well-recognized experts in the topic covered by the knowledge product.
The publication could also go through a peer-review process of a recognized journal or publishing house.
If the program is demanded by a specific organization, the review panel should include a representative of the organization.
EFFICIENCY
The extent to which resources devoted to the publication/knowledge product (human, financial, and time) are commensurate with the depth of the work to be performed.
What key resources are required to develop the knowledge product/publication?
Have sufficient resources (time, human, and material) been allocated?
Center head and task team leader
CHECK: QUALITY AT EXIT
Quality Standard
Guiding Questions
Responsibility and Process
Quality of Content, reflected through:
• Acceptance of the publication in peer-reviewed outlets
• Peer-review comments
• Use (especially if it has been demanded by a specific organization)
Is the publication being disseminated through the website and other channels?
Is the publication (or set of publications) being used (as evidenced by downloads, use in other capacity development events, requests from Centers’ clients)?
If the knowledge product is demanded by an organization, are the recommendations/ideas contained in it being implemented?
Center Head. Plan for feedback on publications/knowledge products included in the Center’s consolidated M&E plan.
Regular data on user downloads from the website; rating system on the website.
For knowledge products/publications that are demanded by a specific organization, feedback on the quality of content and usefulness of the ideas/recommendations.
Form 3: Project/Activity Review Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
EFFECTIVENESS/IMPACT/SUSTAINABILITY
The extent to which the findings and recommendations contained in the knowledge product/publication are being used in the organization that requested the product/publication
This step may not be applicable for a non-demanded publication.
Are the findings and recommendations contained in the knowledge product/publication useful?
Are the recommendations being implemented?
ACT: IMPLEMENTATION OF LESSONS LEARNED
Quality Standard
Guiding Questions
Responsibility and Process
Feedback Loop between quality-at-entry and quality-at-exit
Are findings from quality at entry and quality at exit being integrated into planning for the next phase or program of publications/knowledge products?
Center Head and team planning future knowledge products/publications
C. CLEAR Forms
Form 1: Project Document
DATES: XX to XX
A. STRATEGIC CONTEXT AND RATIONALE
1. Regional and Sectoral Issues
2. Higher Level Objectives to which the Project Contributes
B. PROJECT DESCRIPTION
1. Project Development Objective and Key Indicators
The key objective of this project is to enhance…
The key indicators are noted below. By the end of the program:
• At least 50 percent of the clients are from …etc.
• Year-end review of the program yields satisfactory performance
Annex A provides the full results framework.
2. Project Components
The project comprises three components, as described below. Annex B provides the full set of outputs.
The three components are:
• Component one: Center developed and functional (USD XX)
• Component two: Center provides capacity development services (USD XX)
- Discussion of strategic clients and activities
• Component three: Monitoring and evaluation (USD XX)
Attach Annexes B and C
C. IMPLEMENTATION
1. Partnership Arrangements (if applicable)
Mention all existing partnerships, new ones to be created, etc.
2. Institutional and Implementation Arrangements
How the grant will be implemented – the organizational and staffing arrangements within the organization
Role of Regional Advisory Committees (RACs)
3. Monitoring and Evaluation Arrangements
How the activities will be monitored and evaluated.
4. Sustainability
Plans for sustainability through fee structure, additional clients, additional fund-raising, etc.
5. Critical Risks and Possible Controversial Aspects
A discussion of risks to the Center and the program as a whole (reputational, financial, political) and other controversial aspects of the program. How the risks will be managed.
Results framework and monitoring
(EXAMPLE)
PDO
Project Outcome Indicators
CLEAR XX provides high-quality capacity development in monitoring and evaluation (M&E) and performance management (PM) on a regional basis
By [DATE], the center and its partners provide M&E and PM capacity development services on a regional basis for at least XX countries
Other core indicators
Intermediate Outcomes
Intermediate Outcome/Output Indicators
Component One: Center for M&E established and functional
1.1 Management and Administrative Capacity
1.2 Professional Capacity
Center’s administrative capacity is established:
• By [DATE], project administration and management established
• By [DATE], center’s website and communications established and operational
Center’s professional capacity is enhanced:
• By [DATE], regional advisory committee established and first meeting held
• By [DATE], Partnership established with xx
Component Two: Center provides capacity development services
2.1 Training/workshops
2.2 Advisory Services
2.3 Evaluations/applied research
Performance-Based Budgeting
• By [DATE], two programs implemented with XX [NAME OF CLIENT]
Managing Data Collection and Analysis
• By [DATE], two hands-on training programs implemented in basic data-collection and analysis; 80 trainees
• By [DATE], a series of advisory services provided to [NAME OF CLIENT] on how to evaluate their flagship program…
By [DATE], an evaluation of XX program conducted with [NAME OF STAKEHOLDER]
Component 3: Monitoring and Evaluation Implemented
Monitoring and evaluation plan designed and implemented
Information on program implementation recorded electronically
Data on participants (disaggregated by gender, nationality, organization, and other key variables) routinely compiled
Quality of training and advisory services routinely monitored (with participant surveys data collected immediately following the capacity development service)
By [DATE], a full monitoring report provided to Secretariat
By [DATE], an independent program review commissioned
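The routine compilation of disaggregated participation data described in this component can be sketched in a few lines. This is a hypothetical illustration only: the field names and sample records below are assumptions for the sketch, not part of any CLEAR template.

```python
# Hypothetical illustration: compiling participation data disaggregated by
# key variables, as the M&E component requires. Field names and sample
# records are assumptions, not part of a CLEAR form or template.
from collections import Counter

# Example registration records (fabricated), as might be captured at sign-up.
participants = [
    {"gender": "Female", "nationality": "Kenya", "organization": "Government"},
    {"gender": "Male", "nationality": "India", "organization": "Academia"},
    {"gender": "Female", "nationality": "Mexico", "organization": "Government"},
]

def disaggregate(records, field):
    """Count participants by a single registration field."""
    return Counter(r[field] for r in records)

by_gender = disaggregate(participants, "gender")
by_organization = disaggregate(participants, "organization")
```

In practice the same counts could be produced directly from the registration data collected with Form 4.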
FULL DESCRIPTIONS OF ACTIVITIES
Form 2: Project/Activity Planning Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
(Recommended Document)
CLEAR Center:
Date:
[THIS TEMPLATE SHOULD BE USED FOR A PROJECT OR A CLUSTER OF A COHERENT SET OF ACTIVITIES THAT HAVE A COMMON IMMEDIATE OBJECTIVE AND/OR SET OF CLIENTS]
1. BASIC INFORMATION
Name or Title:
Number of Project/Service:
Project Coordinator:
Is the activity included in the original Project Document?:
Project / Activity start and end date:
2. CONTEXT
2a. Context and Relevance (Max. 200 words)
• Current situation (Situation targeted for change)
• Relevance to the Center’s workprogram, objectives, and strategic direction
• Center's comparative advantage
• Competing / complementary work on the topic by other institutions / agencies
2b. Nature of the Specific Demand for the Project / Activity (Max. 200 words)
• Nature of the demand for the activity and information about the client requesting the project/activity (Please provide evidence – communications, documents, etc. stating the demand)
• Description of clients or counterparts or main target audience
• Client’s contribution to the project / activity (The contribution could be monetary, or in kind through staff, resources, etc. If the activity does not have a unique client, please provide evidence and nature of the demand and stakeholders’ support of the activity)
• Contributions / engagement from other partner institutions and donors
3. DESCRIPTION AND LOGIC OF THE INITIATIVE
3a. Specific Objectives of the Project/Activity (Max. 100 words)
Describe the main objective of this project or activity, including (a) what situation/area is targeted for change, and (b) the nature (status, quality) and/or direction (increase, decrease) of the targeted change.
3b. Change Process/Design (Max. 200 words)
• Describe how the project / activity / sub activities (e.g., training, technical assistance, etc.) will achieve the objectives. Provide the rationale for the proposed set of activities and the audiences targeted.
• Describe the change process and assumptions by outlining how the activity fits in the theory of change for the Center
3c. Indicators, Baselines and Targets
Provide quantitative and/or qualitative data that would capture the objectives
3d. Risks and Mitigation (Max. 100 words)
• Foreseen risks that might prevent the objective from being achieved, including but not limited to political, policy-related, social/stakeholder-related, financial issues.
• Measures (if any) that the team plans to take to mitigate each risk
3e. Sustainability (Max. 50 words)
• Describe how the capacities developed are expected to be sustained after the engagement ends
4. EXPECTED IMPLEMENTATION
4a. Project or Activity expected start and completion date: (MM/DD/YYYY)
4b. Activities (or sub-activities) and deliverables and timeline
Please list concrete activities or deliverables (not all sections may apply). To add more, copy and paste the sections below:
Activity / Sub-activity / deliverable 1 (e.g., conference, training, TA, etc.)
Description (Max. 50 words)
Sub-Activity / Deliverable Type:
1. Main Approach (Target audience/ profile)
2. Main area of skill / approach or method covered: (if different from the overall for the project / activity)
3. Estimated number of participants: (If conference, course, etc.)
4. Target participating countries:
5. Location of the Activity (Country and City)
6. Participating Partners: (Other institutions that will help / take on the implementation – if different from the overall for the project / activity).
7. Implementing institution: (Describe which will be the institution – center or partner – leading the implementation of the activity)
8. Government / Civil Society client or counterpart(s) (if different from the overall for the project / activity)
Dates: (MM/DD/YYYY) Start: End:
Duration: (in days if course or conference)
Other Key Dates: Deliverables, etc.
4c. Team related to activities and deliverables
• Staff expertise and type of involvement with the Project / Activity
4d. Budget Projections
Project / Activity / Sub Activity
CLEAR Funds (%)
Other Donors (%)
Counterpart or Client (%)
Income from the activity (amount)
Support to XX Presidency's M&E System
International Conference on Country M&E Systems with Presidency
Sub- Activity 2
Sub- Activity 3
Form 3: Project/Activity Review Note for Programmatic Capacity Development Services/Learning Programs and Knowledge Products
(Recommended Document)
CLEAR Center:
Date:
[THIS TEMPLATE SHOULD BE USED FOR A PROJECT OR A CLUSTER OF A COHERENT SET OF ACTIVITIES THAT HAVE A COMMON IMMEDIATE OBJECTIVE AND/OR SET OF CLIENTS]
1. RESULTS
1a. Was the objective of the project met? (Max. 200 words)
If it was not met, please explain.
1b. Current indicator values
If there are differences with the targets, please explain.
2. IMPLEMENTATION
2a. Description of project / activity / sub-activities implementation
If there are differences between activity original concept and implementation, please also explain (Max. 200 words)
2b. Project effective start and completion date: (MM/DD/YYYY)
If there are differences between activity dates in the original concept and implementation, please explain (Max. 200 words)
2c. Timeline (or agenda)
Please provide the implementation timeline or agenda for the project / activity
2d. Participants or clients and profiles (Max. 200 words)
Please provide information on audience, based on actual participant information forms
If there are differences between original and actual audience, please explain differences and reason for change (Max. 100 words)
2e. Staff
• Please list the center’s staff and partners that participated in the project / activity and in which capacity
• If there are differences with the concept note, please explain differences and reason for change (Max. 100 words)
2f. Updated Budget
Project / Activity / Sub Activity
CLEAR Funds (%)
Other Donors (%)
Counterpart or Client (%)
Income from the activity (amount)
Support to XX Presidency's M&E System
International Conference on Country M&E Systems with Presidency
Sub- Activity 2
Sub- Activity 3
Explain differences between activity original expected budget and effective budget (Max. 200 words)
3. PERFORMANCE
3a. Participant/Client Feedback and results (Max. 200 words)
If a participant / client feedback form has been requested from the clients, please provide aggregated data in the Excel format attached.
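As a hypothetical illustration of how such aggregation might work (the attached Excel format itself is not specified here), the sketch below averages 1-to-5 ratings for a single question while excluding "no opinion" responses, represented as None, in line with the rating instructions used in the feedback forms.

```python
# Minimal sketch (assumed data representation): aggregate 1-to-5 feedback
# ratings for one question, skipping "no opinion" responses (None), which
# the rating scale treats as outside the 1-5 range.

def aggregate_ratings(responses):
    """Return (mean, count) over valid 1-5 ratings; None entries are excluded."""
    valid = [r for r in responses if r is not None]
    if not valid:
        return None, 0
    return sum(valid) / len(valid), len(valid)

# Hypothetical responses to one rating question:
mean_rating, n_valid = aggregate_ratings([5, 4, None, 3, 5])
```

Reporting the count of valid responses alongside the mean makes it clear how many participants actually expressed an opinion.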
4. LESSONS
4a. Problems encountered during implementation and actions taken to overcome them (Max. 200 words)
4b. Lessons learnt from the design and implementation of the project / activity (Max. 300 words)
5. ATTACHMENTS:
Original and effective participating CLEAR Staff CVs
Participants’ lists
Activity Materials (Agendas, Presentations, Deliverables, etc.)
Form 4: Participant Registration Form
REQUIRED (for workshops, conferences, courses)
Activity Name:
Activity Date:
Activity Location:
Dear Participant:
Please complete all questions on this form to register for [NAME OF ACTIVITY].
To answer the closed-ended questions please fill in the circle completely (●). If you wish to change an answer, fully erase it or draw an (X) over the unwanted mark and fill in the circle indicating your preferred answer. Please choose only one answer per question.
(REQUIRED QUESTIONS)
1. Your name
First/given name
Last/family name
2. Email and mailing addresses
3. Are you
〇 Male
〇 Female
4. What is your nationality?
5. In which country do you work?
6. Which of the following best describes your position (answer both type and level questions)?
a. Type
〇 Administrative
〇 Technical
〇 Managerial
〇 Political
〇 Other, please specify ____________________.
b. Level
〇 Junior Staff
〇 Mid-level Staff
〇 Senior Staff
〇 Top Leadership
〇 Other, please specify ____________________.
7. What type of organization do you currently work for?
〇 Government
〇 Academia/Training/Research
〇 Donor/Bilateral/Multilateral
〇 Civil Society/Non-government organization
〇 Private Company
〇 Other, please specify ____________________.
8. What is your main reason for taking this course?
〇 To enhance performance in current or planned work assignment
〇 To network and share information
〇 For professional interest and growth
〇 Other, please specify ____________________.
(OPTIONAL QUESTIONS AS THEY MAY HELP YOU PLAN YOUR ACTIVITY)
9. Please describe your current work responsibilities
10. Please describe how your work responsibilities are related to this course
11. How do you plan to use what you will learn from this course?
12. Please provide any other information that might be useful for us in planning this activity
THANK YOU!
Form 5: Participant Activity Feedback Form
Activity Name:
Activity Date:
Activity Location:
Dear Participant:
Please complete this questionnaire to help us improve the quality of our services in the future. Your responses — no matter how positive or negative — are valuable to us. Your responses are anonymous and confidential.
To answer the closed-ended questions please fill in the circle completely (●). If you wish to change an answer, fully erase it or draw an (X) over the unwanted mark and fill in the circle indicating your preferred answer. Please choose only one answer per question.
(REQUIRED QUESTIONS)
13. Which of the following best describes your main role in this activity?
〇 Participant
〇 Observer
〇 Resource person (organizer, presenter, facilitator, interpreter, administrative staff, etc.)
〇 Other, please specify ____________________.
14. How much of the activity were you able to attend?
〇 All of it (every day, all sessions)
〇 Most of it (most days and sessions)
〇 More than half
〇 Half or less
15. Are you
〇 Male
〇 Female
16. Which of the following best describes your position (answer both type and level questions)?
a. Type
〇 Administrative
〇 Technical
〇 Managerial
〇 Political
〇 Other, please specify ____________________.
b. Level
〇 Junior Staff
〇 Mid-level Staff
〇 Senior Staff
〇 Top Leadership
〇 Other, please specify ____________________.
17. What type of organization do you currently work for?
〇 Government
〇 Academia/Training/Research
〇 Donor/Bilateral/Multilateral
〇 Civil Society/Non-government organization
〇 Private Company
〇 Other, please specify ____________________.
18. What was your main reason for taking this training?
〇 To enhance performance in current or planned work assignment
〇 To network and share information
〇 For professional interest and growth
〇 Other, please specify ____________________.
Below, please rate each question on a scale of 1 to 5 by filling in the circle that best corresponds to your opinion.
If a question does not apply to you, or if you do not have enough information to express an opinion, fill in the last circle for the “no opinion” option.
(1 = Minimum, 5 = Maximum; last circle = No Opinion)
1. Overall quality of the activity 〇1 〇2 〇3 〇4 〇5 〇
2. Relevance of the activity to your current or planned work 〇1 〇2 〇3 〇4 〇5 〇
3. Increase in your knowledge/skills as a result of participating in the activity 〇1 〇2 〇3 〇4 〇5 〇
4. The extent to which you plan to apply the knowledge/skills gained to your current or planned work 〇1 〇2 〇3 〇4 〇5 〇
5. The extent to which the course materials were useful for learning 〇1 〇2 〇3 〇4 〇5 〇
6. The extent to which the structure (pacing and sequencing) of the activity was conducive to your learning 〇1 〇2 〇3 〇4 〇5 〇
7. The extent to which the activity provided an opportunity to develop networks for future collaboration/knowledge sharing 〇1 〇2 〇3 〇4 〇5 〇
8. What knowledge/skills from the activity do you plan to use on the job?
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
9. What type of support do you need to apply the newly acquired knowledge/skills to your current or future job?
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
10. What recommendations do you have for improving the activity?
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
11. Other comments:
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
THANK YOU!
Form 6: Individual Instructor Evaluation Form
Activity Name:
Activity Date:
Activity Location:
Session Title:
Dear Participant:
This questionnaire asks for your opinions regarding the instructor of this session. Your responses — no matter how positive or negative — are valuable to us. The feedback will help the instructor improve his/her session in the future. Your feedback is anonymous and confidential.
Below, please rate each question on a scale of 1 to 5 by filling in the circle that best corresponds to your opinion. If a question does not apply to you, or if you do not have enough information to express an opinion, fill in the last circle for the “no opinion” option.
Instructor:
How would you rate the instruction in this session on each of the following aspects?
(1 = Minimum, 5 = Maximum; last circle = No Opinion)
12. Extent to which the instructor was knowledgeable about the subject 〇1 〇2 〇3 〇4 〇5 〇
13. Extent to which the instructor was effective in sequencing and pacing the session 〇1 〇2 〇3 〇4 〇5 〇
14. Extent to which the instructor was effective in stimulating useful discussion 〇1 〇2 〇3 〇4 〇5 〇
15. Extent to which the instructor provided examples of how to apply the materials covered in the session to practical work 〇1 〇2 〇3 〇4 〇5 〇
16. Extent to which the instructor met the overall objectives of the session 〇1 〇2 〇3 〇4 〇5 〇
17. Please provide any other comments you might have
THANK YOU!
Form 7: Technical Assistance and Advisory Services Evaluation Form
REQUIRED (implemented 3-9 months after the service)
Service Name:
Service Date:
Name of Client:
Dear [XXX]
[NAME OF CENTER] provided [NAME/DESCRIPTION OF SERVICE] to [NAME OF ORGANIZATION] for [PURPOSE] from [DATES]
We would like to obtain your feedback about the quality, relevance, and usefulness of our services. Your feedback will help us improve the quality of our services in the future. Your responses — no matter how positive or negative — are valuable to us.
To answer the closed-ended questions please fill in the circle completely (●). If you wish to change an answer, fully erase it or draw an (X) over the unwanted mark and fill in the circle indicating your preferred answer. Please choose only one answer per question.
1. What were the objectives of hiring the Center’s services? Please summarize top 1-3 objectives below
Objective 1: _______________________________________________________________________________________________
Objective 2: _______________________________________________________________________________________________
Objective 3: _______________________________________________________________________________________________
2. Were the objectives related to (check all that apply)
〇 Improving the skills or knowledge of your organization’s staff
〇 Designing or improving your organization’s processes and/or procedures
〇 Designing or improving the methods, approaches, and tools of your organization’s work
〇 Designing or improving how your organization works with other organizations and institutions
Below, please rate each question on a scale of 1 to 5 by filling in the circle that best corresponds to your opinion.
If a question does not apply to you, or if you do not have enough information to express an opinion, fill in the last circle for the “no opinion” option.
3. The extent to which the services met objective #1 〇1 〇2 〇3 〇4 〇5 〇
4. The extent to which the services met objective #2 〇1 〇2 〇3 〇4 〇5 〇
5. The extent to which the services met objective #3 〇1 〇2 〇3 〇4 〇5 〇
6. The overall quality of the service 〇1 〇2 〇3 〇4 〇5 〇
7. The extent to which your organization’s staff use the knowledge/skills they gained as a result of the service 〇1 〇2 〇3 〇4 〇5 〇
8. The extent to which your organization’s functions improved as a result of the service 〇1 〇2 〇3 〇4 〇5 〇
9. The extent to which your organization’s work methods/approaches/tools improved as a result of the service 〇1 〇2 〇3 〇4 〇5 〇
10. The extent to which your organization’s coordination/work with other organizations/institutions improved as a result of the service 〇1 〇2 〇3 〇4 〇5 〇
11. Would you consider contracting the Center’s services again in the future?
〇 Yes
〇 No
〇 Maybe
12. Would you consider recommending the Center’s services to another organization?
〇 Yes
〇 No
〇 Maybe
13. Please provide any other comments you might have
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
THANK YOU!
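For centers tabulating responses to forms like the one above, the scoring rule implied by the instructions (a 1-5 scale with a separate “no opinion” circle treated as missing data rather than as a zero) can be sketched in Python. The question keys, sample data and the NO_OPINION sentinel below are illustrative assumptions, not part of the form:

```python
# Hypothetical scoring sketch for the 1-5 rating questions on the
# evaluation forms; "no opinion" answers are treated as missing data.
NO_OPINION = None  # the sixth, unnumbered circle on the form

def summarize(responses):
    """Return the mean rating per question, excluding 'no opinion' answers."""
    summary = {}
    for question, ratings in responses.items():
        valid = [r for r in ratings if r is not NO_OPINION]
        summary[question] = round(sum(valid) / len(valid), 2) if valid else None
    return summary

# Three illustrative respondents; one had no opinion on overall quality (Q6).
responses = {
    "Q3_met_objective_1": [4, 5, 3],
    "Q6_overall_quality": [5, NO_OPINION, 4],
}
print(summarize(responses))  # {'Q3_met_objective_1': 4.0, 'Q6_overall_quality': 4.5}
```

Excluding “no opinion” answers from the denominator keeps a few abstentions from dragging down the mean, which matches the form’s intent of separating abstention from a low rating.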
Form 8: Participant Evaluation Form (Tracer)
REQUIRED (for workshops, conferences, courses)
Activity Name:
Activity Date:
Activity Location:
Dear Participant:
It has been [3-12 months] since you participated in [name of the activity] on [date] and we would like your feedback on the practical value of the [training/course/workshop] you attended. Your feedback is important to us.
Please complete this questionnaire to help us improve our [courses/workshops] in the future. We look forward to your candid opinions. Your responses are confidential and anonymous.
To answer the closed-ended questions, please fill in the circle completely (●). If you wish to change an answer, fully erase it or draw an (X) over the unwanted mark and fill in the circle indicating your preferred answer. Please choose only one answer per question.
1. How much of the activity did you attend?
〇 All of it (every day, all sessions)
〇 Most of it (most days and sessions)
〇 More than half
〇 Half or less
2. Are you
〇 Male
〇 Female
3. Which of the following best describes your position?
〇 Administrative
〇 Junior Staff
〇 Technical
〇 Mid-level Staff
〇 Managerial
〇 Senior Staff
〇 Political
〇 Top Leadership
〇 Other, please specify ____________________.
4. What type of organization do you currently work for?
〇 Government
〇 Academia/Training/Research
〇 Donor/Bilateral/Multilateral
〇 Civil Society/Non-government organization
〇 Private Company
〇 Other, please specify ____________________.
5. How many colleagues or others with whom you work closely attended this activity?
〇 0
〇 1-2
〇 3-4
〇 5-10
〇 Over 10
6. In which country do you work? ___________________________________________
7. Thinking about the activity you attended, do you use the knowledge and skills you acquired in any of the following areas?
〇 Teaching
〇 Conducting evaluations and/or research
〇 Implementing monitoring and evaluation strategies in your organization
〇 Designing new projects or programs
〇 Integrating monitoring and evaluation into the implementation of your projects or programs
〇 Raising public awareness of monitoring and evaluation
〇 Using monitoring and evaluation strategies/methods for accountability
〇 Other, please specify ____________________.
Below, please rate each question on a scale of 1 (minimum) to 5 (maximum) by filling in the circle that best corresponds to your opinion.
If a question does not apply to you, or if you do not have enough information to express an opinion, fill in the last circle for the “no opinion” option.
To what extent did the activity
8. Raise your awareness and understanding of the subject of the training/course 〇1 〇2 〇3 〇4 〇5 〇
9. Provide you with knowledge and skills relevant to your work 〇1 〇2 〇3 〇4 〇5 〇
10. Provide you with an understanding of how to apply the knowledge and skills you gained to your work 〇1 〇2 〇3 〇4 〇5 〇
11. Help you improve your work 〇1 〇2 〇3 〇4 〇5 〇
12. Influence changes in your organization’s practices and procedures? 〇1 〇2 〇3 〇4 〇5 〇
13. Help you develop contacts, partnerships, or peer-to-peer learning relationships? 〇1 〇2 〇3 〇4 〇5 〇
14. To what extent was the activity useful overall for your work? 〇1 〇2 〇3 〇4 〇5 〇
15. Have you participated in any follow-up activities with the Center?
〇 Yes
〇 No
16. Please provide any other comments you might have
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
_____________________________________________________________________________________________________________
THANK YOU!
BIBLIOGRAPHY
ADB. 2008. Effectiveness of ADB’s CD Assistance: How to Get Institutions Right? Manila: ADB.
AED Center for Community-Based Health Strategies. 2002. Annie E. Casey Foundation Community Health Summit Toolkit: Process Evaluation as a Quality Assurance Tool. (http://www.worldbridgeresearch.com/files/Process_Evaluation.pdf).
AusAID. 2012a. AusAID’s Quality Systems for Effective Aid. Canberra: AusAID.
AusAID. 2012b. Policy: Performance Management and Evaluation. Canberra: AusAID.
AusAID. 2011. ODE Performance Framework. Canberra: AusAID.
Consortium for International Development in Education. 2012. Assessing Sustainability. Quebec: CIDE.
DfID. 2012. Quality Assurance (QA) Guidelines. London: DfID.
DfID. 2011. Learning Strategy 2011-15. London: DfID.
EFQM. 2012. An Overview of the EFQM Excellence Model. Brussels: EFQM.
EC. 2008. Managing Quality Assurance and Quality Control. http://ec.europa.eu/regional_policy/sources/docgener/evaluation/evalsed/guide/designing_implementing/managing_evaluations/quality_en.htm
ESS. 2010. Quality Glossary 2010. Developed by Unit B1 "Quality; Classifications", Eurostat. http://www.eqavet.eu/qa/gns/glossary/q/quality-assurance.aspx
Ogiogio, G. 2005. Measuring Performance of Interventions in Capacity Building: Some Fundamentals. Occasional Paper No. 4. Harare: The African Capacity Building Foundation.
GTZ. 2007. “GTZ Services for Rural Development”, in Bulletin, No. 15, pp. 1-40. Eschborn: GTZ.
GTZ. 2001. Reform of Social Services: A Conceptual Framework for Orientation. Eschborn: GTZ.
International Organization for Standardization. 1999. Quality Management – Guidelines for Training. Geneva: ISO.
Jameel Poverty Action Lab. 2012. Translating Research into Action – Event Planning. New Delhi: J-PAL.
Niederberger, A.Q. and L. Yiu. 2003. “JIQ Discussion Platform: Management Systems for Capacity Building”, in Joint Implementation Quarterly, October 2003, pp. 8-9.
Smithers, N. 2011. The Importance of Stakeholder Ownership for Capacity Development Results. Washington, DC: World Bank.
OECD and Development Assistance Committee. 2008. Survey on Monitoring the Paris Declaration: Making Aid More Effective by 2010. Paris: OECD.
OECD and Development Assistance Committee. 2006. The Challenge of Capacity Development: Working Towards Good Practice. Paris: OECD.
OECD and Development Assistance Committee. 1991. Principles for Evaluation of Development Assistance. Paris: OECD.
OECD. 2005. Paris Declaration on Aid Effectiveness. Paris: OECD.
OECD. 2000. The Glossary of Evaluation and Results Based Management (RBM) Terms. Paris: OECD.
OECD. 2010. Glossary of Key Terms in Evaluation and Results Based Management. Paris: OECD.
“Quality Assurance.” Merriam-Webster.com. Merriam-Webster, n.d. Web. June 24, 2014. http://www.merriam-webster.com/dictionary/quality%20assurance
Saner, R. and L. Yiu. 2007. The Need for a Verifiable and Robust Training Management System for Capacity Building Projects and TRTA within the Trade/WTO Context. (http://www.wto.org/english/tratop_e/devel_e/a4t_e/csend_saner_e.pdf).
Taylor, P. and P. Clarke. 2008. Capacity for a Change. Sussex: IDS.
UNDP. 2014. “Capacity Development.” Our Work. Web. June 24, 2014. http://www.undp.org/content/undp/en/home/ourwork/capacitybuilding/overview/
UNDP. 2002. Handbook on Planning, Monitoring and Evaluating for Development Results. New York: UNDP.
UN-HABITAT. 2007. Training and Capacity: Needs Assessment – Impact Evaluation – Strategy Development in an Urban Context. Nairobi: UN-HABITAT.
Wikipedia. 2012. “Database.” Searched for ‘database management system’ on 12 August 2012. http://en.wikipedia.org/wiki/Database.
World Bank. 2012a. A Guide to Assessing Needs: Essential Tools for Collecting Information, Making Decisions and Achieving Development Results. Washington, DC: World Bank.
World Bank. 2012b. Leveraging Knowledge into the Africa Region’s Quality Assurance Process. http://siteresources.worldbank.org/AFRICAEXT/Resources/qapfin.pdf.
World Bank. 2008. Using Training to Build Capacity for Development. Operations Evaluation Department. Washington, DC: World Bank.
World Bank. 2006a. Review of Development Effectiveness. Independent Evaluation Group. Washington, DC: World Bank.
World Bank. 2006b. Implementation Completion and Results Report: Guidelines. Washington, DC: World Bank.
World Bank. 2005a. Capacity Building in Africa: An OED Evaluation of World Bank Support. Operations Evaluation Department. Washington, DC: World Bank.
World Bank. 2005b. “World Bank Support for Capacity Building in Africa”, in OED Reach, March 2005. Operations Evaluation Department. Washington, DC: World Bank.
World Bank. 2004. Quality at Entry in FY03 (QEA6): Annexes to the QAG Assessment, January 14, 2004, Vol. 1. Washington, DC: World Bank.
World Bank Institute. 2010. Institutional Capacities and Their Contributing Characteristics: Capacity Development Resource for Institutional Diagnostics, Program Design and Results Management. Washington, DC: World Bank.
World Bank Institute. 2009. World Bank Institute Renewal Strategy: An Emerging Direction. Washington, DC: World Bank.
World Bank Institute. 2006. Developing Capacities in Countries: WBI Annual Report 2006. Washington, DC: World Bank.
ANNEX 1- QUALITY ASSURANCE TOOLS AND FRAMEWORKS OF MULTILATERAL AND BILATERAL DEVELOPMENT AGENCIES
OECD
In 1991, the OECD Development Assistance Committee (DAC) established five principles for evaluating development assistance (relevance, effectiveness, efficiency, impact and sustainability), which the development community at large now applies as a guiding framework for evaluations and quality assurance processes.
BOX 2: OECD DAC’s five evaluation principles
• Relevance: The aid activity is suited to the priorities and policies of the target group, recipient and donor.
Useful questions: To what extent are the objectives of the program still valid? Are the activities and outputs of the program consistent with the overall goal and the attainment of its objectives?
• Effectiveness: The aid activity attains its objectives.
Useful questions: To what extent were objectives achieved or are likely to be achieved? What were the major factors influencing the achievement or non-achievement of the objectives?
• Efficiency: The least costly resources are used to achieve the desired results.
Useful questions: Were activities cost-efficient? Were objectives achieved on time? Was the program or project implemented in the most efficient way compared to alternatives?
• Impact: The direct, indirect and intended or unintended positive and negative changes produced by the development intervention.
Useful questions: What has happened as a result of the program or project? What real difference has the activity made to the beneficiaries? How many people have been affected?
• Sustainability: The benefits of the activity are likely to continue after donor funding ceases.
Useful questions: To what extent did the benefits of a program or project continue after donor funding ceased? What were the major factors that influenced the achievement or non-achievement of sustainability?

GIZ’s Quality Management Framework
GIZ applies OECD DAC’s five evaluation criteria to all its evaluations and has defined a set of quality features and questions that apply across all areas of intervention (see Box 3).
According to GIZ, services are of good quality if they contribute to the achievement of overarching objectives, are managed efficiently, and are delivered economically, in a results-oriented and durable manner. These quality criteria are referred to as success factors of GIZ’s CD interventions and are used as a framework for reporting on their status.
GIZ provides a useful framework for assessing the quality of its most widely offered CD service, namely advisory services. The quality of a service is defined as client satisfaction with a product or service, and the objective of quality management is to examine whether this is achieved. GIZ assesses the quality of its advisory services along two pillars: (i) product-based quality; and (ii) user-based quality.
Product-based quality views the quality of a service from the producer’s position: it relates to the capability and willingness of the service supplier to assure a certain level of quality at all times. User-based quality examines the quality of a service from the customer’s point of view. The quality of a service is derived by comparing customers’ expectations with their perception of the actual service provided. Expectations are determined by individual needs, past experience and various forms of communication related to the services. The determinants of customers’ perceptions of quality in service supply are defined in terms of the reliability, responsiveness, sovereignty and empathy of the service provider. The user-based assessment is crucial for GIZ and is captured in the following equation: satisfaction = perception – expectation (GTZ, 2001).
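The user-based equation can be read as a simple gap score computed per quality determinant. As a minimal illustrative sketch (the determinant names and scores below are assumptions, not GIZ data):

```python
# Sketch of GTZ's user-based quality equation applied per determinant:
# satisfaction = perception - expectation (GTZ, 2001).
def satisfaction_scores(expectations, perceptions):
    """Gap score per determinant; negative values flag unmet expectations."""
    return {d: perceptions[d] - expectations[d] for d in expectations}

expectations = {"reliability": 5, "responsiveness": 4, "empathy": 3}
perceptions = {"reliability": 4, "responsiveness": 4, "empathy": 4}
print(satisfaction_scores(expectations, perceptions))
# {'reliability': -1, 'responsiveness': 0, 'empathy': 1}
```

A negative gap (perception below expectation) signals where a service provider failed to meet client expectations, even if the absolute perception score is high.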
In addition, GIZ examines service quality at three levels, which also appear later in the conceptual framework: service potential, service process and service results. The first level assesses local capabilities and preconditions for delivering the service. The second examines how the service is implemented (timeliness, responsiveness, quality of methodology and approach). The third, service results, explores the results achieved and whether the set objectives were realized effectively and efficiently (e.g., is the customer satisfied, and was optimal use made of the available resources?) (GTZ, 2001).
Box 3: GIZ’s quality evaluation criteria
1. Values geared to sustainability
2. Economic use of resources
3. Efficient steering
4. Compliance with rules and regulations
5. Positive results achieved
Source: GTZ, 2007; www.gtz.de/en/unternehmen/31523.htm.

AusAID and ADB’s Quality Assurance Frameworks
AusAID has adopted OECD DAC’s five criteria for its quality-at-entry, quality-at-implementation and quality-at-exit assessments of all monitored activities, and has introduced additional criteria, defined in Box 4 below.
ADB proposes nine quality standards for assessing the quality at entry of its CD interventions (Box 5), which relate to the above concepts of relevance, sustainability, effectiveness and efficiency. In addition, two other dimensions are considered crucial to ensure quality at entry: (i) adequate diagnostic baseline assessments; and (ii) cooperation and harmonization with other development agencies.
Box 4: AusAID’s Quality Assurance Framework
Relevance: Why are we doing this? Based on the needs assessment, the strategy/program/initiative is the most appropriate way to meet high priority goals that Australia shares with its development partners within a given context.
Effectiveness: Will it work? Do we know where we want to be and what is our objective? The strategy/program/initiative is meeting or will meet its objectives, and is continually managing risk.
Efficiency: How will we do it? The resources allocated by AusAID and its partners are appropriate to the objectives and context and are achieving intended outputs.
M&E: How will we know? An appropriate system provides sufficient information and is being used to assess progress towards meeting objectives.
Impact: Are we contributing to the achievement of long-term goals and overall strategy? An assessment of the positive and/or negative changes (directly or indirectly, intended or unintended) produced by the strategy/program/initiative.
Sustainability: Will benefits last? Significant benefits will endure after AusAID’s contribution has ceased, with due consideration given to partner systems, stakeholder ownership and plans for phase-out.
Analysis and learning: How well have we thought this through? The strategy/program/initiative is based on sound technical analysis and continuous learning.
Gender equality: How will we achieve gender equality? The strategy/program/initiative incorporates appropriate and effective strategies to advance gender equality and promote the empowerment of women and girls.
Source: AusAID, 2012b.
AusAID and ADB were among the first development agencies to underline that assessing quality at entry and at exit against a standardized set of quality criteria is insufficient to guarantee the quality of the results achieved. One must also take into account quality at implementation, which provides key contextual information on which processes, tools, methods and approaches were successful (and why), and on any corrective actions required during the implementation phase.
ADB’s quality at implementation examines three different aspects: (i) whether there is sufficient and qualified staff for implementation and supervision; (ii) the degree of flexibility during implementation and supervision; and (iii) whether there were any substantial delays in the implementation process (ADB, 2008).
AusAID, by contrast, conducts annual implementation assessments against the criteria of relevance, effectiveness, efficiency, sustainability, gender, and monitoring and evaluation, based on feedback processes. Furthermore, AusAID also assesses current cross-cutting implementation issues and reflects on four key questions to improve performance in the next implementation phase: (i) Which actions do we need to take to improve the initiative? (ii) How will the change happen? (iii) Who will be responsible for the improvement? (iv) When will the improvement occur? (AusAID, 2012a).
Finally, UNDP and the World Bank have integrated additional dimensions into their QA frameworks: (i) the quality of their knowledge products; and (ii) the use of new knowledge and skills. Three common knowledge-quality criteria are identified in the literature review: accuracy, consistency and adequacy. The first criterion refers to whether: (i) the knowledge targeted reflects clients’ needs and priorities identified in needs assessments or stakeholder consultations; (ii) lessons learned from previous activities have been taken into account and incorporated; and (iii) a review process was undertaken by internal and/or external experts and other relevant stakeholders, such as management. The second criterion examines whether the learning content, materials and approach used are adapted to the local context and needs. The third criterion reflects how well the learning content and materials are clearly written and easily understood (UNDP 2002; World Bank 2004, 2012b).

Box 5: Quality at Entry of ADB’s Capacity Development Interventions
1. Clear results framework
2. Strategic direction with realistic CD objectives
3. Adequate diagnostic baseline assessments at all CD levels (individual, organizational, network and contextual)
4. Long-term continuity to institutionalize CD, with careful phasing, sequencing and an exit strategy
5. Appropriate mix of modalities
6. Mainstreaming project implementation management unit activities into target agencies’ normal operations
7. Adequate staff, time, skills and financial resources
8. Inclusive participatory approach with strong commitment and ownership by target agencies
9. Cooperation and harmonization with other development agencies
Source: ADB, 2008.
In sum, the literature review of QA processes and tools applied by OECD, GIZ, ADB, AusAID, UNDP and the World Bank reveals six common international quality standards applied by the development community: relevance, quality of content, effectiveness, efficiency, impact and sustainability. Moreover, the literature review illustrates how these six quality standards are assessed during four different types of organizational processes: (i) when an organization determines and plans which service or product to implement; (ii) when an organization designs its product or service; (iii) during implementation of CD activities; and (iv) during monitoring and evaluation (M&E) of the outputs and outcomes achieved, to improve the planning and design of future CD interventions.
Based on the above findings, the following QA questions were identified and proposed to CLEAR’s regional centers and participating bilateral agencies to collect information on their QA processes and tools:
• How are decisions made regarding which service or product to produce?
• How do you determine the objective and approach (processes and methods) of your service/product?
• How do you ensure the quality of the content of your service/product? Which quality standards are used and do you have a review process?
• How do you monitor and evaluate whether your service or product is achieving the desired result? Which information is gathered and from whom?
• Have you ever taken actions to enhance the quality of your products and services?
Annex 1-A: Quality Assurance of Trainings
CD generally aims to promote learning, and training programs are a common tool used to transfer knowledge and newly required skills. However, there is growing recognition that training programs do not always lead to improved organizational performance, because of the mistaken assumption that individual learning is sufficient and that no additional resources are needed to translate individual learning into positive organizational-level outcomes (Niederberger and Yiu, 2003; Taylor and Clarke, 2008).
A 2008 World Bank Independent Evaluation Group study on the use of training to build local capacity found that, about half of the time, training programs do not lead to sustained improvements in organizational performance, because of insufficient targeting of training to organizational needs and insufficient managerial support for applying newly learned skills. Training assessments were based on the number of people trained and the amount of money spent, not on whether the organizational setting was conducive to learning and encouraged staff to apply new skills (World Bank, 2005 and 2008; Taylor and Clarke, 2008).
In the past decade, quality assurance frameworks for training programs have shifted their focus to what trainees are actually learning and whether training program objectives are integrated into broader organizational strategic objectives (Niederberger and Yiu, 2003). An illustration is the quality training standard guidelines (ISO 10015) developed in 1999 by the Swiss-based International Organization for Standardization.
The objective of the ISO framework is to provide step-by-step guidance on how to identify and analyze training needs, design and plan the training, provide the training, evaluate training outcomes, and monitor and improve the training process in order to achieve organizational objectives (ISO, 1999).
STEP 1: Conduct an organizational performance gap analysis exploring organizational gaps and contextual factors influencing the organization’s underperformance.
Example: Illustration of a problem-tree analysis
Problem: The Ministry of Trade is underperforming. Why?
Possible causes: inadequate organizational structure; ineffective leadership; recruitment of staff with the wrong skills; inadequate competencies and skills of current civil servants; remuneration that is not market-competitive; inadequate motivation (overload).
Follow-up question: Which new skills are required for the Ministry of Trade to perform better?
STEP 2: Define the skills needs of the organization
Compare existing organizational competencies with those required, through interviews, questionnaires with employees, observations, group discussions and inputs from experts.
STEP 3: Identify which method would be appropriate to close the competence gap (training; e-forums; workshops; new organizational procedures/policies; staff turnover)
STEP 4: Define training needs
Document the objectives and the expected outcomes of the training.
STEP 5: Training Plan
Clear understanding of the organizational needs, the training requirements and training objectives that define what the trainees need to do differently as a result of the training.
STEP 6: Select training provider
Any internal or external training provider should be subject to critical examination before being selected to provide the training.
STEP 7: Provision of the training
Determine training support from the organization (relevant tools, equipment, documentation, software or accommodation to the trainee).
Are relevant and adequate opportunities given to the trainee to apply the skills/knowledge?
Is feedback requested from the trainee, the trainer and managers?
STEP 8: Evaluation of training outcomes
The purpose of the evaluation is to confirm that organizational and training objectives have been met.
The inputs for the evaluation of training outcomes are the specifications for training needs and for the training plan and the records of training.
Evaluations should be conducted on both a short-term and a long-term basis (e.g., which knowledge was acquired, whether trainees’ on-the-job performance has improved, and the impact on the trainees’ organization).
The evaluation report should include: training needs; evaluation criteria and a description of sources; methods and schedule for the evaluation; analysis of the data collected and interpretation of results; review of training costs; and conclusions and recommendations.
STEP 9: Monitoring and improving the training process
Based on the information in the evaluation report, determine whether procedures and requirements are met. If not, determine the corrective actions that may be required to improve the training process, or develop other non-training CD solutions (ISO, 1999; Saner and Yiu, 2007).
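The nine ISO 10015 steps form an ordered cycle, so one minimal way to track an organization’s progress through it is as an ordered checklist. The step labels below are condensed from the text, and the tracking helper is purely illustrative, not part of the standard:

```python
# Illustrative sketch: the ISO 10015 training cycle as an ordered checklist,
# with a helper that reports the next step still to be completed.
ISO_10015_STEPS = [
    "Organizational performance gap analysis",
    "Define the skills needs of the organization",
    "Identify method to close the competence gap",
    "Define training needs",
    "Training plan",
    "Select training provider",
    "Provision of the training",
    "Evaluation of training outcomes",
    "Monitoring and improving the training process",
]

def next_step(completed):
    """Return the first step not yet completed, or None if the cycle is done."""
    for step in ISO_10015_STEPS:
        if step not in completed:
            return step
    return None

done = set(ISO_10015_STEPS[:4])  # steps 1-4 completed
print(next_step(done))  # Training plan
```

Because step 9 feeds corrective actions back into the process, a real tracker would loop: once monitoring identifies gaps, earlier steps are marked incomplete and the cycle repeats.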
Annex 1-B: Quality Assurance of Advisory Services
Instruments used in GIZ’s quality management processes:
Staff survey:
The staff survey forms a key component of the EFQM model. It is implemented at GIZ annually and involves all staff members in Germany and abroad, as well as selected national staff. The results of the staff survey are a key input for self-assessment.
Client survey (also includes survey of counterpart institutions):
The client or customer survey is key to quality management. It yields numerous pointers for improvement measures and is carried out through interviews or questionnaires. In the medium term, aggregate assessments of clients and partners at country level are provided by GIZ’s eVAL software.
Self-assessment:
GIZ also conducts self-assessments, highlighting both the strengths of the organization and potential improvements it might make. Self-assessment also forms the basis for the preparation of action plans for improvement and their implementation.
External assessment (forthcoming):
External assessment comprises a review of the self-assessment process and is conducted by EFQM certified assessors. For each criterion the team of assessors agrees to award a certain number of points on the basis of the EFQM rating scale. The team of assessors may also conduct site visits to permit more in-depth assessment. The assessors' report documents what they see as the organisation's strengths and potential for improvement. The external assessment provides the assessed business with numerous pointers on possible improvements, as seen from the perspective of specialist EFQM external assessors.
Application of EFQM in GIZ projects and programs:
The EFQM model is also applied in technical assistance projects and programs; thus far it has been applied in health and economic projects and programs. It is used both to analyze and assess projects (for instance within the scope of a Project Progress Review) and as a quality management model for partner institutions. The instruments used are generally staff and client surveys.
Key to the successful application of the EFQM model is support from management. Unless and until such support is forthcoming, EFQM is not applied. Key to the success of the EFQM model is the perception on the part of the workforce and the management team that the EFQM model constitutes a systemic framework in which all change and improvement processes have their place. EFQM then ceases to be seen as an add-on or one-off activity to be implemented laboriously once a year, and comes to be understood as a permanent process of improvement.
Source: GTZ 2007, p. 19; http://www.giz.de/en/SID-3370D071-4C8739F6/aboutgiz/quality_management.html
Annex 1-C: Quality Assurance of Evaluation Processes and Reports
AusAID’s Performance Assessment Framework (PAF)
AusAID’s Office of Development Effectiveness (ODE) uses the PAF to examine whether ODE products and services in general are achieving the desired impact. The PAF seeks to answer two sets of performance questions: the first relates to achievements against intended outcomes, including the application and use of the knowledge provided in ODE trainings and workshops; the second, a foundational principle, concerns how effectively ODE produces credible, timely and accessible information.
The following quality standards are used when assessing the use of knowledge provided by ODE trainings and the quality of ODE information.
1. How effectively does ODE influence senior AusAID staff/Senior WoG staff/PEPD to take up recommendations, key messages and advice, to inform systems, policies and programs?
Highly Effective
Senior managers and PEPD see ODE products as crucial and highly relevant sources of information.
Recommendations are typically acted upon
Very Effective
Program designs and strategies reflect ODE findings
ODE materials formally referenced in policy/design documents
AusAID processes and guidelines created or revised as the result of ODE input
New resources (new funding, new positions created, new programs initiated, new research commissioned) allocated
Other decisions made as the result of ODE briefing and input into other key decision-making documents
Effective/Adequate
Management provides responses to major pieces of evaluation and analysis
Senior AusAID staff/Senior WoG staff/PEPD routinely accept input from ODE on common areas of work
Poor/Detrimental
Senior AusAID staff/Senior WoG staff/PEPD consider ODE’s analysis irrelevant and do not accept or engage with major pieces of work.
Poorly received recommendations and analysis diminish ODE’s reputation and standing
2. How effectively did ODE produce credible, timely and accessible information?
Highly Effective
ODE’s audience (inside and outside of AusAID) significantly expands [visits to website used as proxy]
ODE is recognised as a ‘go to’ source of relevant information relating to aid and development effectiveness.
Peer review scores of ODE products are very high [target scores TBD]
Very Effective
Senior AusAID staff/ Senior WoG staff/ PEPD see ODE’s forward work plan as relevant to agency needs.
Final reports for ODE evaluations and analysis are published within X months of finalising fieldwork
Effective/Adequate
All final reports from major pieces of analysis are published in the public domain.
Final reports from major pieces of analysis are accompanied by short, stand-alone summaries.
Peer reviewers consistently rate ODE’s evaluations to be credible and robust.
Poor/Detrimental
Senior staff consider ODE forward work plan irrelevant to agency needs
ODE audiences and peer reviewers consider product quality to have declined
Source: AusAID, 2011, pp. 1-2.
DfID’s Quality Assurance (QA) Guidelines
DfID has established quality assurance guidelines at entry and at exit, based on the OECD DAC’s five quality standards for development evaluation (relevance, effectiveness, efficiency, impact, and sustainability), to enhance the quality of its evaluation processes and products. Quality at entry and quality at exit focus mainly on the rationale, scope, context, evaluation framework, and data collection of the evaluation process and report.
The following paragraphs outline the quality-at-entry and quality-at-exit criteria applied by DfID to ensure that its evaluation products meet internationally agreed standards. These standards are assessed by an independent evaluation expert in DfID and in the partner institution, and by technical experts on DfID’s steering/advisory group.
Quality at Entry:
1. Are the rationale, objectives, scope and context of the evaluation well defined and consistently stated?
2. Is there an analytical framework? Are the evaluation questions clearly identified, related to OECD DAC evaluation criteria, sufficiently focused and related to the intended purpose of the evaluation?
3. Are the expected data sources established, and do they allow for multiple lines of enquiry? Are the techniques for data collection appropriate and will they allow for a mix of qualitative and quantitative disaggregated data?
4. Do the methods address issues of impartiality, propriety, inclusion and ethics?
5. Does it look like the evaluation will lead to a clear delineation of evaluation findings, evidence-based conclusions, lessons to be learned, and recommendations that can be implemented?
6. Are expectations realistic, given the time and resources that will be allocated?
7. Does the evaluation design identify challenges and risks, and does it make appropriate proposals for dealing with these?
8. Is the evaluation designed to meet the information and decision-making needs of the intended users and other stakeholders?
9. To what extent have the Paris Declaration Principles been incorporated into the design of the evaluation?
10. Have the main cross-cutting issues of gender, poverty, social exclusion, human rights, HIV/AIDS and the environment been incorporated in the analytical framework? Do the questions reflect a sound understanding of the political economy?
11. Are the proposed approach and management arrangements appropriate?
12. Is there a Communications and Dissemination Plan and will it enable a transparent process that engages and meets the needs of all users, including primary stakeholders?
Quality at Exit:
1. Is there a clear rationale for why the study is being done, why now and who it is for?
2. Does the report describe the scope and coverage of the evaluation? Is the rationale of the intervention or policy clear?
3. Are the policy, development and institutional context of the intervention clearly assessed, including political economy, poverty, gender, environment and rights issues?
4. Is the evaluation framework established and was it appropriate, ensuring that diverse views were heard?
5. Was the data collected sufficiently disaggregated to enable diverse views to be reflected? Was it collected in an appropriate manner and was the information sufficiently triangulated?
The European Commission’s (EC) quality guidelines for its evaluation processes and reports
A distinction is made between quality assurance and quality control. Quality assurance examines the process through which the evaluation activities are performed, whereas quality control assesses the quality of the products of the evaluation process.
The EC examines the quality of its evaluation processes and reports at three different stages (at entry, at implementation and at exit). The quality of an evaluation process is determined by:
• the quality of the planning and design phase, including the commissioning of the evaluation;
• the quality of the implementation of the evaluation report;
• the quality of the Monitoring and Evaluation (M&E) system and of the available data.
Quality Assurance
1. Well-drawn terms of reference (TOR)
2. Transparent and sound tender selection process
3. Coherent and evaluable objectives
4. Open process: effective dialogue and feedback with stakeholders during the design of the evaluation and discussion of results
5. Good management and co-ordination by the evaluation team
6. Effective dissemination of reports/outputs to the steering committee and policy/programme managers
7. Effective dissemination to stakeholders
Quality Control
1. Meeting needs as laid out in the TOR
2. Relevant scope and coverage, where the rationale has been carefully studied
3. Methods and tools that make use of existing research and analyses
4. Defensible design, appropriate and adequate for obtaining the results required to answer the main evaluative questions
5. Reliability of the data has been checked
6. Sound analysis, with quantitative and qualitative data analysed in accordance with established conventions
7. Credible results that relate to the analysis and data
8. Impartial conclusions showing no bias and demonstrating sound judgement
9. A clear report that describes the context and goal, as well as the organisation and results of the programme, and is easily understood
10. Useful recommendations that are relevant to stakeholders and detailed enough to be implemented
ANNEX 2 – CLEAR CAPACITY DEVELOPMENT ACTIVITIES
Activity Types
Definitions
Outputs of activities
Training (Learning)
Delivery of learning events (F2F and electronic media), such as courses, either directly or with national, regional or global partners. Could involve direct delivery by centers and/or co-sponsorship with others (e.g., CSOs) who are leading the training.
• F2F courses, seminars, workshops
• E-learning courses
• Webinars, short videos on a topic (e.g., logic models)
• Conferences (for learning)
Knowledge Exchange (Sharing)
Facilitate and support exchanges of information among practitioners on global, regional, and local M&E topics and issues. Knowledge Exchange is differentiated from Training in that the Knowledge Exchange participants have a primary role in knowledge flows, whereas in Training, the knowledge flow comes mostly from facilitator(s)/ trainer(s)/ presenter(s) to participants.
• Roundtables, conferences, forums
• Study tours, visits, etc., where exchanging information and lessons is key (emphasis is on participant sharing more than facilitator/trainer/presenter-led exchanges)
• South-south learning, exchanges
Technical Assistance and Organization Capacity Building
(Advisory / Development / Maintenance)
Provide technical assistance (typically involving advisory services, development support, and/or maintenance) to help organizations (national, sub-national, government, civil society, etc.) build their M&E capacity across a range of areas. These technical assistance areas could include, but are not restricted to, M&E planning, technology (e.g., M&E systems), training/course content development, delivery of learning and knowledge events in partnership with these organizations, and strategic planning.
• TA support to government agencies (e.g., South Africa Presidency)
• TA support to CSOs building out M&E functions
• Strategic planning
Evaluations, Assessments, Project Advisory
Conduct evaluations or other assessments directly and/or in partnership with others. Assist with design of M&E aspects of projects, programs or policies under development. Advise on choices on methods.
• Evaluations
• Needs assessments
• Development of logic models, indicator plans, surveys, etc.
Diagnostics
Work with partners to diagnose and advise on M&E systems, policies, methods in place or to be developed, etc. This work will often (but not only) involve analysis of national, regional/state, or municipal M&E systems and the vertical and horizontal interplay of systems.
• Advice on M&E systems (e.g., MIS, indicators to track, exchanges and use of info collected)
• Recommendations on methods, approaches and so on
Knowledge Resources
Assemble ideas and knowledge into knowledge tools. The purpose is to expand on M&E research (e.g., on methods, tools, etc.) and experience (e.g., country experiences with M&E systems).
• Books
• Journals (sponsorship and contributions to)
• Policy notes
• Case studies
• Reports/studies
• Toolkits
• Website development/support
• Blogs
• Applications
Networks/ Partnership/ Association Development
(Advisory / Development / Maintenance)
Create, administer or sustain networks/ partnerships/ associations of M&E practitioners (e.g. by playing a “connector” role and bringing stakeholders together, by providing the “space” using social networking platforms, etc.), in real time over sustained periods. Includes support and cooperation for associations/networks. Different types include: those where people come together without much of a web presence (e.g., AfREA); those that also have a strong web-based presence (e.g., RELAC); those that are networks of networks (e.g. IOCE); and so on. These networks generally have a formal structure (with documented objective, scope and membership). Contribute to the development of M&E leadership.
• National/regional M&E supports (e.g., AfREA, SAMEA, RELAC)
• Contributions to other M&E associations and/or initiatives (e.g., IOCE, Eval Partners, 3ie, etc.)
• Participate in standard setting (e.g., certification efforts, etc.)
• Contribute to professionalization of M&E as a field
• Support to association websites for sharing and connecting people on M&E topics and issues
• Leadership development of M&E practitioners and also of policymakers using M&E resources
Grants, Competitions and Awards
(Development and Implementation)
Support and manage local grants and competitions for the furtherance of M&E in different settings. Encourage the sharing of innovations (e.g., M&E system development). Provide awards for good practices.
• Develop and/or administer grants and competitions (e.g., on behalf of centers, with donors, etc.)
• Sponsor awards to policymakers/agencies who are promoting M&E.
ANNEX 3 – TERMS OF REFERENCE FOR CLEAR REGIONAL ADVISORY COMMITTEES
Each Regional Center will appoint a Regional Advisory Committee (RAC), the membership of which will be subject to approval by the Board. The RACs will serve in an advisory capacity to the Centers for developing and implementing a strategic and high-quality regional program. Upon request, RACs will also provide advice to the Board regarding the Program’s strategic directions, particularly as they pertain to regional issues.
Specifically, the RACs will be responsible for:
1) Providing strategic guidance and support
• Provide strategic guidance regarding the Center’s overall vision and direction, focusing on its relevance for critical development issues in the region, innovation, and technical excellence
• Network on behalf of the Centers for both non-financial and financial support and to help position and establish its reputation and profile as a leader in the field
• Facilitate key connections and contacts in the region
• Assist the Center in establishing a sustainable model for future growth and continued relevance
2) Providing advice on capacity development strategy
• Provide nuanced and well-informed knowledge on the demands and needs for RBM and M&E capacities in their regions and/or specific countries, within their socio-political contexts
• Provide advice regarding the overall content of the work program, focusing on demand, supply, and influence strategies
• Guide the Centers regarding effective methods for developing and delivering capacity building services in the region
3) Providing guidance on implementation approaches
• Advise Centers regarding a strategic assessment of their approaches to capacity development, including approaches to collaborating with partners and networks
• Help identify opportunities for partnerships with other institutions in the region
• Help identify individuals or organizations that could help design and deliver capacity building in the region
• Provide advice and feedback regarding Centers’ performance
Each Center will determine the relative emphasis it places on the areas of advice noted above, based on its overall strategy, needs, and level of development.
Size and Composition
Each Center will determine the size and composition of its own RAC (a maximum of seven individuals is recommended). The RAC members should collectively provide the credibility, capacity, and recognized profile needed to take on the responsibilities outlined in these terms of reference. In addition, the composition of the RACs should reflect diversity and the ability to function cohesively as a group. The RAC members will serve in their professional capacities and not as representatives of their organizations. The Chair of the Advisory Committee will be determined by the Center.
Appointment and Tenure of Members
The Centers will propose their RAC members for approval by the Board, as part of their work program plans. The term of the RAC members will be determined by the Centers. The Centers will also determine whether RAC membership is renewable.
The RACs will meet face-to-face at least annually, with additional meetings determined on the basis of the Centers’ needs. Members’ travel costs will be borne by the Centers from the “Centers’ Development” portion of their grants. Representatives of bilateral, multilateral, and foundation donor organizations serving on the advisory committees will fund their own costs. The RAC members will not be compensated for their time. If RAC members are invited to CLEAR Board meetings, the travel costs will be paid from the Program funds held centrally by the Secretariat.
Reporting and Accountability
RAC business may also be carried out virtually and in sub-groups. Minutes of the RACs’ meetings and the RACs’ recommendations will be recorded by the Centers and submitted to the CLEAR Board with their annual reports.