Developing State Monitoring Systems: Measuring Results

Submitted by jyuan@worldbank.org on Wed, 04/13/2016 - 12:44

Presentation Overview

  • Traditional vs. results-based M&E approaches
  • Government efforts towards results-based M&E
  • J-PAL SA efforts towards a results-focused monitoring system

Developing State Monitoring Systems
1. Developing State Monitoring Systems: Measuring Results
2. Presentation Overview
  • Traditional vs. results-based M&E approaches
  • Government efforts towards results-based M&E
  • J-PAL SA efforts towards a results-focused monitoring system
3. The ‘M’ and the ‘E’
  • Monitoring is used to continuously gauge whether the project or intervention is being implemented according to plan and targets.
  • Evaluation provides evidence on whether targets and outcomes have been achieved; impact evaluation seeks to ascertain whether these results can be attributed to the program.
4. TRADITIONAL vs. RESULTS-BASED M&E
5. Traditional M&E approach
Addresses compliance: the “did they do it?” question
  • Did they mobilize the needed inputs?
  • Did they undertake and complete the agreed activities?
  • Did they deliver the intended outputs (the products or services to be produced)?
So what?
  • So what that activities have taken place?
  • So what that the outputs from these activities have been counted?
Is that enough to ascertain whether the project, program, or policy was a success or a failure?
6. Changing context: focus on results
What are the results and impacts of government actions? Governments are increasingly being called upon to demonstrate results, driven by:
  • Citizen demands for accountability
  • Donor focus on results
  • The political climate
In addition, how do we know:
  • Whether policies, programs, and projects led to the desired results and outcomes?
  • How do we measure progress? How can we tell success from failure?
7. Results-based M&E approach
  • A results-based approach provides feedback on the actual outcomes and goals of government actions.
  • Results-based monitoring and evaluation (M&E) is a management tool that, if properly used, can help systematically track progress of project implementation, demonstrate results on the ground, and assess whether changes to the project design are needed in view of evolving circumstances (World Bank).
  • Results-based M&E differs from traditional, implementation-focused M&E in that it moves beyond an emphasis on inputs and outputs to a greater focus on outcomes and impacts.
8. Revisiting the ToC: Immunization Incentives Example
Situation/context analysis: high health worker absenteeism, low perceived value of immunization, limited income and time.
  • INPUTS: immunization camps; incentives for immunization
  • OUTPUTS: camps are reliably open; camps provide immunizations; incentives are delivered and paid regularly
  • OUTCOMES: parents bring children to the camps, and bring them repeatedly; parents value the incentives and trust the camps
  • GOAL: increased immunization rates
9. Measurement along a ToC
  • Reporting / routine monitoring (expenditures, activities, coverage targets): Immunization camps + incentives. After 6 months, camps were established and equipped to run in 90% of program villages, and all health workers were trained to offer parents the appropriate incentives at their visit.
  • Periodic monitoring (use of inputs/activities, processes, and intermediate outcomes):
    • Camps are open and incentives are delivered: after 9 months, camps were running on a monthly basis in 90% of the planned villages, and 100% of incentives were delivered to these camps.
    • Parents bring children to the camps: 70-75% of parents brought children to be immunized in the camps that were open and reported receiving incentives.
    • Parents bring children to the camps repeatedly: 90-95% of parents who immunized their children during the first round brought them back for the second round.
  • Evaluation (long-term outcomes and impact, assessed through studies): Increased immunization rates. At the end of the program, immunization rates were 39% in intervention villages, compared to 6% in comparison villages.
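To make the measurement chain concrete, here is a minimal sketch, in Python, of how indicators at each ToC stage could be tracked against targets. The actual values are taken from the slide above (lower bounds where a range is reported); the target values and the simple on-track rule are illustrative assumptions, not part of the actual program.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    stage: str     # "output", "outcome", or "goal"
    name: str
    target: float  # target value, in percent (illustrative assumption)
    actual: float  # measured value, in percent (from the slide)

    def on_track(self) -> bool:
        return self.actual >= self.target

# Actuals from the slide; targets invented for illustration.
indicators = [
    Indicator("output",  "villages with monthly camps (9 months)",    90, 90),
    Indicator("output",  "incentives delivered to open camps",       100, 100),
    Indicator("outcome", "parents bringing children to open camps",   80, 70),
    Indicator("outcome", "repeat visits among first-round parents",   90, 90),
    Indicator("goal",    "immunization rate, intervention villages",  50, 39),
]

for ind in indicators:
    status = "on track" if ind.on_track() else "off track"
    print(f"[{ind.stage:>7}] {ind.name}: "
          f"{ind.actual}% vs. target {ind.target}% -> {status}")
```

The point of the structure is that a traditional system would stop after the two output rows; the results-based system keeps tracking through the outcome and goal rows.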
10. Traditional vs. results-based approach along the same ToC
Traditional approach (inputs and outputs):
  • After 6 months, camps were established and equipped to run in 90% of program villages, and all health workers were trained to offer parents the appropriate incentives at their visit.
  • After 9 months, camps were running on a monthly basis in 90% of the planned villages, and incentives were delivered to these camps.
Results-based approach (adds outcomes and impact):
  • 70-75% of parents brought children to be immunized in the camps that were open and reported receiving incentives.
  • 90-95% of parents who immunized their children during the first round brought them back for the second round.
  • At the end of the program, immunization rates were 39% in intervention villages, compared to 6% in comparison villages.
11. Results-based M&E approach (continued)
  • The other major departure from the traditional approach is the move away from scheme-wise monitoring to resource/sector-based monitoring: it aims to track outcomes at the state level that multiple schemes may be targeting.
  • The focus of the results-based approach is therefore on better planning, targeting, and allocation of resources to achieve agreed targets, and it links such allocation to performance or results.
  • This approach is thus the first step towards performance management and performance-based budgeting.
12. GOVERNMENT EFFORTS TOWARDS RESULTS-BASED M&E
13. Government efforts towards results-based M&E
  • Concurring with the recommendations of India’s second Administrative Reforms Commission (ARC), in 2009 the Prime Minister announced the introduction of the Performance Monitoring & Evaluation System (PMES), to be implemented through a Performance Management Division created inside the Cabinet Secretariat.
  • The Results Framework Document (RFD) is the essence of PMES: a performance agreement through which targets are agreed upon, success indicators listed, and performance evaluated.
  • The process:
    • A generic RFD framework and guidelines were developed, and an ad hoc task force was set up to help ministries/departments at the centre conduct their RFD exercise.
    • The same exercise was to be replicated by the states.
14. The Results Framework Document
The RFD seeks to address three basic questions:
  • What are the ministry’s/department’s main objectives for the year?
  • What actions does the department propose to achieve these objectives?
  • How would someone know, at the end of the year, the degree of progress made in implementing these actions? That is, what are the relevant success indicators and targets that can be monitored?
The key idea is to enable departments to transition from an input-driven approach to a results/outcomes orientation.
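Since the RFD is essentially a structured mapping from objectives to actions to success indicators with targets, a minimal sketch of that chain as data can help fix ideas. All names, indicators, targets, and reported values below are hypothetical, invented for illustration; the actual RFD is a government template, not code.

```python
# Hypothetical RFD: objectives -> actions -> success indicators.
rfd = {
    "department": "Department of School Education",  # hypothetical
    "year": "2013-14",
    "objectives": [
        {
            "objective": "Improve quality of elementary education",
            "actions": [
                {
                    "action": "Roll out monthly school monitoring visits",
                    "success_indicators": [
                        {"indicator": "% of schools visited each month",
                         "target": 90},
                        {"indicator": "% of visits with written feedback",
                         "target": 80},
                    ],
                },
            ],
        },
    ],
}

# End-of-year review: compare reported values against each target
# (reported values are equally hypothetical).
reported = {
    "% of schools visited each month": 87,
    "% of visits with written feedback": 62,
}

for obj in rfd["objectives"]:
    for act in obj["actions"]:
        for si in act["success_indicators"]:
            value = reported.get(si["indicator"])
            met = value is not None and value >= si["target"]
            print(f"{si['indicator']}: {value}% "
                  f"(target {si['target']}%) -> {'met' if met else 'not met'}")
```

The end-of-year review then reduces to the comparison in the final loop, which is precisely the reporting step that, as the next slides note, most ministries have not yet started.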
15. RFD process vs. ToC process [diagram mapping the RFD process onto the Theory of Change process; no further detail recoverable from the slide text]
16. Current status of the RFD exercise
While the RFD exercise is a step in the right direction, its utilization remains a big question.
  • At the national level, while a majority of ministries have created RFD documents, they have still not started reporting against the targets and success indicators.
  • At the state level, this is an optional exercise, so some states have opted not to do it. Even in states such as Chhattisgarh, which have prepared department-level RFDs, these merely sit as documents with the line departments and have not helped realign the targets set in their annual plans.
17. J-PAL SA EFFORTS TOWARDS RESULTS-FOCUSED MONITORING: SCHOOL-BASED MONITORING IN HARYANA (ABRC CASE STUDY)
18. ABRC case study: background
  • During the pilot evaluation of the CCE and RE-LEP programs in Haryana, implementation challenges of these newly launched educational programs were noticed in schools.
  • The lack of monitoring and mentoring for teachers in schools was seen as a key reason for poor program implementation; the field-level monitoring structure was unclear and monitoring was not a widespread practice.
  • ABRCs have the mandate to monitor, but have not carried out these roles; their time is spent acting as ‘couriers’ of information.
19. Situational context: quality of monitoring
  • ABRCs have the mandate to monitor, but have not carried out these roles; their time is spent acting as ‘couriers’ of information.
    • Each ABRC is assigned one cluster to oversee: 12 to 15 schools.
  • Preliminary results from the J-PAL process evaluation, Jan-Mar 2012:
    • 13% of sampled schools had never been visited by ABRCs in 2011-12.
    • 56% of respondents reported that ABRC visits lasted less than one hour.
    • 55% of respondents reported never having received feedback from ABRCs on their activities.
20. Existing system of monitoring & mentoring in Haryana
[Organogram: Director of Elementary Education (DEE), Director of Secondary Education (DSE), and State Project Director (SPD) at the state level; District Elementary Education Officer (DEEO), District Education Officer (DEO), and District Project Coordinator (DPC) at the district level; Block Elementary Education Officer (BEEO), Block Education Officer (BEO), Block Resource Coordinator (BRC), and Assistant Block Resource Coordinator (ABRC) at the block level; schools at the bottom.]
  • Focus on gathering information on a small number of inputs and financial flows; focus on outputs and outcomes is almost non-existent.
  • Data are collected in an ad hoc manner, not timely, and mainly from the perspective of reporting.
  • There is no mechanism to use the information to influence the functioning of schools.
No systematic collection, review, or flow of teaching-related information from schools to blocks to districts.
21. System of monitoring & mentoring in J-PAL study areas
[Same organogram as slide 20, with double-headed arrows depicting two-way information flow between levels.]
  • Re-orient the focus of monitoring: collect data on a variety of aspects, focused on teaching practices and programs.
  • Set up mechanisms that allow the collected information to be used; the two-way information flow creates a feedback loop.
  • Set up mechanisms to foster accountability not just for financial aspects but also for higher-order outcomes.
Systematic and regular information collection, review, and flow from the field to blocks to districts to HQ on a monthly basis.
22. Key Changes to the System
  • Lack of clarity on the role of monitors:
    • Clarified the roles and responsibilities of district, block, and cluster officials, fostering accountability.
  • Lack of knowledge of monitoring tools, activities, and the presentation of results:
    • Piloted and created monitoring tools.
    • Trained ABRCs in monitoring, basic data analysis, and report writing.
    • Trained district- and block-level officials in the same, as well as in the performance management of ABRCs.
  • Lack of a forum to share experiences and best practices:
    • Set up monthly review meetings (MRMs) for data sharing and identification of issues.
23. Data Collected On
  • Training and availability of materials
  • Attendance
  • Non-academic activities
  • Use of TLM (teaching-learning materials)
  • Teacher practices and student behavior
  • CCE documentation
  • LEP implementation
  • Block- and district-level school observations
24. Findings: teacher practices and student behavior
  • Most teachers asked questions while teaching and assigned in-class exercises.
    • Students responded to questions in more than 85% of schools.
    • However, students asked questions in only about 62% of schools.
    • Students were also found repeating the teacher’s answers in a large percentage of schools.
Are the students comprehending?
25. Findings: teacher practices and student behavior (continued)
  • Around 50% of teachers reviewed students’ in-class work.
    • ABRCs corroborated in MRMs that teachers do not check homework thoroughly.
  • Very few teachers used the local context to explain concepts.
  • Very few teachers adopted the practice of group work.
  • There was no uniformity in syllabus coverage:
    • Some teachers had already completed the syllabus.
    • Many teachers had no plan and taught whatever they felt like teaching.
Should there be a prescribed primary syllabus schedule?
26. Examples of ABRC Presentations at MRMs (Block Pehowa, Kurukshetra)
1. Factual data:
  • 1st visit (Sep 2012): teachers out of class in 25 schools (98 teachers in total); insufficient material in 14 schools (CCE) and 1 school (RE-LEP); CCE records not maintained in 43 schools and RE-LEP records in 11 schools.
  • 5th visit (Feb 2013): teachers out of class in 4 schools (8 teachers in total); insufficient material in 0 schools (CCE and RE-LEP); CCE records not maintained in 2 schools and RE-LEP records in 0 schools.
2. On-the-spot testing of learning quality: [chart comparing first and final assessment counts of students at each reading level: Nothing, Letter, Word, Para, Story]
3. Observation of teaching practices:
  a) Mostly unused teaching methods:
    • Models, charts
    • Project-based learning
    • Workbooks
    • Activity-based learning
  b) Common issues highlighted by teachers and headmasters related to the quality of education:
    • Lack of teaching staff / extra work (official duty, BLO duty, daak, and construction work)
    • Issues with the RE-LEP syllabus
    • Need for a separate teacher for the nursery class
    • Student irregularity
27. Pictorial Depiction of the Monitoring System
  • ABRCs visit schools: to monitor and provide inputs on implementation based on previously collected data and discussions, and to collect data on inputs, outputs, and outcomes.
  • ABRCs consolidate data from their visits: data are collated systematically and major issues identified.
  • Data are shared with other ABRCs and with block and district officials: actions to be taken are discussed, and course-correction advice and inputs are identified.
  • Unresolved issues are elevated to higher levels: issues not resolved at the district level are elevated to state-level officials for inputs.
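The loop above is essentially an escalation protocol: each issue is handled at the lowest level able to act on it, and elevated otherwise. As a minimal sketch, assuming a simple cluster-block-district-state hierarchy (the level names and issue fields below are illustrative, not the actual GoH system):

```python
# Levels of the monitoring hierarchy, lowest to highest (illustrative).
LEVELS = ["cluster", "block", "district", "state"]

def escalate(issue: dict) -> str:
    """Resolve an issue at the lowest level able to act on it,
    elevating it one level at a time otherwise."""
    for level in LEVELS:
        if level in issue["resolvable_at"]:
            return f"'{issue['name']}' resolved at {level} level"
    return f"'{issue['name']}' still pending at state level"

# Hypothetical issues of the kind an ABRC visit might surface.
issues = [
    {"name": "teacher absent during visit", "resolvable_at": {"cluster", "block"}},
    {"name": "textbooks not delivered",     "resolvable_at": {"district"}},
    {"name": "RE-LEP syllabus design flaw", "resolvable_at": {"state"}},
]

for issue in issues:
    print(escalate(issue))
```

The design choice the diagram encodes is that escalation is the exception, not the default: most issues should terminate at the cluster or block level, keeping the state level free for systemic problems.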
28. Action Taken by GoH Based on Monitoring
Course correction:
  • Visits by block and district officials to resolve issues at “problem” schools and with “problem” teachers
  • Timely delivery of textbooks and other program-related materials
  • Greater focus on teacher attendance and in-class activities; ABRCs shared best practices in teaching
  • Requests for refresher training for both programs
29. The Power of Measuring Results
If you do not measure results, you cannot tell success from failure.
If you cannot see success, you cannot reward it.
If you cannot see success, you cannot learn from it.
If you cannot recognize failure, you cannot correct it.
Source: adapted from Osborne & Gaebler (1992)