Ronald Abraham
IDinsight
The Department of Education in Himachal Pradesh is using improved data systems to focus on achieving learning outcomes instead of budgetary goals.
Various Indian states are undertaking difficult systems-level reforms in the education sector. We use a simple but foundational initiative currently underway in Himachal Pradesh to illustrate the complexity of such reform. This study was conducted before COVID-19; however, the lessons remain relevant as education departments navigate the challenges posed by the pandemic.
Most children in India enrol in school but fall far behind in terms of learning outcomes. In the last decade, there has been a growing recognition that ambitious, systems-level reforms can unlock accelerated improvements in learning. There is a significant opportunity to use data and evidence to help education departments allocate scarce resources in ways that maximise learning outcomes.
In Himachal Pradesh (HP), the Department of Education (DoE) has implemented a multi-faceted transformation programme called HP Samarth. The programme was launched in 2016 and is supported by the Michael and Susan Dell Foundation (MSDF) and Samagra Governance, a governance consulting firm. HP Samarth is an “ambitious state-wide systemic transformation programme to improve the quality of education” in the state’s 15,000+ government schools. This programme has two key objectives:
HP Samarth comprises a mix of academic and governance initiatives. The initiatives include, but are not limited to: remedial programmes for basic learning competencies, strengthening teachers’ pedagogical and classroom management skills, ensuring teachers are available in schools and have sufficient teaching time, supporting meritorious students to prepare for higher studies, increasing parent engagement in schools through SMS messages about their children’s education, ensuring on-time textbook delivery, and strengthening data-driven decision-making through tech-enabled tools.
In mid-2019, IDinsight joined the HP Samarth team as a learning partner. Using rapid surveys and evaluations, we are providing rigorous, regular, and actionable feedback to inform the department’s reform programme.
As part of this learning partnership, IDinsight’s first focus area was a governance intervention called “Review and Monitoring” (“R&M”). Starting in 2017, the R&M process was strengthened into a more robust monitoring mechanism focused on improving academic performance. The heart of this governance intervention is three-fold:
IDinsight conducted a mixed-methods rapid evaluation of R&M. Our findings and recommendations were directed towards providing granular feedback to improve R&M’s design and implementation. However, this experience also highlighted lessons that are relevant for education reform across India. Using R&M as a case study, below we discuss three such lessons from our research.
Inputs in the education system, such as school infrastructure, administrative processes, and expenditure of state funds, dominate the mind space of education officials. Student learning receives much less attention. In a resource-constrained environment, educators are forced to make difficult funding decisions: should they fix a broken boundary wall or hire an extra teacher? These decisions are urgently important, but they are often not based on which investments will actually improve learning outcomes.
With R&M, we found that implementing processes to regularly collect, discuss, and act upon learning data is necessary but not sufficient for using data effectively and fostering an outcomes orientation within the department. Sub-optimal data use not only reduces the effectiveness of R&M but also frustrates the staff who engage in the data collection process.
In India, the performance of state education leadership is often reviewed on whether expenditure targets (such as procurement of school infrastructure) are met, rather than learning targets. In some review meetings that we observed in HP, up to 75 percent of the time was spent on administrative or input-focused discussions. This input orientation ripples through the department and influences how its middle managers perceive and perform their roles.
The DoE is taking small but firm steps to change this. Based on our recommendations, the state leadership has agreed to make discussion of learning outcomes the first agenda item for all review meetings across the state (with data from the R&M app providing specific discussion points). Starting with outcomes will encourage more time and energy to be spent on learning rather than on inputs. We also recommended that the meetings focus on a “Learning Outcome of the Month” to help drive more targeted discussions on which classroom practices proved effective in driving that outcome. Finally, the state leadership will consistently and regularly send the message to all staff that the raison d'être of the department is to educate children, not merely to build and run schools and manage staff. These subtle tweaks have the potential to drive a long-term behavioural shift among the department’s employees.
Designing an effective intervention is not a one-time process. It needs frequent iteration and course-correction after being rolled out, whether because the implementation context changes, new information becomes available, or feedback arrives from teams on the ground. This constant adaptation is critical to ensuring sustained impact.
As part of this rapid evaluation, we conducted semi-structured interviews with over 50 government officials at the state, district, block, and school levels. These interviews provided invaluable feedback on ways to improve R&M on the ground. Below, we elaborate on one example involving the changing roles of two key stakeholders: Block Resource Center Coordinators (BRCCs) and Cluster Head Teachers (CHTs).
In R&M, BRCCs and CHTs have similar responsibilities even though their authority structures and capabilities differ. Our interviews revealed a few challenges with this. Both BRCCs and CHTs are expected to collect high-quality data and provide academic mentoring to schools. However, school staff occasionally refused to share sensitive information, such as teacher attendance, with the relatively junior BRCCs. In addition, some BRCCs found it challenging to provide academic mentoring to teachers more senior than themselves. Meanwhile, CHTs spent a great deal of time on data collection, and their considerable teaching experience was not being used effectively.
Instead of sticking to the original policy, which conflicts with how the roles of BRCCs and CHTs have evolved organically, it would be more effective for the R&M intervention to adapt its design. Going forward, the HP DoE is considering limiting BRCCs’ role in R&M to the collection of high-quality, non-sensitive data from schools, with a reduced expectation of academic mentoring. Additionally, the data collection burden on CHTs will be reduced to only the more sensitive data points, while they will be expected to ramp up academic mentoring to teachers in their schools.
As is true of many teachers globally, stakeholders in HP, including school teachers, CHTs, and BRCCs, told us that a heavy administrative workload prevents them from focusing on children’s learning in their respective roles. However, there is not yet sufficient consensus on this issue among the state’s leadership, who question the extent of the non-academic workload staff face and whether it is a big enough problem to explain the lack of focus on teaching and learning. We saw an opportunity to collect data to align these stakeholders and ensure they fully grasp how much time is spent on administrative tasks versus teaching.
Reform processes need widespread buy-in across an organisation’s staff. If HP is to focus aggressively on reducing the non-academic workload of its staff, it is important to get beyond the current ambiguity and reliance on anecdotal evidence. To achieve this, the state leadership can collect systematic data on teachers’ non-academic workload by (i) surveying teachers to list the range of tasks they have to do; (ii) asking these teachers to maintain a log, or diary, of the amount of time they spend on different administrative tasks as opposed to teaching; and (iii) identifying how the administrative burden on teachers can be reduced, for example by removing non-essential but time-consuming tasks or delegating certain tasks to other staff members.
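To make the diary exercise concrete, below is a minimal sketch of how such time-use entries could be summarised once collected. It is purely illustrative: the roles, task categories, and minutes are hypothetical placeholders rather than data from HP, and any actual analysis would need to be designed with the state team.

```python
# Illustrative sketch only: summarising hypothetical time-use diary entries
# to compare the share of time staff report spending on teaching versus
# administrative work. All values below are made up for demonstration.
from collections import defaultdict

# Hypothetical diary entries: (role, task category, minutes logged)
diary_entries = [
    ("Teacher", "teaching", 180),
    ("Teacher", "administrative", 90),
    ("CHT", "teaching", 60),
    ("CHT", "data_collection", 120),
    ("BRCC", "data_collection", 150),
    ("BRCC", "academic_mentoring", 45),
]

# Aggregate minutes by role and task category.
totals = defaultdict(lambda: defaultdict(int))
for role, category, minutes in diary_entries:
    totals[role][category] += minutes

# Report each category's share of a role's total logged time.
for role, categories in totals.items():
    role_total = sum(categories.values())
    for category, minutes in sorted(categories.items()):
        share = 100 * minutes / role_total
        print(f"{role}: {category} = {minutes} min ({share:.0f}% of logged time)")
```

Even a simple summary like this, built from a few weeks of diaries, would give the leadership a shared, evidence-based picture of where staff time actually goes.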
Governance reforms in education are the need of the hour, with the potential to impact millions of children. This type of reform does not lend itself to neat experimental evaluations, but it is still vital to find ways to learn from ongoing efforts so that, collectively, we can improve the speed and impact of these initiatives. This article is a small step in that direction, and we hope it catalyses more conversations and questions across our community of practice.
Ronald Abraham and Vishan Pattnaik are with IDinsight, and Mohit Bahri is with Samagra Governance. Views belong to the authors. We would like to thank Pradyot Komaragiri, Prabhat Kumar and Marc Shotland for their valuable contributions as part of the research team.
Michael and Susan Dell Foundation: The Michael & Susan Dell Foundation is dedicated to transforming the lives of children living in urban poverty through improved education, health, and family economic stability. In India, the foundation works closely with several states to drive sustained improvements in learning outcomes.
Samagra Governance: Samagra is a mission-driven governance consulting firm. In the education sector, Samagra partners with MSDF on the HP Samarth and Saksham Haryana state education transformation programmes.
IDinsight: IDinsight is a global advisory, data analytics, and research organization that helps development leaders maximise their social impact. The organization tailors a wide range of data and evidence tools, including randomised evaluations and machine learning, to help decision-makers design effective programs and rigorously test what works to support communities.
RISE blog posts and podcasts reflect the views of the authors and do not necessarily represent the views of the organisation or our funders.