Can providing information on school performance lead to improvement?

In high-income countries, learning outcomes have improved as a result of interventions that increase transparency and accountability through the use of test scores.  In a previous blog, I mentioned examples of 'high-stakes testing' accountability systems, such as No Child Left Behind.  A high-stakes test has important consequences for the test taker, the school, or school authorities: passing it carries important benefits, such as a diploma, extra resources for the school, or a positive citation. Some of these interventions also involve the "naming and shaming" of school leaders, as is done in England.

There is also evidence suggesting that simply providing information on test scores can lead to improvement.  This is the case in school choice systems such as the one in the Netherlands.

However, even in a 'low-stakes' environment (one that carries none of the consequences of a high-stakes test; even poor performance is not penalized), information can have a powerful effect.  So far the evidence on low-stakes accountability has been mixed, but there are some promising signs.
 
Evidence from India, Chile and Pakistan
 
Experimental evidence from India shows that a program that provided low-stakes diagnostic tests and feedback to teachers had no effect on student learning outcomes. In Chile, distributing information on schools' value added had no effect on enrollment, tuition levels, or the socioeconomic composition of students.  However, in a randomized study in Punjab, Pakistan, providing test scores to households and schools led to an increase in subsequent test scores of 0.11 standard deviations a year after the intervention was implemented.
 
Accountability and transparency in Mexico
 
In Impact of an Accountability Intervention with Diagnostic Feedback: Evidence from Mexico, Rafael de Hoyos, Vicente Garcia-Moreno and I report on a low-stakes mechanism designed to improve student performance in 100 poorly performing primary schools in Colima, Mexico.
 
Colima is a small state on the Pacific coast in the center-west of Mexico.  Since the decentralization of the education system began in 1992, Colima has built an efficient school system, adapting national education programs to the state's specific characteristics and needs.
 
Throughout the 1990s, Colima undertook innovative education policies, such as the implementation and dissemination of one of the country's first standardized tests.  By 2003, Colima outperformed all other Mexican states on the PISA test and approached the OECD average. However, by 2009, according to Mexico's standardized test (ENLACE), Colima ranked below the national average. And although all schools in Colima had had access to this information since 2006, the situation was not improving.  Colima's education authorities therefore began designing the Programa de Atención Específica para la Mejora del Logro Educativo (PAE).
 
PAE was implemented in the state between January 2010 and late 2011.  Originally designed as a comprehensive schooling intervention, PAE was cut short when a similar federal program was enacted, in order to avoid duplication.  As a result, only a subset of the components was implemented: schools were informed of their test scores and of their participation in the program.
 
Once the program was launched and the list of PAE schools was publicly disseminated, participating schools were assigned to a technical adviser. This individual would then visit the school, help diagnose the test score results, and design a school improvement plan.
 
Taking advantage of PAE's strict eligibility rule, we employed a regression discontinuity design (complemented by a difference-in-differences approach). We find that PAE increased test scores significantly, by 0.12 standard deviations, only a few months after the program was launched.
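To give a sense of how a regression discontinuity estimate works, the sketch below fits a local linear model on simulated data around an eligibility cutoff. It is purely illustrative: the variable names (prior_score, post_score), the cutoff, the bandwidth, and the simulated effect size are assumptions for the example and do not reflect the paper's actual data or specification.

```python
# Illustrative sketch of a local-linear regression discontinuity estimate.
# All variable names and parameter values are hypothetical, not the paper's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated data: schools scoring below the cutoff on a prior test are eligible.
n = 500
prior_score = rng.normal(0, 1, n)                # running variable
cutoff = 0.0
treated = (prior_score < cutoff).astype(float)   # eligibility indicator
post_score = 0.12 * treated + 0.5 * prior_score + rng.normal(0, 1, n)

# Keep observations within a bandwidth of the cutoff and allow separate slopes
# on each side; the coefficient on `treated` is the estimated jump at the cutoff.
bandwidth = 1.0
in_window = np.abs(prior_score - cutoff) < bandwidth
X = np.column_stack([
    treated,
    prior_score - cutoff,
    treated * (prior_score - cutoff),
])[in_window]
y = post_score[in_window]

model = sm.OLS(y, sm.add_constant(X)).fit(cov_type="HC1")
print("Estimated jump at the cutoff (program effect):", round(model.params[1], 3))
```

The intuition is that schools just below and just above the eligibility threshold are very similar, so any discontinuous jump in later test scores at the threshold can be attributed to the program.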
 
We argue that a process of self-evaluation and analysis is triggered when students, teachers, and parents become aware that their scores are low, and that this process itself can lead to an improvement in learning outcomes.  Even without punitive measures, information on school quality, provided within a supportive and collaborative environment, was enough to improve learning outcomes.
 
The fact that the PAE program was halted after only 18 months of implementation suggests that its main intervention was largely limited to the public announcement and the detailed information provided to the schools.  In other words, it was the public appraisal of PAE school directors' performance that plausibly led schools to make small but significant learning gains.
 
Evidence from other sectors
 
Unlike the high-stakes accountability interventions that sometimes lead to school closures in the United States, the sacking of school directors in England, or the vote-with-your-feet school choice in the Netherlands, the policy (and the de facto events) in Colima involved no punitive actions against schools or school directors.
 
Evidence from other sectors suggests that, in some cases, information on poor performance, when properly disseminated, can lead to improvement.  For example, requiring restaurants to display hygiene quality grade cards led to an increase in restaurant health inspection scores, because restaurants made hygiene improvements themselves.
 
From the energy sector, there is experimental evidence that providing customers with feedback on their home electricity and natural gas usage, with a focus on peer comparisons, can lead to significant reductions in energy consumption at low cost. The analogy in education is displaying test scores prominently at public schools: they remind teachers, parents, and the school director of the school's mandate and priority, to provide high-quality education services for all.
 
More research is needed
 
One may still wonder why schools did not improve before the PAE program, given that the same information had already been publicly disclosed. Perhaps the information was not well understood or well disseminated. Perhaps beleaguered school leaders in poorly performing schools could not, without the right logistical support and networking, proactively use the results of the standardized test to trigger a discussion and design a school improvement plan. These are all areas for future research, but information provision is still the first step toward improvement.
 
 
Follow Harry Patrinos on the Education for the Global Development Blog and Twitter.
Follow Rafael de Hoyos on Twitter at @rafadehoyos

Follow the World Bank Group education team on Twitter @wbg_education


Authors

Harry A. Patrinos

Senior Adviser, Education

Rafael de Hoyos

Program Leader for Human Development in the European Union, World Bank

Vicente Garcia-Moreno

Deputy Director of Pensions and Social Security at the Ministry of Finance in Mexico
