The Stages of PISA Grief

It’s painful to see evidence that your country’s learning outcomes are low, but the road to alleviating that pain starts with acknowledging the problem—and using the shock of low PISA scores as a spur to action.

Authors

Kirsty Newman

Consultant

Lant Pritchett

RISE Directorate

Blavatnik School of Government, University of Oxford

In 2013, three weeks after he had taken on the role of Minister of Education for Peru, Jaime Saavedra learned that Peru was at the absolute bottom of the rankings of the international learning outcome test PISA. This created enormous pressures on him. He reports that a first reaction was annoyance that there were not more developing country participants to act as a buffer from the ignominy of last place. He knew that Peru was by no means the worst country in the world on learning outcomes; it was just that the others did not do PISA. But then, putting that annoyance aside, he set to work to use the pressure to start fixing things.

In a 2019 interview, Saavedra reflects:

We could have decided to play the results down to say, “But Peru has improved since 2009” (which it had) or “But we are better than many countries that didn’t even take PISA” or “It is an OECD [Organisation for Economic Co-operation and Development] examination that is alien to our culture and priorities.” We have seen that in other countries, and some countries have left PISA after bad results. But we didn’t go down that route. Instead we decided to own the problem. To use these results to say, “Look, we’re not in trouble. We’re in deep trouble.”

Learning crisis denial is alive and well

Unfortunately, this response is by no means inevitable. There is a long history of governments reacting to reports of poor learning outcomes by ‘shooting the messenger’. Sometimes this happens in the open. For example, the Indian government dismissed India's PISA 2009+ results on the grounds that Indian children could not be expected to do well on tests requiring them to apply knowledge in ways that did not match how they were taught, and that the test was therefore "unfair."

A similar pattern was seen recently in the Philippines, where the government issued a press release seeking an apology from the World Bank for publishing a report containing previously published learning outcome data, claiming that the data were out of date and that the learning crisis had since been ‘ameliorated’.1 In fact, the data presented in the report are some of the most up-to-date learning outcome data of any country. They are drawn from three independent international assessments of learning, all of which showed very low learning levels. It is certainly not the case that the learning crisis has been solved in the last couple of years; even if learning improved as fast as any country has ever managed, it would take decades for the Philippines to achieve learning outcomes similar to regional neighbours such as Vietnam or Malaysia.

Other governments react with equal denial but in a far more covert manner. A number of countries have quietly shut down organisations working to gather and/or publicise unpalatable learning outcome data.

Unfortunately, these responses—whether overt or subtle—are pretty successful in fuelling confusion and ignorance about the extent of the learning crisis. We have seen this amongst some who work within the global education architecture and who continue to believe that the learning crisis can’t really be as bad as the data suggests.

And this confusion exists within governments affected by the crisis too; a new study from the Center for Global Development (CGD) shows that ministry of education officials from a range of developing countries vastly overestimate the level of learning outcomes in their own countries.

Acceptance can be a gradual process

However, it is not all doom and gloom. The story of Pratham/ASER in India shows that it can simply take a long time for people to work through the stages of grief and move on to acceptance and action. A key to ASER's success was re-doing its community-based learning outcomes exercise year after year after year. In the first year, the Indian government was in complete denial. Then, in the second year, a prominent non-education government official (Montek Singh Ahluwalia) joined the launch, and, a few years later, the education minister was there.

What are the lessons?

For those in the education sector who want to drive progress in learning, the first lesson is that data can shock policy makers into action, but it won't necessarily happen fast. There is a good chance that initial attempts will be met with denial and even suppression. It is important that evidence about learning is gathered and publicised, but there may be ways to do this that have a higher likelihood of (gradual) acceptance. In particular, locally or regionally gathered data may be more acceptable to governments than data from international organisations. Triangulation can be important, with multiple sources and approaches showing similar results. Whoever gathers the data, it must be gathered rigorously; recent research shows rampant cheating in some learning outcome measurement exercises. Organisations such as the South East Asia Primary Learning Metric and the PAL Network are examples of south-south consortia that are achieving good traction with policy makers on the basis of rigorously gathered data. In the end, information that meets the three Rs—Regular, Reliable, and Relevant—can be a foundation for system improvement.

And what are the lessons for governments in the early stages of grief from a PISA shock? Well, we think the main message is one of empathy. They are certainly not alone; it really is difficult to accept that learning levels are so low, and many other countries have gone through the stages of denial and anger. But they can also draw hope from the countries that have emerged through to the final stage of acceptance. Just look at Peru. Since Minister Saavedra concluded in 2013 that his country was in deep trouble, Peru has slowly but steadily moved up the PISA rankings. In the latest 2018 PISA round, it came 64th out of 77 countries on the reading score. And the country in 77th position? The Philippines—meaning that for that country, the only way is up.

Footnotes

  • 1. The World Bank did eventually issue an apology that the report had been published early—neatly sidestepping any comments about the content of the report—and thereby allowed everyone to save face.

RISE blog posts and podcasts reflect the views of the authors and do not necessarily represent the views of the organisation or our funders.