The latest PISA results: Seven key takeaways

International assessments aren’t perfect but they offer useful insights into how countries can help all students learn to high levels. (Photo: Dominic Chavez / World Bank)


Results from the Organisation for Economic Co-operation and Development's (OECD) 2015 Programme for International Student Assessment (PISA) were released on December 6. The results are instructive, not only because of what they tell us about the science, mathematics, and reading knowledge and skills of 15-year-olds around the world, but also because of how they compare to the 2015 Trends in International Mathematics and Science Study (TIMSS) results, which were released a week earlier (click here to read my blog on key takeaways from the TIMSS results).

While PISA tests 15-year-olds, TIMSS primarily focuses on students in grades four (10-year-olds) and eight (14-year-olds). However, TIMSS covers two of the same subject areas (mathematics and science) as PISA and many of the same countries. In addition, TIMSS, like PISA, is used by many countries as a way to measure the relative effectiveness of their policies and programs for improving student learning outcomes at different stages of schooling.

Some 540,000 students in 72 countries and economies took part in the 2015 PISA exercise. PISA began in 2000 and has taken place every three years since. Each cycle has a different main focus: in 2015, science was the major assessment area, with math and reading treated as minor areas.

Here are my seven takeaways from the PISA 2015 results, with comparisons to the TIMSS 2015 results where possible.

1. Singapore tops the PISA 2015 rankings in all subject areas. The country is roughly 18 score points (the equivalent of about half a year of schooling) ahead of the next highest-scoring country on the science test, eight points ahead in reading, and 16 points ahead in math.

Singapore also dominates the TIMSS 2015 math and science rankings at the fourth and eighth grade levels. The country seems to be an all-star when it comes to the performance of its students on international tests, regardless of the types of questions being used or the age or grade level being tested.

At the same time, many people feel that there are limited lessons for other countries to learn from Singapore given the country’s small size and unique features that do not necessarily translate well to other education systems, particularly those in developing countries.

2. Other generally high-performing countries and economies on PISA 2015 include (in alphabetical order) Canada, Chinese Taipei, Estonia, Finland, Hong Kong (China), Japan, and Macao (China). East Asian systems dominate here, but Estonia is also a standout.

Over the last decade, the Finnish education system has been viewed by many as a model for achieving both excellence and equity in learning outcomes (defined as high overall PISA scores for most of its 15-year-olds). Finland's decline on PISA in recent years, however, raises the question of whether its education system will continue to hold such attraction for other countries, and whether PISA performance will remain a key signal for countries seeking to learn from others. With this in mind, is Estonia the new Finland?
 
3. No low-income countries participated in PISA 2015, and only five lower-middle-income countries did: Indonesia, Kosovo, Moldova, Tunisia, and Vietnam. Vietnam was the top performer among these, followed by Moldova. While Vietnam's scores dropped significantly from PISA 2012, its performance is still impressive given the country's GDP per capita.
 
Indonesia is the only lower-middle-income country that participated in both TIMSS 2015 and PISA 2015, scoring well below the international average on both assessments. While these results are no doubt disappointing for Indonesian policymakers, a deeper analysis of the data will hopefully yield useful insights on an appropriate way forward for the country's education system.
 
4. Only two North African countries participated in PISA 2015 (Algeria and Tunisia). No Sub-Saharan African countries participated. This is similar to participation levels for African countries in TIMSS 2015.
 
5. For the majority of countries with comparable data, science performance on PISA 2015 remained virtually unchanged since 2006 (the last time science was the main assessment area). In fact, according to the PISA 2015 report, only a dozen countries showed real improvement in the science performance of their 15-year-olds.
 
These included high-performing education systems, such as Singapore and Macao (China), and low-performing ones, such as Peru and Colombia. These findings contrast with TIMSS 2015, where overall student achievement in science (and math) has improved since the first TIMSS exercise in 1995: the majority of countries that participated in both TIMSS 1995 and TIMSS 2015 saw increased achievement at both grade levels over this period.
 
Apart from the different timelines and grade/age levels being tested, these trends illustrate a more general pattern: it tends to be easier for countries to show improvement on TIMSS than on PISA. This may be because TIMSS is seen as more focused on basic knowledge and understanding, while PISA is seen as more focused on applying knowledge to real-life situations, which tends to be more difficult for students to master.
 
6. Even though gender differences in science performance on PISA tend to be small, on average, the share of top performers in science on PISA 2015 was larger among boys than among girls. The TIMSS 2015 results tell a similar story.
 
7. Also similar to TIMSS 2015, the PISA 2015 results illustrate that some countries are doing a particularly good job of helping ALL of their students succeed (and not just some). In the case of PISA, these countries include Canada, Estonia, Finland, Hong Kong (China), Japan, Macao (China), and Singapore. In each, at least four out of five 15-year-old students master the baseline level of proficiency in science (the focal assessment area for PISA 2015) as well as in reading and mathematics, which is pretty impressive.
 
Three of these education systems (Hong Kong [China], Japan, and Singapore) also demonstrated a similar ability to get most of their students to perform at or above baseline levels of proficiency on the TIMSS 2015 math and science tests at the fourth and eighth grade levels.
 
Things to keep in mind
 
The above takeaways are superficial in many respects, focusing primarily on means, change scores, and relative performance. There are likely many more interesting stories to uncover in the data once countries start to explore the relationships between their students’ performance on the test and various background factors.
 
It’s also important to note that PISA offers but one picture of learning levels around the world and by no means a complete picture at that. For one thing, many lower-income countries still do not participate in PISA (or TIMSS).
 
Both the OECD and the International Association for the Evaluation of Educational Achievement (IEA) (which runs TIMSS) are trying to address some of the technical and financial barriers to these countries’ participation in their assessments.
 
One such OECD initiative is PISA for Development, which aims to better align the PISA instruments with the needs of developing countries. Five lower-middle-income countries (Cambodia, Guatemala, Honduras, Senegal, and Zambia) and three upper-middle-income countries (Ecuador, Panama, and Paraguay) are participating in this pilot.
 
The aim is for the instruments and approaches being trialed to be offered as options for countries in future regular PISA exercises. Whether this will encourage more lower-income countries to participate in PISA remains to be seen.
 
PISA, like TIMSS and many other assessments, only measures the achievement levels of those in school. Children who have dropped out or never enrolled in school are not captured by these exercises (although this is something that PISA for Development is trying to address by piloting approaches to collecting data on the achievement levels of out-of-school 15-year-olds).
 
A final point is that, even for countries that regularly participate in PISA and other international assessments, tweaks to the methodology can make it difficult to reliably track progress over time. For example, PISA 2015 was the first time that computer-based testing was the default option for students in all participating countries. This undoubtedly affected the scores for some participating countries.
 
The bottom line: none of these assessments is perfect, but when used appropriately and with sufficient caution, they offer useful insights into how countries can help all students learn to high levels.
 
You can learn more about the PISA 2015 results, including the performance of several World Bank client countries, by going to the PISA website.
 
Please also watch this space for more briefs and blogs from the World Bank on the PISA results.
 
Find out more about World Bank Group education on Twitter and Flipboard.

 


Authors

Marguerite Clarke

Senior Education Specialist
