Responses to the 2012 PISA Results in Latin America

PREAL Blog


The results of the 2012 PISA examination were released on December 3, 2013, immediately prompting reactions. Several actors, including PREAL, reported on the main takeaways for Latin America. Governments released statements, gave interviews, and offered press briefings.

The media published expert analyses, interviewed key actors, produced infographics, and linked to OECD reports online. And like PREAL, civil society organizations launched analyses of the numbers and informed anyone willing to listen about the implications of the results. What was the key message? All eight participating Latin American countries scored in the lowest third in reading, mathematics, and science among the 65 countries that took the exam.

But what did the governments have to say? Did they take action and make proposals? Was media coverage positive or negative, and did it cover anything beyond the country rankings? And did civil society organizations offer solutions and call for urgent change?

After analyzing 221 articles published online between December 3, 2013 and March 30, 2014 in the eight Latin American countries that participated in the 2012 PISA exam (136 from the media, 53 from civil society, and 32 from government), we can draw some conclusions about trends in coverage of the exam’s results. (Please see the note at the bottom of this article for information on our search methodology.)

Trends in Content:

No time to lose: First, in one way or another, all three sectors in every country covered the PISA results the same day they were released. This suggests the level of interest and importance given to the international evaluation in the region.

Quantity and quality: In every country, the media covered the results more than either of the other sectors, but also less critically and analytically (see Table 1). In contrast, governments released fewer articles but placed the most emphasis on linking the results to government programs and on solutions for improving education systems. Civil society placed the most emphasis on the urgent need for change to improve the state of education.

[Table 1]

Trends by Actor:

Show me your political situation and I’ll tell you your reaction to the results: One of the most notable trends was the relationship between a country’s political situation and the tone of its government’s coverage. For example, in Mexico, the government used the results to promote and justify the education reform led by President Peña Nieto. Uruguay, which is approaching presidential elections and is perhaps trying to show strong leadership, broadly publicized the firing of the country’s head of secondary education over the poor test scores. And perhaps because it is dealing with more urgent issues, Brazil’s government offered scant coverage of the results—primarily rankings, with little critical analysis, mention of programs, or solutions for improving the situation. Peru, which has a new minister of education, placed the most emphasis on the need to change and improve education results. The government of Colombia appeared more apolitical, reporting on the need to focus on closing regional and urban/rural inequalities.

NGOs—Staying the course: There were few surprises in civil society organizations’ coverage of the PISA results. As would be expected, the sector was the most critical of the results, heavily emphasizing the urgent need to improve the state of education. The case of Mexicanos Primero in Mexico is particularly noteworthy: its director, along with that of the Lemann Foundation in Brazil, was among the most outspoken and most frequently cited experts referenced by the media, reflecting a major effort by the organization to press its message that change is needed.

Media—A bit of everything: As expected, the media was the sector that provided the most coverage of the results and the most variety in type of coverage, despite offering the least critical analysis. However, some online newspapers did offer more insistent and critical coverage of the results. Colombia and Costa Rica placed the most emphasis on poor performance in mathematics, linking this result to poor future employment prospects for students. In terms of quantity, Brazil’s O Globo, Argentina’s La Nación, and Mexico’s Reforma each ran more than 25 articles on the subject between December 3 and March 30.

Trends by Country:

As Figure 1 shows, the majority of countries were critical and analytical in their coverage of the results. Chile and Brazil tended to be less analytical than the other countries, while Argentina stands out as the country that most avoided taking responsibility.

[Figure 1]

Do the reactions contradict the results?

Another question that we asked was whether the sectors’ reactions were aligned with the PISA results. That is, did they accept the results or interpret them differently?

In general, all participating Latin American countries were critical of the results, regardless of their ranking relative to one another or their improvement relative to the 2003, 2006, and 2009 PISA exams. For example, although Brazil improved in all three subjects, and Chile and Mexico improved in mathematics and reading, these three countries focused more on their low rankings among the 65 participating countries than on their improvement over time. In the case of Chile, all three sectors went beyond self-criticism and reported on initiatives in the country to learn from countries that achieved better PISA results.

Peru and Uruguay were the most negative and critical of their own performance, even though Peru had the largest improvement in reading scores among the Latin American countries and the fourth largest among all 65 participating countries. Uruguay’s response matched its results, however, given that it was the only Latin American country to decline in all three subjects; indeed, the majority of coverage in Uruguay was discouraging.

As for Costa Rica and Colombia, despite having nearly opposite results (Costa Rica did not improve in any of the three subjects, while Colombia was particularly successful in improving scores among low-performing students), the two countries had very similar reactions. Both were critical of the results, calling for improvements to the state of education and expressing hope that they could be achieved.

Alongside their self-criticism, governments in all of the countries gave at least some justification for the low scores. Within this category, Argentina showed the least self-criticism and the most justification for poor results, and avoided making comparisons with other countries. Among the reasons given to explain the country’s lack of improvement in the three subjects, the government cited the diversity of circumstances, studies, and education budgets among countries, as well as the many students added to the system since the last PISA exam. Nevertheless, it is important to note that there were several cases in Argentina of self-criticism and impartial analysis of the results in the media and among some civil society organizations.

Although our analysis lacks statistical validity, we believe it offers important evidence of trends in the content of coverage of the PISA 2012 results in Latin America. Knowing how each sector covers the issue helps reveal its levels of impartiality, transparency, and accessibility, improving the quality of information on the state of education that citizens in each country receive.

Note on methodology:
 
We carried out our search for coverage of the 2012 PISA exam in eight participating Latin American countries in two stages. First, we conducted a Google search for each sector (media, government, and civil society) in each country using key words in Spanish (Resultados PISA 2012 + the country’s name). We limited search results to the period from December 3, 2013 onward. We conducted the searches between January 27 and March 30, 2014, and only included entries listed in the first three pages of Google results.
 
In the second phase, we searched for PISA coverage directly on the websites of three prominent newspapers and three civil society organizations that work on education in each country (where applicable, and based on PREAL’s knowledge). For government coverage, we searched the website of each country’s ministry of education. This second phase was not exhaustive, as one of our objectives was to determine whether coverage of the PISA results in each country was easily accessible. For direct searches that returned large numbers of results on a particular site, we analyzed the first three articles and noted the number and dates of the remaining articles. We did not include any international coverage of PISA results.
 
We used the following questions to classify the content of the PISA articles: Do they compare the 2012 results with previous PISA results? Do they compare results with other countries? Do they link results or information to country policies or programs? What is the tone of the coverage (neutral, critical, political, negative, positive, informative, very little information, or other)? Do they mention solutions? Do they call for change and improvement to current policies?
 
