Programme for International Student Assessment

Abbreviation: PISA
Formation: 1997
Purpose/focus: Comparison of educational attainment across the world
Headquarters: OECD Headquarters
Location: 2 rue André Pascal, 75775 Paris Cedex 16
Region served: World
Membership: 59 government education departments
Head of the Indicators and Analysis Division: Andreas Schleicher
Main organ: PISA Governing Body (chair: Lorna Bertrand, England)
Parent organization: OECD
Website: PISA

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) of 15-year-old school pupils' scholastic performance in mathematics, science, and reading, conducted in member and non-member nations. It was first administered in 2000 and has been repeated every three years since, with a view to improving educational policies and outcomes. The data have increasingly been used both to assess the impact of educational quality on incomes and growth and to understand what causes differences in achievement across nations.[1]

470,000 15-year-old students representing 65 nations and territories participated in PISA 2009. An additional 50,000 students representing 9 nations were tested in 2010.[2]

The Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) by the International Association for the Evaluation of Educational Achievement are similar studies.


PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics, and science.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, test takers should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts".[3]

Development and implementation

PISA was developed from 1997 onwards, and the first assessment was carried out in 2000. The results of each round of assessment take about a year and a half to analyse: the first results were published in November 2001, while the raw data, the technical report, and the data handbook were only released in spring 2002. The triennial repeats follow a similar schedule; seeing a single PISA cycle through, start to finish, always takes over four years.

Every period of assessment focuses on one of the three competence fields of reading, mathematics, and science, but the other two are tested as well. A full cycle is thus completed every nine years: reading, the main domain in 2000, was again the main domain in 2009.

Period | Main focus | OECD countries | Other countries | Students | Notes
2000 | Reading | 28 | 4 | 265,000 | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 | Mathematics | 30 | 11 | 275,000 | UK disqualified from data analysis. Also included a test in problem solving.
2006 | Science | 30 | 27 | |
2009 | Reading | 34 | 33? | | Results made available on 7 December 2010.[4]

PISA is sponsored, governed, and coordinated by the OECD. The test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across participating countries. The assessment instruments fundamental to PISA's reading, mathematics, science, problem-solving and computer-based testing, as well as the background and contextual questionnaires, are likewise constructed and refined by ACER. ACER also develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

Method of testing


The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.


PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but no single student is tested on all of it; overlapping subsets of the material are rotated across students, as sketched below. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding, etc.
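The rotation can be pictured with a small sketch. The scheme below is illustrative only, not the official PISA booklet design: it assigns hypothetical half-hour item clusters to two-hour booklets in a balanced, overlapping pattern, which is what later allows item response theory to place all students on a common scale.

```python
# Illustrative rotated booklet design (NOT the official PISA scheme):
# 13 half-hour clusters ~ 6.5 hours of material; each booklet holds
# 4 clusters ~ 2 hours of testing per student.
CLUSTERS = list("ABCDEFGHIJKLM")
BOOKLET_SIZE = 4

# One simple balanced scheme: cyclic, overlapping windows of clusters.
booklets = [
    [CLUSTERS[(start + k) % len(CLUSTERS)] for k in range(BOOKLET_SIZE)]
    for start in range(len(CLUSTERS))
]
for i, b in enumerate(booklets, 1):
    print(f"booklet {i:2d}: {b}")

# Every cluster appears in the same number of booklets, so the national
# samples can be linked even though no student answers everything.
appearances = {c: sum(c in b for b in booklets) for c in CLUSTERS}
assert len(set(appearances.values())) == 1  # balanced: each appears 4 times
```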

In selected countries, PISA has also started experimenting with computer-adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E = Ergänzung, German for "complement"). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take only the latter. This large sample is needed to allow analysis by federal state. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[5]

Data scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be scaled to allow meaningful comparisons. The scaling is done using the Rasch model of item response theory (IRT). Under IRT, the competence of students who solved none, or all, of the test items cannot be estimated; this problem is circumvented by imposing a Gaussian prior probability distribution of competences.[6]

One and the same scale is used to express item difficulties and student competences. The scaling procedure is tuned such that the a posteriori distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.


The official reports only contain domain-specific scores and do not combine the different domains into an overall score. The final scoring is adjusted so that the OECD average in each domain is 500 and the standard deviation is 100.[7]
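A minimal sketch of the two ingredients just described, assuming the simplest one-parameter Rasch model and a plain linear rescaling; the function names and numbers are illustrative, and the actual OECD procedure (with plausible values and student weights) is considerably more involved:

```python
import numpy as np

def rasch_probability(theta: float, b: float) -> float:
    """P(correct) under the Rasch model: logistic in (theta - b),
    where theta is student ability and b is item difficulty."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def rescale_to_pisa_metric(thetas: np.ndarray) -> np.ndarray:
    """Map ability estimates onto the PISA reporting scale
    (mean 500, standard deviation 100)."""
    return 500.0 + 100.0 * (thetas - thetas.mean()) / thetas.std()

# Example: abilities drawn from the Gaussian prior mentioned above.
rng = np.random.default_rng(0)
thetas = rng.normal(loc=0.0, scale=1.0, size=10_000)
scores = rescale_to_pisa_metric(thetas)
print(round(scores.mean()), round(scores.std()))  # -> 500 100
print(rasch_probability(theta=1.0, b=0.0))        # ~0.73
```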

Historical tables

All PISA results are broken down by country. Public attention concentrates on just one outcome: mean achievement values by country. These data are regularly published in the form of "league tables".[citation needed]

The following table gives the mean achievements of OECD member countries in the principal testing domain of each period:[8]

In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables, indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.[citation needed]
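The pairwise comparison behind these cross tables can be sketched as a two-sided z-test on the difference of two country means. The standard errors below are illustrative values chosen to match the 9-point example, not official PISA figures:

```python
import math

def significantly_different(mean_a: float, se_a: float,
                            mean_b: float, se_b: float,
                            z_crit: float = 1.96) -> bool:
    """Two-sided z-test for the difference of two independent country means."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(mean_a - mean_b) > z_crit * se_diff

# With standard errors around 3.2 points each, a 9-point gap clears the
# 1.96-sigma threshold and would be flagged as significant:
print(significantly_different(500, 3.2, 509, 3.2))  # True
```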

In some popular media, test results from all three literacy domains have been consolidated into an overall country ranking. Such meta-analysis is not endorsed by the OECD. The official reports contain only domain-specific country scores. In parts of the official reports, however, scores from a period's principal testing domain are used as a proxy for overall student ability.[9]


Top results for the main areas of investigation of PISA, in 2000, 2003 and 2006.

2000: Reading literacy[3]
1. Finland 546
2. Canada 534
3. New Zealand 529
4. Australia 528
5. Ireland 527
6. South Korea 525
7. United Kingdom 523
8. Japan 522
9. Sweden 516
10. Austria 507
11. Belgium 507
12. Iceland 507
13. Norway 505
14. France 505
15. United States 504
16. Denmark 497
17. Switzerland 494
18. Spain 493
19. Czech Republic 492
20. Italy 487
21. Germany 484
22. Hungary 480
23. Poland 479
24. Greece 474
25. Portugal 470
26. Luxembourg 441
27. Russia 462
28. Latvia 458
29. Mexico 422
30. Brazil 396

2003: Mathematics
1. Finland 544
2. South Korea 542
3. Netherlands 538
4. Japan 534
5. Canada 532
6. Belgium 529
7. Switzerland 527
8. Australia 524
9. New Zealand 523
10. Czech Republic 516
11. Iceland 515
12. Denmark 514
13. France 511
14. Sweden 509
15. Austria 506
16. Germany 503
17. Ireland 503
18. Slovakia 498
19. Norway 495
20. Luxembourg 493
21. Poland 490
22. Hungary 490
23. Spain 485
24. United States 483
25. Italy 466
26. Portugal 466
27. Greece 445
28. Turkey 423
29. Mexico 385

2006: Science
1. Finland 563
2. Canada 534
3. Japan 531
4. New Zealand 530
5. Australia 527
6. Netherlands 525
7. South Korea 522
8. Germany 516
9. United Kingdom 515
10. Czech Republic 513
11. Switzerland 512
12. Austria 511
13. Belgium 510
14. Ireland 508
15. Hungary 504
16. Sweden 503
17. Poland 498
18. Denmark 496
19. France 495
20. Iceland 491
21. United States 489
22. Slovakia 488
23. Spain 488
24. Norway 487
25. Luxembourg 486
26. Italy 475
27. Portugal 474
28. Greece 473
29. Turkey 424
30. Mexico 410


Top 10 countries in the PISA 2006 results for mathematics, science and reading.

Programme for International Student Assessment (2006)
(OECD member countries in boldface)

Mathematics
1. Taiwan 549
2. Finland 548
3. Hong Kong 547
3. South Korea 547
5. Netherlands 531
6. Switzerland 530
7. Canada 527
8. Macau 525
8. Liechtenstein 525
10. Japan 523

Science
1. Finland 563
2. Hong Kong 542
3. Canada 534
4. Taiwan 532
5. Estonia 531
5. Japan 531
7. New Zealand 530
8. Australia 527
9. Netherlands 525
10. Liechtenstein 522

Reading
1. South Korea 556
2. Finland 547
3. Hong Kong 536
4. Canada 527
5. New Zealand 521
6. Ireland 517
7. Australia 513
8. Liechtenstein 510
9. Poland 508
10. Sweden 507


The PISA 2009 results in mathematics, science and reading for all 34 OECD members and 37 partner countries. Of the partner countries, only selected areas of three—India, Venezuela and China—were assessed. Due to scheduling constraints, 10 of those partners carried out their tests in 2010 rather than 2009.

Programme for International Student Assessment (2009)[10][11]
(OECD members as of the time of the study in boldface)
Mathematics
1. Shanghai, China 600
2. Singapore 562
3. Hong Kong, China 555
4. South Korea 546
5. Taiwan 543
6. Finland 541
7. Liechtenstein 536
8. Switzerland 534
9. Japan 529
10. Canada 527
11. Netherlands 526
12. Macau, China 525
13. New Zealand 519
14. Belgium 515
15. Australia 514
16. Germany 513
17. Estonia 512
18. Iceland 507
19. Denmark 503
20. Slovenia 501
21. Norway 498
22. France 497
23. Slovakia 497
24. Austria 496
25. Poland 495
26. Sweden 494
27. Czech Republic 493
28. United Kingdom 492
29. Hungary 490
30. Luxembourg 489
31. United States 487
32. Portugal 487
33. Ireland 487
34. Spain 483
35. Italy 483
36. Latvia 482
37. Lithuania 477
38. Russia 468
39. Greece 466
40. Malta 463
41. Croatia 460
42. Israel 447
43. Turkey 445
44. Serbia 442
45. Azerbaijan 431
46. Bulgaria 428
47. Uruguay 427
48. Romania 427
49. United Arab Emirates 421
50. Chile 421
51. Mauritius 420
52. Thailand 419
53. Mexico 419
54. Trinidad and Tobago 414
55. Costa Rica 409
56. Kazakhstan 405
57. Malaysia 404
58. Montenegro 403
59. Moldova 397
60. Miranda, Venezuela 397
61. Argentina 388
62. Jordan 387
63. Brazil 386
64. Colombia 381
65. Georgia 379
66. Albania 377
67. Tunisia 371
68. Indonesia 371
69. Qatar 368
70. Peru 365
71. Panama 360
72. Tamil Nadu, India 351
73. Himachal Pradesh, India 338
74. Kyrgyzstan 331

Science
1. Shanghai, China 575
2. Finland 554
3. Hong Kong, China 549
4. Singapore 542
5. Japan 539
6. South Korea 538
7. New Zealand 532
8. Canada 529
9. Estonia 528
10. Australia 527
11. Netherlands 522
12. Liechtenstein 520
13. Germany 520
14. Taiwan 520
15. Switzerland 517
16. United Kingdom 514
17. Slovenia 512
18. Macau, China 511
19. Poland 508
20. Ireland 508
21. Belgium 507
22. Hungary 503
23. United States 502
24. Norway 500
25. Czech Republic 500
26. Denmark 499
27. France 498
28. Iceland 496
29. Sweden 495
30. Latvia 494
31. Austria 494
32. Portugal 493
33. Lithuania 491
34. Slovakia 490
35. Italy 489
36. Spain 488
37. Croatia 486
38. Luxembourg 484
39. Russia 478
40. Greece 470
41. Malta 461
42. Israel 455
43. Turkey 454
44. Chile 447
45. Serbia 443
46. Bulgaria 439
47. United Arab Emirates 438
48. Costa Rica 430
49. Romania 428
50. Uruguay 427
51. Thailand 425
52. Miranda, Venezuela 422
53. Malaysia 422
54. Mauritius 417
55. Mexico 416
56. Jordan 415
57. Moldova 413
58. Trinidad and Tobago 410
59. Brazil 405
60. Colombia 402
61. Tunisia 401
62. Montenegro 401
63. Argentina 401
64. Kazakhstan 400
65. Albania 391
66. Indonesia 383
67. Qatar 379
68. Panama 376
69. Georgia 373
70. Azerbaijan 373
71. Peru 369
72. Tamil Nadu, India 348
73. Kyrgyzstan 330
74. Himachal Pradesh, India 325

Reading
1. Shanghai, China 556
2. South Korea 539
3. Finland 536
4. Hong Kong, China 533
5. Singapore 526
6. Canada 524
7. New Zealand 521
8. Japan 520
9. Australia 515
10. Netherlands 508
11. Belgium 506
12. Norway 503
13. Estonia 501
14. Switzerland 501
15. Poland 500
16. Iceland 500
17. United States 500
18. Liechtenstein 499
19. Sweden 497
20. Germany 497
21. Ireland 496
22. France 496
23. Taiwan 495
24. Denmark 495
25. United Kingdom 494
26. Hungary 494
27. Portugal 489
28. Macau, China 487
29. Italy 486
30. Latvia 484
31. Greece 483
32. Slovenia 483
33. Spain 481
34. Czech Republic 478
35. Slovakia 477
36. Croatia 476
37. Israel 474
38. Luxembourg 472
39. Austria 470
40. Lithuania 468
41. Turkey 464
42. Russia 459
43. Chile 449
44. Costa Rica 443
45. Malta 442
46. Serbia 442
47. United Arab Emirates 431
48. Bulgaria 429
49. Uruguay 426
50. Mexico 425
51. Romania 424
52. Miranda, Venezuela 422
53. Thailand 421
54. Trinidad and Tobago 416
55. Malaysia 414
56. Colombia 413
57. Brazil 412
58. Montenegro 408
59. Mauritius 407
60. Jordan 405
61. Tunisia 404
62. Indonesia 402
63. Argentina 398
64. Kazakhstan 390
65. Moldova 388
66. Albania 385
67. Georgia 374
68. Qatar 372
69. Panama 371
70. Peru 370
71. Azerbaijan 362
72. Tamil Nadu, India 337
73. Himachal Pradesh, India 317
74. Kyrgyzstan 314

† Participants in PISA 2009+, which were tested in 2010 after the main group of 65.[12]

The PISA 2009 mathematics, science and reading results are also published as world maps. In the mathematics map, dark blue nations scored statistically significantly above the OECD average, intermediate blue nations did not differ statistically significantly from it, and light blue nations scored statistically significantly below it. For some nations (China, India, Venezuela), only students from limited areas were tested, as indicated on the map.

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science. The values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations between different scales and studies indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. Western countries perform slightly better in PISA; Eastern European and Asian countries perform slightly better in TIMSS. Content balance and years of schooling explain most of the variation.[13]
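A minimal sketch of the country-level computation behind such figures, using made-up placeholder scores rather than actual PISA or TIMSS results:

```python
import numpy as np

# Placeholder country means for two hypothetical studies (NOT real data).
pisa_means  = {"A": 544, "B": 532, "C": 503, "D": 466, "E": 423}
timss_means = {"A": 540, "B": 529, "C": 511, "D": 480, "E": 430}

# Correlate over the countries that took part in both studies.
common = sorted(set(pisa_means) & set(timss_means))
x = np.array([pisa_means[c] for c in common], dtype=float)
y = np.array([timss_means[c] for c in common], dtype=float)
r = np.corrcoef(x, y)[0, 1]
print(f"country-level correlation over {len(common)} countries: r = {r:.2f}")
```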


For many countries, the results from PISA 2000 were surprising. In Germany and the United States, for example, the comparatively low scores brought on heated debate, reflected in prominent national newspaper headlines, about how the school system should be changed.[citation needed]

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[14]


Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and that the high scores in China are due to excessive workload and testing, adding that it is "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong."[15]


Of the 74 economies tested in the PISA 2009 cycle, including the "+" participants, the two Indian states placed 72nd and 73rd out of 74 in both reading and mathematics, and 73rd and 74th in science. The poor result was greeted with dismay in the Indian media.[16] The BBC reported that, as of 2008, only 15% of India's students reach high school.[17]

Research on causes of country differences

Large international student assessment programmes such as PISA and TIMSS have provided essential data for many recent analyses of how student achievement relates to society-wide outcomes such as economic development,[18] democratization and health.[19]

Although PISA and TIMSS officials and researchers themselves generally refrain from hypothesizing about the large and stable differences in student achievement between countries, other researchers have investigated single educational factors such as central exams,[20] private schools, or streaming between schools at a later age.[21] An extensive literature on cross-country differences in scores has also developed since 2000.[1]


Finland's stable, strong results have attracted a lot of attention. According to Hannu Simola,[22] the results reflect a paradoxical mix of progressive policies implemented in a rather conservative pedagogic setting, where teachers' high levels of academic preparation, social status, professionalism and motivation for the job coexist with adherence to traditional roles and methods by both teachers and pupils in Finland's changing, but still rather authoritarian, culture. Others have suggested that Finland's low poverty rate is a reason for its success.[23][24] It has also been suggested that the Finnish language plays an important part in Finland's PISA success.[25]

Lynn and Meisenberg (2010) found very high correlations (r > 0.90) at the country level between mean student assessment results from PISA, TIMSS, PIRLS and other studies and IQ measurements.[26]

An evaluation of the 2003 results showed that countries that spent more on education did not necessarily do better. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, South Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more but was below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States did, for example, but the USA came 24th out of 29 countries compared.[citation needed]

Another point made in the evaluation was that students with higher-earning parents are better educated and tend to achieve higher results. This was true in all the countries tested, although it was more pronounced in certain countries, such as Germany.[citation needed]


In 2010, the PISA 2009 results revealed that Shanghai students scored the highest in the world in every category (mathematics, reading and science). The OECD described Shanghai as a pioneer of educational reform, noting that "there has been a sea change in pedagogy", and pointed out that the authorities "abandoned their focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[27]

The OECD has also noted that even in rural China results approached average levels for the OECD countries: "Citing further, as-yet unpublished OECD research, Mr Schleicher said, 'We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average.'"[28] For a developing country, China's 99.4% enrolment in primary education is already, as the OECD puts it, "the envy of many countries", while junior secondary school participation rates in China are now 99%. In Shanghai, not only has senior secondary school enrolment reached 98%, but admission into higher education has reached 80% of the relevant age group. That this growth reflects quality, not just quantity, is confirmed by the OECD's ranking of Shanghai's secondary education as number one in the world.[28] According to the OECD, China has also expanded school access and moved away from learning by rote.[29] "'The last point is key: Russia performs well in rote-based assessments, but not in Pisa,' says Schleicher, head of the indicators and analysis division at the OECD's directorate for education. 'China does well in both rote-based and broader assessments.'"[28]

United States

Two studies have compared high achievers in mathematics on the PISA and on the U.S. National Assessment of Educational Progress (NAEP), matching those scoring at the "advanced" and "proficient" levels in mathematics on the NAEP with the corresponding performance on the PISA. Overall, 30 nations had higher percentages of students at the "advanced" level of mathematics than the U.S. The only OECD countries with worse results were Portugal, Greece, Turkey, and Mexico. Six percent of U.S. students were "advanced" in mathematics, compared to 28 percent in Taiwan. Massachusetts, the highest-ranked U.S. state, would have placed just 15th in the world had it been compared with the nations participating in the PISA. 31 nations had higher percentages of "proficient" students than the U.S.; Massachusetts was again the best U.S. state, but it would have ranked just ninth in the world.[30][31]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[32] The difference in apparent rankings is, however, almost entirely accounted for by the sampling of countries: PISA includes all of the OECD countries, while TIMSS sampling is weighted much more toward developing countries.
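The sampling effect can be illustrated with a small sketch that re-ranks each study over only the countries common to both; all scores below are invented placeholders, not actual results:

```python
# Illustrative only: rank a country within each study, restricted to the
# set of countries that participated in both studies, so that differences
# in country sampling no longer drive the comparison.
def rank_on_common(scores_a: dict, scores_b: dict, country: str) -> tuple:
    common = set(scores_a) & set(scores_b)
    def rank(scores: dict) -> int:
        ordered = sorted(common, key=lambda c: scores[c], reverse=True)
        return ordered.index(country) + 1
    return rank(scores_a), rank(scores_b)

# Placeholder scores: study B omits low-scoring country "Z", flattering "US".
study_a = {"US": 487, "W": 530, "X": 510, "Y": 497, "Z": 400}
study_b = {"US": 508, "W": 525, "X": 505}
print(rank_on_common(study_a, study_b, "US"))  # ranks over {US, W, X}: (3, 2)
```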


University of Southern California professor Stephen Krashen[33] and Mel Riddile of the NASSP say that low performance in the United States is closely related to American poverty, but the same reasoning applies to other countries.[23][24]

Participation in free or reduced-price school lunch programmes is the only available within-country poverty indicator for US schoolchildren; US schools where fewer than 10% of students receive free or reduced-price lunch averaged a score of 551, higher than any OECD country. The table below compares US schools, grouped by lunch-programme participation, with other OECD countries, grouped by their rates of relative child poverty:[24]

Country | Reduced school lunches (US)[24] or relative child poverty (other OECD)[34] | PISA score[35]
United States | < 10% | 551
Finland | 3.4% | 536
Netherlands | 9.0% | 508
Belgium | 6.7% | 506
United States | 10%–24.9% | 527
Canada | 13.6% | 524
New Zealand | 16.3% | 521
Japan | 14.3% | 520
Australia | 11.6% | 515
United States | 25%–49.9% | 502
Estonia | 40.1% | 501
United States | 50%–74.9% | 471
Russian Federation | 58.3% | 459
United States | > 75% | 446


According to the OECD's PISA, the average Portuguese 15-year-old student was for many years underachieving in reading literacy, mathematics and science relative to the OECD, nearly tied with Italian students and just above those from countries like Greece, Turkey and Mexico. Since 2010, however, PISA results for Portuguese students have improved dramatically. The Portuguese Ministry of Education's office for educational evaluation, GAVE (Gabinete de Avaliação do Ministério da Educação), published a 2010 report which criticized the PISA 2009 results and claimed that the average Portuguese teenage student had profound handicaps in expression, communication and logic, as well as low performance when asked to solve problems. It also claimed that these shortcomings are not exclusive to Portugal but occur in other countries as well, owing to the way PISA was designed.[36]

References


  1. Hanushek, Eric A., and Ludger Woessmann (2011). "The economics of international differences in educational achievement." In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland: 89–200.
  2. PISA 2009 Technical Report, OECD, 2012.
  3. Chapter 2 of the publication "PISA 2003 Assessment Framework" (PDF).
  5. C. Füller: "Pisa hat einen kleinen, fröhlichen Bruder" ("PISA has a little, cheerful brother"). taz, 5 December 2007 (in German).
  6. The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003, and 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke, Die Insignifikanz signifikanter Unterschiede ("The insignificance of significant differences", 2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
  7. PISA 2009, OECD.
  8. OECD (2001) p. 53; OECD (2004a) p. 92; OECD (2007) p. 56.
  9. E.g. OECD (2001), chapters 7 and 8: influence of school organization and socio-economic background upon performance in the reading test. Reading was the main domain of PISA 2000.
  10. Multi-dimensional Data Request, OECD, 2010. Retrieved 2012-06-28.
  11. PISA 2009 Results: Executive Summary (Figure 1 only), OECD, 2010. Retrieved 2012-06-28.
  12. Walker, Maurice (2011). PISA 2009 Plus Results, OECD. Retrieved 2012-06-28.
  13. M. L. Wu: A Comparison of PISA and TIMSS 2003 Achievement Results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
  14. Waiting for "Superman" trailer. Retrieved October 8, 2010.
  15. Yong Zhao (2010-12-10). A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students' Top PISA Performance.
  16. Vishnoi, Anubhuti (2012-01-07). "Poor PISA ranks: HRD seeks reason", The Indian Express.
  17. Masani, Zareer (February 27, 2008). "India still Asia's reluctant tiger", BBC News.
  18. Hanushek, Eric; Woessmann, Ludger (2008). "The role of cognitive skills in economic development", Journal of Economic Literature 46 (3): 607–668. doi:10.1257/jel.46.3.607.
  19. Rindermann, Heiner; Ceci, Stephen J. (2009). "Educational policy and country outcomes in international cognitive competence studies", Perspectives on Psychological Science 4 (6): 551–577. doi:10.1111/j.1745-6924.2009.01165.x.
  20. Bishop, John H. (1997). "The effect of national standards and curriculum-based exams on achievement", American Economic Review 87 (2): 260–264.
  21. Hanushek, Eric; Woessmann, Ludger (2006). "Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries", Economic Journal 116 (510): C63–C76.
  22. Simola, H. (2005). "The Finnish miracle of PISA: historical and sociological remarks on teaching and teacher education", Comparative Education 41: 455–470.
  23. "The Economics Behind International Education Rankings", National Education Association.
  24. Riddile, Mel (2010-12-15). "PISA: It's Poverty Not Stupid", National Association of Secondary School Principals.
  25. "Why does Finnish give better PISA results?"
  26. Lynn, R. & Meisenberg, G. (2010). "National IQs calculated and validated for 108 nations", Intelligence 38: 353–360.
  27. Gumbel, Peter (2010-12-07). "China Beats Out Finland for Top Marks in Education", TIME. Retrieved 2012-06-27.
  28. Cook, Chris (2010-12-07). "Shanghai tops global state school rankings", Financial Times. Retrieved 2012-06-28.
  29. Mance, Henry (2010-12-07). "Why are Chinese schoolkids so good?", Financial Times. Retrieved 2012-06-28.
  30. Peterson, Paul E.; Woessmann, Ludger; Hanushek, Eric A.; Lastra-Anadón, Carlos X. (2011). "Are U.S. students ready to compete? The latest on each state's international standing", Education Next 11 (4): 51–59.
  31. Hanushek, Eric A.; Peterson, Paul E.; Woessmann, Ludger (2011). "Teaching math to the talented", Education Next 11 (1): 10–18.
  32. Phillips, Gary W. (2007). Chance Favors the Prepared Mind: Mathematics and Science Indicators for Comparing States. Washington, DC: American Institutes for Research (November 14); Phillips, Gary W. (2009). The Second Derivative: International Benchmarks in Mathematics for U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  33. "How poverty affected U.S. PISA scores", The Washington Post.
  34. "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  35. Highlights From PISA 2009, Table 3.
  36. Estudo do ministério aponta graves problemas aos alunos portugueses ("Ministry study points to serious problems among Portuguese students"), GAVE (Gabinete de Avaliação do Ministério da Educação) 2010 report, in RTP (in Portuguese).

Further reading

Official websites and reports

  • OECD/PISA website
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD. ISBN 92-64-17053-7
    • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
    • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD. ISBN 978-92-64-10172-2
    • OECD (2004a): Learning for Tomorrow's World. First Results from PISA 2003. Paris: OECD. ISBN 978-92-64-00724-6
    • OECD (2004b): Problem Solving for Tomorrow's World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD. ISBN 978-92-64-00642-3
    • OECD (2005): PISA 2003 Technical Report. Paris: OECD. ISBN 978-92-64-01053-6
    • OECD (2007): Science Competencies for Tomorrow's World: Results from PISA 2006

Reception and political consequences

  • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD ("Diffusion through international organisations: the education policy of the OECD"). In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007 (in German).


  • N. Mons, X. Pons: The reception and use of Pisa in France.


  • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33–34.
  • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v 32 n 5 pp 619–634 Nov 2006.

United Kingdom

  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland.



  • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
  • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms ("PISA & Co: critique of a programme"). Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
  • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. ("Global elites, local authorities: education and science under the regime of PISA, McKinsey & Co."). Frankfurt am Main: Suhrkamp, 2009. ISBN 978-3-518-12560-1 (in German)

