
Programme for International Student Assessment

The Programme for International Student Assessment (PISA) is a worldwide study by the Organisation for Economic Co-operation and Development (OECD) in member and non-member nations intended to evaluate educational systems by measuring 15-year-old school pupils' scholastic performance on mathematics, science, and reading.[1] It was first performed in 2000 and then repeated every three years. Its aim is to provide comparable data with a view to enabling countries to improve their education policies and outcomes. It measures problem solving and cognition.[2]

Programme for International Student Assessment
Abbreviation: PISA
Formation: 1997
Purpose: Comparison of education attainment across the world
Headquarters: OECD Headquarters
Location: 2 rue André Pascal, 75775 Paris Cedex 16
Region served: World
Membership: 79 government education departments
Official language: English and French
Head of the Early Childhood and Schools Division: Yuri Belfali
Main organ: PISA Governing Body (Chair: Michele Bruniges)
Parent organization: OECD
Website: www.oecd.org/pisa/
PISA average Mathematics scores (2018)
PISA average Science scores (2018)
PISA average Reading scores (2018)

The results of the 2018 data collection were released on 3 December 2019.[3]

Influence and impact

PISA and similar international standardised assessments of educational attainment are increasingly used in the process of education policymaking at both national and international levels.[4]

PISA was conceived to set in a wider context the information provided by national monitoring of education system performance through regular assessments within a common, internationally agreed framework. By investigating relationships between student learning and other factors, such assessments can "offer insights into sources of variation in performances within and between countries".[5]

Until the 1990s, few European countries used national tests. In the 1990s, ten countries or regions introduced standardised assessment, and since the early 2000s, ten more followed suit. By 2009, only five European education systems had no national student assessments.[4]

The impact of these international standardised assessments in the field of educational policy has been significant, in terms of the creation of new knowledge, changes in assessment policy, and external influence over national educational policy more broadly.[who?][citation needed]

Creation of new knowledge

Data from international standardised assessments can be useful in research on causal factors within or across education systems.[4] Mons notes that the databases generated by large-scale international assessments have made it possible to carry out inventories and comparisons of education systems on an unprecedented scale, on themes ranging from the conditions for learning mathematics and reading, to institutional autonomy and admissions policies.[6] They allow typologies to be developed that can be used for comparative statistical analyses of education performance indicators, thereby identifying the consequences of different policy choices. They have generated new knowledge about education: PISA findings have challenged deeply embedded educational practices, such as the early tracking of students into vocational or academic pathways.[7]

  • 79 countries and economies participated in the 2018 data collection.

Barroso and de Carvalho find that PISA provides a common reference connecting academic research in education and the political realm of public policy, operating as a mediator between different strands of knowledge from the realm of education and public policy.[8] However, although the key findings from comparative assessments are widely shared in the research community,[4] the knowledge they create does not necessarily fit with government reform agendas; this leads to some inappropriate uses of assessment data.

Changes in national assessment policy

Emerging research suggests that international standardised assessments are having an impact on national assessment policy and practice. PISA is being integrated into national policies and practices on assessment, evaluation, curriculum standards and performance targets; its assessment frameworks and instruments are being used as best-practice models for improving national assessments; many countries have explicitly incorporated and emphasised PISA-like competencies in revised national standards and curricula; others use PISA data to complement national data and validate national results against an international benchmark.[7]

External influence over national educational policy

More important than its influence on countries' student assessment policies is the range of ways in which PISA is influencing countries' education policy choices.

Policy-makers in most participating countries see PISA as an important indicator of system performance; PISA reports can define policy problems and set the agenda for national policy debate; policymakers seem to accept PISA as a valid and reliable instrument for internationally benchmarking system performance and changes over time; most countries—irrespective of whether they performed above, at, or below the average PISA score—have begun policy reforms in response to PISA reports.[7]

Against this, impact on national education systems varies markedly. For example, in Germany, the results of the first PISA assessment caused the so-called 'PISA shock': a questioning of previously accepted educational policies; in a state marked by jealously guarded regional policy differences, it led ultimately to an agreement by all Länder to introduce common national standards and even an institutionalised structure to ensure that they were observed.[9] In Hungary, by comparison, which shared similar conditions to Germany, PISA results have not led to significant changes in educational policy.[10]

Because many countries have set national performance targets based on their relative rank or absolute PISA score, PISA assessments have increased the influence of their (non-elected) commissioning body, the OECD, as an international education monitor and policy actor, which implies an important degree of 'policy transfer' from the international to the national level; PISA in particular is having "an influential normative effect on the direction of national education policies".[7] Thus, it is argued that the use of international standardised assessments has led to a shift towards international, external accountability for national system performance; Rey contends that PISA surveys, portrayed as objective, third-party diagnoses of education systems, actually serve to promote specific orientations on educational issues.[4]

National policy actors refer to high-performing PISA countries to "help legitimise and justify their intended reform agenda within contested national policy debates".[11] PISA data can be "used to fuel long-standing debates around pre-existing conflicts or rivalries between different policy options, such as in the French Community of Belgium".[12] In such instances, PISA assessment data are used selectively: in public discourse governments often only use superficial features of PISA surveys such as country rankings and not the more detailed analyses. Rey (2010:145, citing Greger, 2008) notes that often the real results of PISA assessments are ignored as policymakers selectively refer to data in order to legitimise policies introduced for other reasons.[13]

In addition, PISA's international comparisons can be used to justify reforms with which the data themselves have no connection; in Portugal, for example, PISA data were used to justify new arrangements for teacher assessment (based on inferences that were not justified by the assessments and data themselves); they also fed the government's discourse about the issue of pupils repeating a year (which, according to research, fails to improve student results).[14] In Finland, the country's PISA results (deemed excellent in other countries) were used by Ministers to promote new policies for 'gifted' students.[15] Such uses and interpretations often assume causal relationships that cannot legitimately be based upon PISA data, which would normally require fuller investigation through qualitative in-depth studies and longitudinal surveys based on mixed quantitative and qualitative methods,[16] which politicians are often reluctant to fund.

Recent decades have witnessed an expansion in the uses of PISA and similar assessments, from assessing students' learning, to connecting "the educational realm (their traditional remit) with the political realm".[17] This raises the question of whether PISA data are sufficiently robust to bear the weight of the major policy decisions that are being based upon them, for, according to Breakspear, PISA data have "come to increasingly shape, define and evaluate the key goals of the national / federal education system".[7] This implies that those who set the PISA tests – e.g. in choosing the content to be assessed and not assessed – are in a position of considerable power to set the terms of the education debate, and to orient educational reform in many countries around the globe.[7]

Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA's methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA's Progress in International Reading Literacy Study (PIRLS).

PISA aims to test the literacy, that is, the competence, of students in three fields: reading, mathematics and science, measured on an indefinite scale.[18]

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education's application to real-life problems and lifelong learning (workforce knowledge).

In the reading test, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling." Instead, they should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts."[19]

PISA also assesses students in innovative domains. In 2012 and 2015, in addition to reading, mathematics and science, students were tested in collaborative problem solving. In 2018 the additional innovative domain was global competence.

Implementation

PISA is sponsored, governed, and coordinated by the OECD, but paid for by participating countries.[citation needed]

Method of testing

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students. This made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.

Test

 
PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

Each student takes a two-hour computer-based test. Part of the test is multiple-choice and part involves fuller answers. There are six and a half hours of assessment material, but no single student is tested on all of it. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation, and family. School directors fill in a questionnaire describing school demographics, funding, etc. In 2012 the participants were, for the first time in the history of large-scale testing and assessments, offered a new type of problem, namely interactive (complex) problems requiring exploration of a novel virtual device.[20][21]
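As a toy illustration of how more assessment material can exist than any one student sees, the sketch below rotates item clusters across booklets; the cluster count, booklet size, and assignment rule are assumptions chosen for illustration only, not the actual PISA booklet design.

    # Toy sketch of a rotated booklet design (assumed numbers, not the actual
    # PISA specification): the item pool is split into clusters, and each
    # booklet contains only a subset, so no student works through every item.
    from itertools import combinations

    clusters = [f"cluster_{i}" for i in range(1, 8)]  # assumed: 7 item clusters
    booklet_size = 4                                  # assumed: 4 clusters per booklet

    # A real design balances cluster positions and pairings across booklets;
    # this sketch simply enumerates a handful of possible booklets.
    booklets = list(combinations(clusters, booklet_size))[:13]
    for number, booklet in enumerate(booklets, start=1):
        print(f"Booklet {number}: {', '.join(booklet)}")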

In selected countries, PISA has started experimenting with computer-adaptive testing.

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: On the day following the international test, students take a national test called PISA-E (E=Ergänzung=complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in the international and the national test, another 45,000 take the national test only. This large sample is needed to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the "PISA" label for national tests.[22]

Data scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be 'scaled' to allow meaningful comparisons. Scores are thus scaled so that the OECD average in each domain (mathematics, reading and science) is 500 and the standard deviation is 100.[23] This holds only for the initial PISA cycle, when the scale was first introduced; subsequent cycles are linked to the previous ones through IRT scale-linking methods.[24]
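A minimal sketch of that rescaling step, assuming a plain vector of provisional proficiency estimates and ignoring the sampling weights and multi-step linking used operationally:

    import numpy as np

    def rescale_to_pisa_metric(theta, target_mean=500.0, target_sd=100.0):
        # Linearly transform provisional estimates so that their mean is 500
        # and their standard deviation is 100, the metric of the first cycle.
        theta = np.asarray(theta, dtype=float)
        return target_mean + target_sd * (theta - theta.mean()) / theta.std()

    # Example: provisional logit-scale estimates for a handful of students.
    print(rescale_to_pisa_metric([-1.2, -0.3, 0.0, 0.4, 1.1]))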

This generation of proficiency estimates is done using a latent regression extension of the Rasch model, a model of item response theory (IRT), also known as conditioning model or population model. The proficiency estimates are provided in the form of so-called plausible values, which allow unbiased estimates of differences between groups. The latent regression, together with the use of a Gaussian prior probability distribution of student competencies allows estimation of the proficiency distributions of groups of participating students.[25] The scaling and conditioning procedures are described in nearly identical terms in the Technical Reports of PISA 2000, 2003, 2006. NAEP and TIMSS use similar scaling methods.
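For reference, the basic Rasch model on which this scaling builds gives the probability that student i answers item j correctly in terms of the student's proficiency \theta_i and the item's difficulty b_j; the latent-regression and plausible-value machinery described above extends this core model:

    P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}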

Ranking results

All PISA results are tabulated by country; recent PISA cycles have separate provincial or regional results for some countries. Most public attention concentrates on just one outcome: the mean scores of countries and their ranking against one another. In the official reports, however, country-by-country rankings are given not as simple league tables but as cross tables indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.[citation needed]
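As an illustration of the kind of significance check involved, the sketch below applies a standard two-sample z-test for a difference of country means, with hypothetical standard errors of about 3 score points each (actual PISA standard errors vary by country and domain):

    import math

    def significant_difference(mean_a, mean_b, se_a, se_b, z=1.96):
        # The difference is treated as significant if it exceeds 1.96 standard
        # errors of the difference, roughly the 95% confidence level.
        se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
        return abs(mean_a - mean_b) > z * se_diff

    # With standard errors of about 3 points each, the threshold is
    # 1.96 * sqrt(3**2 + 3**2), roughly 8.3 points, so a 9-point gap qualifies.
    print(significant_difference(509, 500, 3.0, 3.0))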

PISA never combines mathematics, science and reading domain scores into an overall score. However, commentators have sometimes combined test results from all three domains into an overall country ranking. Such meta-analysis is not endorsed by the OECD, although official summaries sometimes use scores from a testing cycle's principal domain as a proxy for overall student ability.

PISA 2018 ranking summary

The results of PISA 2018 were presented on 3 December 2019, which included data for around 600,000 participating students in 79 countries and economies, with China's economic area of Beijing, Shanghai, Jiangsu and Zhejiang emerging as the top performer in all categories.[26] Note that this does not represent the entirety of mainland China.[27] Reading results for Spain were not released due to perceived anomalies.[28]

Mathematics
1   China (B-S-J-Z)[a] 591
2   Singapore 569
3   Macau 558
4   Hong Kong 551
5   Taiwan 531
6   Japan 527
7   South Korea 526
8   Estonia 523
9   Netherlands 519
10   Poland 516
11    Switzerland 515
12   Canada 512
13   Denmark 509
13   Slovenia 509
15   Belgium 508
16   Finland 507
17   Sweden 502
17   United Kingdom 502
19   Norway 501
20   Germany 500
20   Ireland 500
22   Czech Republic 499
22   Austria 499
24   Latvia 496
24   Vietnam 496
26   France 495
26   Iceland 495
28   New Zealand 494
29   Portugal 492
30   Australia 491
International Average (OECD) 489
31   Russia 488
32   Italy 487
33   Slovakia 486
34   Luxembourg 483
35   Lithuania 481
35   Spain 481
35   Hungary 481
38   United States 478
39   Belarus 472
39   Malta 472
41   Croatia 464
42   Israel 463
43   Turkey 454
44   Ukraine 453
45   Cyprus 451
45   Greece 451
47   Serbia 448
48   Malaysia 440
49   Albania 437
50   Bulgaria 436
51   United Arab Emirates 435
52   Brunei 430
52   Montenegro 430
52   Romania 430
55   Kazakhstan 423
56   Moldova 421
57   Azerbaijan 420
58   Thailand 419
59   Uruguay 418
60   Chile 417
61   Qatar 414
62   Mexico 409
63   Bosnia and Herzegovina 406
64   Costa Rica 402
65   Jordan 400
65   Peru 400
67   Georgia 398
68   North Macedonia 394
69   Lebanon 393
70   Colombia 391
71   Brazil 384
72   Argentina 379
72   Indonesia 379
74   Saudi Arabia 373
75   Morocco 368
76   Kosovo 366
77   Panama 353
77   Philippines 353
79   Dominican Republic 325
Science
1   China (B-S-J-Z)[a] 590
2   Singapore 551
3   Macau 544
4   Vietnam 543
5   Estonia 530
6   Japan 529
7   Finland 522
8   South Korea 519
9   Canada 518
10   Hong Kong 517
11   Taiwan 516
12   Poland 511
13   New Zealand 508
14   Slovenia 507
15   United Kingdom 505
16   Australia 503
16   Germany 503
16   Netherlands 503
19   United States 502
20   Belgium 499
20   Sweden 499
22   Czech Republic 497
23   Ireland 496
24    Switzerland 495
25   Denmark 493
25   France 493
27   Portugal 492
28   Austria 490
28   Norway 490
International Average (OECD) 489
30   Latvia 487
31   Spain 483
32   Lithuania 482
33   Hungary 481
34   Russia 478
35   Luxembourg 477
36   Iceland 475
37   Croatia 472
38   Belarus 471
39   Ukraine 469
40   Italy 468
40   Turkey 468
42   Slovakia 464
43   Israel 462
44   Malta 457
45   Greece 452
46   Chile 444
47   Serbia 440
48   Cyprus 439
49   Malaysia 438
50   United Arab Emirates 434
51   Brunei 431
52   Jordan 429
53   Moldova 428
54   Romania 426
54   Thailand 426
54   Uruguay 426
57   Bulgaria 424
58   Mexico 419
58   Qatar 419
60   Albania 417
61   Costa Rica 416
62   Montenegro 415
63   Colombia 413
63   North Macedonia 413
65   Argentina 404
65   Brazil 404
65   Peru 404
68   Azerbaijan 398
68   Bosnia and Herzegovina 398
70   Kazakhstan 397
71   Indonesia 396
72   Saudi Arabia 386
73   Lebanon 384
74   Georgia 383
75   Morocco 377
76   Kosovo 365
76   Panama 365
78   Philippines 357
79   Dominican Republic 336
Reading
1   China (B-S-J-Z)[a] 555
2   Singapore 549
3   Macau 525
4   Hong Kong 524
5   Estonia 523
6   Canada 520
6   Finland 520
8   Ireland 518
9   South Korea 514
10   Poland 512
11   New Zealand 506
11   Sweden 506
13   United States 505
13   Vietnam 505
15   Japan 504
15   United Kingdom 504
17   Australia 503
17   Taiwan 503
19   Denmark 501
20   Norway 499
21   Germany 498
22   Slovenia 495
23   Belgium 493
23   France 493
25   Portugal 492
26   Czech Republic 490
International Average (OECD) 487
27   Netherlands 485
28   Austria 484
28    Switzerland 484
30   Croatia 479
30   Latvia 479
30   Russia 479
33   Hungary 476
33   Italy 476
33   Lithuania 476
36   Belarus 474
36   Iceland 474
38   Israel 470
38   Luxembourg 470
40   Turkey 466
40   Ukraine 466
42   Slovakia 458
43   Greece 457
44   Chile 452
45   Malta 448
46   Serbia 439
47   United Arab Emirates 432
48   Romania 428
49   Uruguay 427
50   Costa Rica 426
51   Cyprus 424
51   Moldova 424
53   Montenegro 421
54   Bulgaria 420
54   Mexico 420
56   Jordan 419
57   Malaysia 415
58   Brazil 413
59   Colombia 412
60   Brunei 408
61   Qatar 407
62   Albania 405
63   Bosnia and Herzegovina 403
64   Argentina 402
65   Peru 401
66   Saudi Arabia 399
67   North Macedonia 393
67   Thailand 393
69   Azerbaijan 389
70   Kazakhstan 387
71   Georgia 380
72   Panama 377
73   Indonesia 371
74   Morocco 359
75   Kosovo 353
75   Lebanon 353
77   Dominican Republic 342
78   Philippines 340
     
     

Rankings comparison 2000–2015

Mathematics
Country 2015 2012 2009 2006 2003 2000
Score Rank Score Rank Score Rank Score Rank Score Rank Score Rank
International Average (OECD) 490 494 495 494 499 492
  Albania 413 57 394 54 377 53 381 33
  Algeria 360 72
  Argentina 409 58 388 30
  Australia 494 25 504 17 514 13 520 12 524 10 533 6
  Austria 497 20 506 16 496 22 505 17 506 18 503 12
  China B-S-J-G[b] 531 6
  Belgium 507 15 515 13 515 12 520 11 529 7 520 8
  Brazil 377 68 389 55 386 51 370 50 356 39 334 35
  Bulgaria 441 47 439 43 428 41 413 43 430 28
  Argentina CABA[c] 456 43 418 49
  Canada 516 10 518 11 527 8 527 7 532 6 533 6
  Chile 423 50 423 47 421 44 411 44 384 32
  Taiwan 542 4 560 3 543 4 549 1
  Colombia 390 64 376 58 381 52 370 49
  Costa Rica 400 62 407 53
  Croatia 464 41 471 38 460 38 467 34
  Cyprus 437 48
  Czech Republic 492 28 499 22 493 25 510 15 516 12 498 14
  Denmark 511 12 500 20 503 17 513 14 514 14 514 10
  Dominican Republic 328 73
  Estonia 520 9 521 9 512 15 515 13
  Finland 511 13 519 10 541 5 548 2 544 2 536 5
  France 493 26 495 23 497 20 496 22 511 15 517 9
  Macedonia 371 69 381 33
  Georgia 404 60
  Germany 506 16 514 14 513 14 504 19 503 19 490 16
  Greece 454 44 453 40 466 37 459 37 445 32 447 24
  Hong Kong 548 2 561 2 555 2 547 3 550 1 560 1
  Hungary 477 37 477 37 490 27 491 26 490 25 488 17
  Iceland 488 31 493 25 507 16 506 16 515 13 514 10
  Indonesia 386 66 375 60 371 55 391 47 360 37 367 34
  Ireland 504 18 501 18 487 30 501 21 503 20 503 12
  Israel 470 39 466 39 447 39 442 38 433 26
  Italy 490 30 485 30 483 33 462 36 466 31 457 22
  Japan 532 5 536 6 529 7 523 9 534 5 557 2
  Jordan 380 67 386 57 387 50 384 48
  Kazakhstan 460 42 432 45 405 48
  South Korea 524 7 554 4 546 3 547 4 542 3 547 3
  Kosovo 362 71
  Latvia 482 34 491 26 482 34 486 30 483 27 463 21
  Lebanon 396 63
  Lithuania 478 36 479 35 477 35 486 29
  Luxembourg 486 33 490 27 489 28 490 27 493 23 446 25
  Macau 544 3 538 5 525 10 525 8 527 8
  Malaysia 446 45 421 48
  Malta 479 35
  Mexico 408 59 413 50 419 46 406 45 385 36 387 31
  Moldova 420 52
  Montenegro 418 54 410 51 403 49 399 46
  Netherlands 512 11 523 8 526 9 531 5 538 4
  New Zealand 495 21 500 21 519 11 522 10 523 11 537 4
  Norway 502 19 489 28 498 19 490 28 495 22 499 13
  Peru 387 65 368 61 365 57 292 36
  Poland 504 17 518 12 495 23 495 24 490 24 470 20
  Portugal 492 29 487 29 487 31 466 35 466 30 454 23
  Qatar 402 61 376 59 368 56 318 52
  Romania 444 46 445 42 427 42 415 42 426 29
  Russia 494 23 482 32 468 36 476 32 468 29 478 18
  Singapore 564 1 573 1 562 1
  Slovakia 475 38 482 33 497 21 492 25 498 21
  Slovenia 510 14 501 19 501 18 504 18
  Spain 486 32 484 31 483 32 480 31 485 26 476 19
  Sweden 494 24 478 36 494 24 502 20 509 16 510 11
   Switzerland 521 8 531 7 534 6 530 6 527 9 529 7
  Thailand 415 56 427 46 419 45 417 41 417 35 432 27
  Trinidad and Tobago 417 55 414 47
  Tunisia 367 70 388 56 371 54 365 51 359 38
  Turkey 420 51 448 41 445 40 424 40 423 33
  United Arab Emirates 427 49 434 44
  United Kingdom 492 27 494 24 492 26 495 23 508 17 529 7
  United States 470 40 481 34 487 29 474 33 483 28 493 15
  Uruguay 418 53 409 52 427 43 427 39 422 34
  Vietnam 495 22 511 15
Science
Country 2015 2012 2009 2006
Score Rank Score Rank Score Rank Score Rank
International Average (OECD) 493 501 501 498
  Albania 427 54 397 58 391 54
  Algeria 376 72
  Argentina 432 52
  Australia 510 14 521 14 527 9 527 8
  Austria 495 26 506 21 494 28 511 17
  China B-S-J-G[b] 518 10
  Belgium 502 20 505 22 507 19 510 18
  Brazil 401 66 402 55 405 49 390 49
  Bulgaria 446 46 446 43 439 42 434 40
  Argentina CABA[c] 475 38 425 49
  Canada 528 7 525 9 529 7 534 3
  Chile 447 45 445 44 447 41 438 39
  Taiwan 532 4 523 11 520 11 532 4
  Colombia 416 60 399 56 402 50 388 50
  Costa Rica 420 58 429 47
  Croatia 475 37 491 32 486 35 493 25
  Cyprus 433 51
  Czech Republic 493 29 508 20 500 22 513 14
  Denmark 502 21 498 25 499 24 496 23
  Dominican Republic 332 73
  Estonia 534 3 541 5 528 8 531 5
  Finland 531 5 545 4 554 1 563 1
  France 495 27 499 24 498 25 495 24
  Macedonia 384 70
  Georgia 411 63
  Germany 509 16 524 10 520 12 516 12
  Greece 455 44 467 40 470 38 473 37
  Hong Kong 523 9 555 1 549 2 542 2
  Hungary 477 35 494 30 503 20 504 20
  Iceland 473 39 478 37 496 26 491 26
  Indonesia 403 65 382 60 383 55 393 48
  Ireland 503 19 522 13 508 18 508 19
  Israel 467 40 470 39 455 39 454 38
  Italy 481 34 494 31 489 33 475 35
  Japan 538 2 547 3 539 4 531 6
  Jordan 409 64 409 54 415 47 422 43
  Kazakhstan 456 43 425 48 400 53
  South Korea 516 11 538 6 538 5 522 10
  Kosovo 378 71
  Latvia 490 31 502 23 494 29 490 27
  Lebanon 386 68
  Lithuania 475 36 496 28 491 31 488 31
  Luxembourg 483 33 491 33 484 36 486 33
  Macau 529 6 521 15 511 16 511 16
  Malaysia 443 47 420 50
  Malta 465 41
  Mexico 416 61 415 52 416 46 410 47
  Moldova 428 53
  Montenegro 411 62 410 53 401 51 412 46
  Netherlands 509 17 522 12 522 10 525 9
  New Zealand 513 12 516 16 532 6 530 7
  Norway 498 24 495 29 500 23 487 32
  Peru 397 67 373 61 369 57
  Poland 501 22 526 8 508 17 498 22
  Portugal 501 23 489 34 493 30 474 36
  Qatar 418 59 384 59 379 56 349 52
  Romania 435 50 439 46 428 43 418 45
  Russia 487 32 486 35 478 37 479 34
  Singapore 556 1 551 2 542 3
  Slovakia 461 42 471 38 490 32 488 29
  Slovenia 513 13 514 18 512 15 519 11
  Spain 493 30 496 27 488 34 488 30
  Sweden 493 28 485 36 495 27 503 21
   Switzerland 506 18 515 17 517 13 512 15
  Thailand 421 57 444 45 425 45 421 44
  Trinidad and Tobago 425 56 410 48
  Tunisia 386 69 398 57 401 52 386 51
  Turkey 425 55 463 41 454 40 424 42
  United Arab Emirates 437 48 448 42
  United Kingdom 509 15 514 19 514 14 515 13
  United States 496 25 497 26 502 21 489 28
  Uruguay 435 49 416 51 427 44 428 41
  Vietnam 525 8 528 7
Reading
Country 2015 2012 2009 2006 2003 2000
Score Rank Score Rank Score Rank Score Rank Score Rank Score Rank
International Average (OECD) 493 496 493 489 494 493
  Albania 405 63 394 58 385 55 349 39
  Algeria 350 71
  Argentina 425 56
  Australia 503 16 512 12 515 8 513 7 525 4 528 4
  Austria 485 33 490 26 470 37 490 21 491 22 492 19
  China B-S-J-G[b] 494 27
  Belgium 499 20 509 16 506 10 501 11 507 11 507 11
  Brazil 407 62 407 52 412 49 393 47 403 36 396 36
  Bulgaria 432 49 436 47 429 42 402 43 430 32
  Argentina CABA[c] 475 38 429 48
  Canada 527 3 523 7 524 5 527 4 528 3 534 2
  Chile 459 42 441 43 449 41 442 37 410 35
  Taiwan 497 23 523 8 495 21 496 15
  Colombia 425 57 403 54 413 48 385 49
  Costa Rica 427 52 441 45
  Croatia 487 31 485 33 476 34 477 29
  Cyprus 443 45
  Czech Republic 487 30 493 24 478 32 483 25 489 24 492 20
  Denmark 500 18 496 23 495 22 494 18 492 19 497 16
  Dominican Republic 358 69
  Estonia 519 6 516 10 501 12 501 12
  Finland 526 4 524 5 536 2 547 2 543 1 546 1
  France 499 19 505 19 496 20 488 22 496 17 505 14
  Macedonia 352 70 373 37
  Georgia 401 65
  Germany 509 11 508 18 497 18 495 17 491 21 484 22
  Greece 467 41 477 38 483 30 460 35 472 30 474 25
  Hong Kong 527 2 545 1 533 3 536 3 510 9 525 6
  Hungary 470 40 488 28 494 24 482 26 482 25 480 23
  Iceland 482 35 483 35 500 15 484 23 492 20 507 12
  Indonesia 397 67 396 57 402 53 393 46 382 38 371 38
  Ireland 521 5 523 6 496 19 517 6 515 6 527 5
  Israel 479 37 486 32 474 35 439 39 452 29
  Italy 485 34 490 25 486 27 469 32 476 29 487 21
  Japan 516 8 538 3 520 7 498 14 498 14 522 9
  Jordan 408 61 399 55 405 51 401 44
  Kazakhstan 427 54 393 59 390 54
  South Korea 517 7 536 4 539 1 556 1 534 2 525 7
  Kosovo 347 72
  Latvia 488 29 489 27 484 28 479 27 491 23 458 28
  Lebanon 347 73
  Lithuania 472 39 477 37 468 38 470 31
  Luxembourg 481 36 488 30 472 36 479 28 479 27 441 30
  Macau 509 12 509 15 487 26 492 20 498 15
  Malaysia 431 50 398 56
  Malta 447 44
  Mexico 423 58 424 49 425 44 410 42 400 37 422 34
  Moldova 416 59
  Montenegro 427 55 422 50 408 50 392 48
  Netherlands 503 15 511 13 508 9 507 10 513 8
  New Zealand 509 10 512 11 521 6 521 5 522 5 529 3
  Norway 513 9 504 20 503 11 484 24 500 12 505 13
  Peru 398 66 384 61 370 57 327 40
  Poland 506 13 518 9 500 14 508 8 497 16 479 24
  Portugal 498 21 488 31 489 25 472 30 478 28 470 26
  Qatar 402 64 388 60 372 56 312 51
  Romania 434 47 438 46 424 45 396 45 428 33
  Russia 495 26 475 40 459 40 440 38 442 32 462 27
  Singapore 535 1 542 2 526 4
  Slovakia 453 43 463 41 477 33 466 33 469 31
  Slovenia 505 14 481 36 483 29 494 19
  Spain 496 25 488 29 481 31 461 34 481 26 493 18
  Sweden 500 17 483 34 497 17 507 9 514 7 516 10
   Switzerland 492 28 509 14 501 13 499 13 499 13 494 17
  Thailand 409 60 441 44 421 46 417 40 420 35 431 31
  Trinidad and Tobago 427 53 416 47
  Tunisia 361 68 404 53 404 52 380 50 375 39
  Turkey 428 51 475 39 464 39 447 36 441 33
  United Arab Emirates 434 48 442 42
  United Kingdom 498 22 499 21 494 23 495 16 507 10 523 8
  United States 497 24 498 22 500 16 495 18 504 15
  Uruguay 437 46 411 51 426 43 413 41 434 34
  Vietnam 487 32 508 17
  1. ^ a b c Beijing, Shanghai, Jiangsu, Zhejiang
  2. ^ a b c Shanghai (2009, 2012); Beijing, Shanghai, Jiangsu, Guangdong (2015)
  3. ^ a b c Ciudad Autónoma de Buenos Aires

Previous years

Period Focus OECD countries Partner countries Participating students Notes
2000 Reading 28 4 + 11 265,000 The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 Mathematics 30 11 275,000 UK disqualified from data analysis. Also included test in problem solving.
2006 Science 30 27 400,000 Reading scores for US disqualified from analysis due to misprint in testing materials.[29]
2009[30] Reading 34 41 + 10 470,000 10 additional non-OECD countries took the test in 2010.[31][32]
2012[33] Mathematics 34 31 510,000

Reception

China

China's participation in the 2012 test was limited to Shanghai, Hong Kong, and Macau as separate entities. In 2012, Shanghai participated for the second time, again topping the rankings in all three subjects, as well as improving scores in the subjects compared to the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about 3 school years ahead of pupils in average countries. Educational experts debated to what degree this result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China.[34] Hong Kong placed second in reading and science and third in maths.

Andreas Schleicher, PISA division head and co-ordinator, stated that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further as-yet-unpublished OECD research, he said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[35] Schleicher believes that China has also expanded school access and has moved away from learning by rote,[36] performing well in both rote-based and broader assessments.[35]

In 2018 the Chinese provinces that participated were Beijing, Shanghai, Jiangsu and Zhejiang. In 2015, the participating provinces were Jiangsu, Guangdong, Beijing, and Shanghai.[37] The 2015 Beijing-Shanghai-Jiangsu-Guangdong cohort scored a median 518 in science in 2015, while the 2012 Shanghai cohort scored a median 580.

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade, and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high school students in favor of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop-off in the number of 15-year-olds residing there.[38] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system (and hence from testing). As a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[39] Following the 2015 testing, the OECD published in-depth studies on the education systems of a few selected countries, including China.[40]

In 2014, Liz Truss, the British Parliamentary Under-Secretary of State at the Department for Education, led a fact-finding visit to schools and teacher-training centres in Shanghai.[41] Britain increased exchanges with Chinese teachers and schools to find out how to improve quality. In 2014, 60 teachers from Shanghai were invited to the UK to help share their teaching methods, support pupils who were struggling, and help to train other teachers.[42] In 2016, Britain invited 120 Chinese teachers, planning to adopt Chinese styles of teaching in 8,000 aided schools.[43] By 2019, approximately 5,000 of Britain's 16,000 primary schools had adopted Shanghai's teaching methods.[44] The performance of British schools in PISA improved after adopting China's teaching styles.[45][46]

Finland

Finland, which achieved several top positions in the first tests, fell in all three subjects in 2012, but remained the best-performing country overall in Europe, achieving its best result in science with 545 points (5th) and its worst in mathematics with 519 (12th), in which it was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time, Finnish girls narrowly outperformed boys in mathematics. It was also the first time pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Minister of Education and Science Krista Kiuru expressed concern about the overall drop, as well as the fact that the share of low performers had increased from 7% to 12%.[47]

India

India participated in the 2009 round of testing but pulled out of the 2012 PISA testing, with the Indian government attributing its action to the unfairness of PISA testing to Indian students.[48] India had ranked 72nd out of 73 countries tested in 2009.[49] The Indian Express reported, "The ministry (of education) has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's "socio-cultural milieu". India's participation in the next PISA cycle will hinge on this".[50] The Indian Express also noted that "Considering that over 70 nations participate in PISA, it is uncertain whether an exception would be made for India".

India did not participate in the 2012, 2015 and 2018 PISA rounds.[51]

A Kendriya Vidyalaya Sangathan (KVS) committee as well as a group of secretaries on education constituted by the Prime Minister of India Narendra Modi recommended that India should participate in PISA. Accordingly, in February 2017, the Ministry of Human Resource Development under Prakash Javadekar decided to end the boycott and participate in PISA from 2020. To address the socio-cultural disconnect between the test questions and students, it was reported that the OECD will update some questions. For example, the word avocado in a question may be replaced with a more popular Indian fruit such as mango.[52]

Malaysia

In 2015, the OECD found that the results from Malaysia had not met the required response rate.[53] Opposition politician Ong Kian Ming said the education ministry had tried to oversample high-performing students in rich schools.[54][55]

Sweden

Sweden's results dropped in all three subjects in the 2012 test, continuing a trend from 2006 and 2009. The sharpest fall was in mathematics, with the score dropping from 509 in 2003 to 478 in 2012. The score in reading dropped from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects.[56] The leader of the opposition, Social Democrat Stefan Löfven, described the situation as a national crisis.[57] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as the most severe.[57]

In 2020, the Swedish newspaper Expressen revealed that Sweden had inflated its score in PISA 2018 by not conforming to OECD standards. According to professor Magnus Henrekson, a large number of foreign-born students had not been tested.[58] According to an article by Sveriges Radio, poor immigrant children's scores are a significant cause of the recent decrease in Swedish PISA scores.

United Kingdom

In the 2012 test, as in 2009, the result for the United Kingdom was slightly above average, with the science ranking being the highest (20th).[59] England, Wales, Scotland and Northern Ireland also participated as separate entities, with Wales showing the worst result: 43rd of the 65 countries and economies in mathematics. The Minister for Education in Wales, Huw Lewis, expressed disappointment in the results, said that there were no "quick fixes", but hoped that several educational reforms implemented in the previous few years would give better results in the next round of tests.[60] The United Kingdom had a greater gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants.[59]

Writing in the Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overemphasis on scholarly performance in East Asia might have contributed to the region's low birthrate, which he argued could harm future economic performance more than a good PISA score could offset.[61]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?" by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[62]

In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of "smoothing out" key differences between countries. "That is leaving out many of the important things," he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But (is it) worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

Queen's University Belfast mathematician Dr. Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders Pisa rankings "valueless".[63] Goldstein remarked that Dr. Morrison's objection highlights "an important technical issue" if not a "profound conceptual error". However, Goldstein cautioned that PISA has been "used inappropriately", contending that some of the blame for this "lies with PISA itself. I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Professors Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004 and also personally queried several of the OECD's "senior people" about them, his points were met with "absolute silence" and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he says. "I am still concerned."[64]

Professor Svend Kreiner, of the University of Copenhagen, agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[64]

United States

Since 2012, a few US states have participated in the PISA tests as separate entities. Only the 2012 and 2015 results are available on a state basis. Puerto Rico also participated in 2015 as a separate US entity.

2012 US State results
State          Mathematics Science Reading
Massachusetts  514         527     527
Connecticut    506         521     521
US Average     481         497     498
Florida        467         485     492
2015 US State results
Mathematics Science Reading
27 nbsp Trinidad and Tobago 417 55 414 47 nbsp Tunisia 367 70 388 56 371 54 365 51 359 38 nbsp Turkey 420 51 448 41 445 40 424 40 423 33 nbsp United Arab Emirates 427 49 434 44 nbsp United Kingdom 492 27 494 24 492 26 495 23 508 17 529 7 nbsp United States 470 40 481 34 487 29 474 33 483 28 493 15 nbsp Uruguay 418 53 409 52 427 43 427 39 422 34 nbsp Vietnam 495 22 511 15 ScienceCountry 2015 2012 2009 2006Score Rank Score Rank Score Rank Score RankInternational Average OECD 493 501 501 498 nbsp Albania 427 54 397 58 391 54 nbsp Algeria 376 72 nbsp Argentina 432 52 nbsp Australia 510 14 521 14 527 9 527 8 nbsp Austria 495 26 506 21 494 28 511 17 nbsp China B S J G b 518 10 nbsp Belgium 502 20 505 22 507 19 510 18 nbsp Brazil 401 66 402 55 405 49 390 49 nbsp Bulgaria 446 46 446 43 439 42 434 40 nbsp Argentina CABA c 475 38 425 49 nbsp Canada 528 7 525 9 529 7 534 3 nbsp Chile 447 45 445 44 447 41 438 39 nbsp Taiwan 532 4 523 11 520 11 532 4 nbsp Colombia 416 60 399 56 402 50 388 50 nbsp Costa Rica 420 58 429 47 nbsp Croatia 475 37 491 32 486 35 493 25 nbsp Cyprus 433 51 nbsp Czech Republic 493 29 508 20 500 22 513 14 nbsp Denmark 502 21 498 25 499 24 496 23 nbsp Dominican Republic 332 73 nbsp Estonia 534 3 541 5 528 8 531 5 nbsp Finland 531 5 545 4 554 1 563 1 nbsp France 495 27 499 24 498 25 495 24 nbsp Macedonia 384 70 nbsp Georgia 411 63 nbsp Germany 509 16 524 10 520 12 516 12 nbsp Greece 455 44 467 40 470 38 473 37 nbsp Hong Kong 523 9 555 1 549 2 542 2 nbsp Hungary 477 35 494 30 503 20 504 20 nbsp Iceland 473 39 478 37 496 26 491 26 nbsp Indonesia 403 65 382 60 383 55 393 48 nbsp Ireland 503 19 522 13 508 18 508 19 nbsp Israel 467 40 470 39 455 39 454 38 nbsp Italy 481 34 494 31 489 33 475 35 nbsp Japan 538 2 547 3 539 4 531 6 nbsp Jordan 409 64 409 54 415 47 422 43 nbsp Kazakhstan 456 43 425 48 400 53 nbsp South Korea 516 11 538 6 538 5 522 10 nbsp Kosovo 378 71 nbsp Latvia 490 31 502 23 494 29 490 27 nbsp Lebanon 386 68 nbsp Lithuania 475 36 496 28 491 31 488 31 nbsp Luxembourg 483 33 491 33 484 36 486 33 nbsp Macau 529 6 521 15 511 16 511 16 nbsp Malaysia 443 47 420 50 nbsp Malta 465 41 nbsp Mexico 416 61 415 52 416 46 410 47 nbsp Moldova 428 53 nbsp Montenegro 411 62 410 53 401 51 412 46 nbsp Netherlands 509 17 522 12 522 10 525 9 nbsp New Zealand 513 12 516 16 532 6 530 7 nbsp Norway 498 24 495 29 500 23 487 32 nbsp Peru 397 67 373 61 369 57 nbsp Poland 501 22 526 8 508 17 498 22 nbsp Portugal 501 23 489 34 493 30 474 36 nbsp Qatar 418 59 384 59 379 56 349 52 nbsp Romania 435 50 439 46 428 43 418 45 nbsp Russia 487 32 486 35 478 37 479 34 nbsp Singapore 556 1 551 2 542 3 nbsp Slovakia 461 42 471 38 490 32 488 29 nbsp Slovenia 513 13 514 18 512 15 519 11 nbsp Spain 493 30 496 27 488 34 488 30 nbsp Sweden 493 28 485 36 495 27 503 21 nbsp Switzerland 506 18 515 17 517 13 512 15 nbsp Thailand 421 57 444 45 425 45 421 44 nbsp Trinidad and Tobago 425 56 410 48 nbsp Tunisia 386 69 398 57 401 52 386 51 nbsp Turkey 425 55 463 41 454 40 424 42 nbsp United Arab Emirates 437 48 448 42 nbsp United Kingdom 509 15 514 19 514 14 515 13 nbsp United States 496 25 497 26 502 21 489 28 nbsp Uruguay 435 49 416 51 427 44 428 41 nbsp Vietnam 525 8 528 7 ReadingCountry 2015 2012 2009 2006 2003 2000Score Rank Score Rank Score Rank Score Rank Score Rank Score RankInternational Average OECD 493 496 493 489 494 493 nbsp Albania 405 63 394 58 385 55 349 39 nbsp Algeria 350 71 nbsp Argentina 425 56 nbsp Australia 503 16 512 12 515 8 513 7 525 4 528 4 nbsp Austria 485 33 490 26 470 37 490 21 491 22 492 19 nbsp 
China B S J G b 494 27 nbsp Belgium 499 20 509 16 506 10 501 11 507 11 507 11 nbsp Brazil 407 62 407 52 412 49 393 47 403 36 396 36 nbsp Bulgaria 432 49 436 47 429 42 402 43 430 32 nbsp Argentina CABA c 475 38 429 48 nbsp Canada 527 3 523 7 524 5 527 4 528 3 534 2 nbsp Chile 459 42 441 43 449 41 442 37 410 35 nbsp Taiwan 497 23 523 8 495 21 496 15 nbsp Colombia 425 57 403 54 413 48 385 49 nbsp Costa Rica 427 52 441 45 nbsp Croatia 487 31 485 33 476 34 477 29 nbsp Cyprus 443 45 nbsp Czech Republic 487 30 493 24 478 32 483 25 489 24 492 20 nbsp Denmark 500 18 496 23 495 22 494 18 492 19 497 16 nbsp Dominican Republic 358 69 nbsp Estonia 519 6 516 10 501 12 501 12 nbsp Finland 526 4 524 5 536 2 547 2 543 1 546 1 nbsp France 499 19 505 19 496 20 488 22 496 17 505 14 nbsp Macedonia 352 70 373 37 nbsp Georgia 401 65 nbsp Germany 509 11 508 18 497 18 495 17 491 21 484 22 nbsp Greece 467 41 477 38 483 30 460 35 472 30 474 25 nbsp Hong Kong 527 2 545 1 533 3 536 3 510 9 525 6 nbsp Hungary 470 40 488 28 494 24 482 26 482 25 480 23 nbsp Iceland 482 35 483 35 500 15 484 23 492 20 507 12 nbsp Indonesia 397 67 396 57 402 53 393 46 382 38 371 38 nbsp Ireland 521 5 523 6 496 19 517 6 515 6 527 5 nbsp Israel 479 37 486 32 474 35 439 39 452 29 nbsp Italy 485 34 490 25 486 27 469 32 476 29 487 21 nbsp Japan 516 8 538 3 520 7 498 14 498 14 522 9 nbsp Jordan 408 61 399 55 405 51 401 44 nbsp Kazakhstan 427 54 393 59 390 54 nbsp South Korea 517 7 536 4 539 1 556 1 534 2 525 7 nbsp Kosovo 347 72 nbsp Latvia 488 29 489 27 484 28 479 27 491 23 458 28 nbsp Lebanon 347 73 nbsp Lithuania 472 39 477 37 468 38 470 31 nbsp Luxembourg 481 36 488 30 472 36 479 28 479 27 441 30 nbsp Macau 509 12 509 15 487 26 492 20 498 15 nbsp Malaysia 431 50 398 56 nbsp Malta 447 44 nbsp Mexico 423 58 424 49 425 44 410 42 400 37 422 34 nbsp Moldova 416 59 nbsp Montenegro 427 55 422 50 408 50 392 48 nbsp Netherlands 503 15 511 13 508 9 507 10 513 8 nbsp New Zealand 509 10 512 11 521 6 521 5 522 5 529 3 nbsp Norway 513 9 504 20 503 11 484 24 500 12 505 13 nbsp Peru 398 66 384 61 370 57 327 40 nbsp Poland 506 13 518 9 500 14 508 8 497 16 479 24 nbsp Portugal 498 21 488 31 489 25 472 30 478 28 470 26 nbsp Qatar 402 64 388 60 372 56 312 51 nbsp Romania 434 47 438 46 424 45 396 45 428 33 nbsp Russia 495 26 475 40 459 40 440 38 442 32 462 27 nbsp Singapore 535 1 542 2 526 4 nbsp Slovakia 453 43 463 41 477 33 466 33 469 31 nbsp Slovenia 505 14 481 36 483 29 494 19 nbsp Spain 496 25 488 29 481 31 461 34 481 26 493 18 nbsp Sweden 500 17 483 34 497 17 507 9 514 7 516 10 nbsp Switzerland 492 28 509 14 501 13 499 13 499 13 494 17 nbsp Thailand 409 60 441 44 421 46 417 40 420 35 431 31 nbsp Trinidad and Tobago 427 53 416 47 nbsp Tunisia 361 68 404 53 404 52 380 50 375 39 nbsp Turkey 428 51 475 39 464 39 447 36 441 33 nbsp United Arab Emirates 434 48 442 42 nbsp United Kingdom 498 22 499 21 494 23 495 16 507 10 523 8 nbsp United States 497 24 498 22 500 16 495 18 504 15 nbsp Uruguay 437 46 411 51 426 43 413 41 434 34 nbsp Vietnam 487 32 508 17 a b c Beijing Shanghai Jiangsu Zhejiang a b c Shanghai 2009 2012 Beijing Shanghai Jiangsu Guangdong 2015 a b c Ciudad Autonoma de Buenos Aires Previous years Edit Main article Programme for International Student Assessment 2000 to 2012 Period Focus OECD countries Partner countries Participating students Notes2000 Reading 28 4 11 265 000 The Netherlands disqualified from data analysis 11 additional non OECD countries took the test in 2002 2003 Mathematics 30 11 275 000 UK disqualified from data analysis Also 
included a test in problem solving. 2006: Science; 30 OECD countries; 27 partner countries; 400,000 students; reading scores for the US were disqualified from analysis due to a misprint in the testing materials.[29] 2009:[30] Reading; 34 OECD countries; 41 + 10 partner countries; 470,000 students; 10 additional non-OECD countries took the test in 2010.[31][32] 2012:[33] Mathematics; 34 OECD countries; 31 partner countries; 510,000 students.

Reception

Further information: Programme for International Student Assessment 2000 to 2012

China

China's participation in the 2012 test was limited to Shanghai, Hong Kong and Macau as separate entities. In 2012, Shanghai participated for the second time, again topping the rankings in all three subjects, as well as improving its scores compared with the 2009 tests. Shanghai's score of 613 in mathematics was 113 points above the average score, putting the performance of Shanghai pupils about three school years ahead of pupils in average countries. Educational experts debated to what degree this result reflected the quality of the general educational system in China, pointing out that Shanghai has greater wealth and better-paid teachers than the rest of China.[34] Hong Kong placed second in reading and science and third in maths.

Andreas Schleicher, PISA division head and coordinator, stated that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further, as yet unpublished, OECD research, he said: "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[35] Schleicher believes that China has also expanded school access and has moved away from learning by rote,[36] performing well in both rote-based and broader assessments.[35]

In 2018, the Chinese provinces that participated were Beijing, Shanghai, Jiangsu and Zhejiang; in 2015, the participating provinces were Jiangsu, Guangdong, Beijing and Shanghai.[37] The 2015 Beijing-Shanghai-Jiangsu-Guangdong cohort scored a median of 518 in science, while the 2012 Shanghai cohort scored a median of 580.

Critics of PISA counter that in Shanghai and other Chinese cities, most children of migrant workers can only attend city schools up to the ninth grade and must return to their parents' hometowns for high school due to hukou restrictions, thus skewing the composition of the city's high-school students in favour of wealthier local families. A population chart of Shanghai reproduced in The New York Times shows a steep drop-off in the number of 15-year-olds residing there.[38] According to Schleicher, 27% of Shanghai's 15-year-olds are excluded from its school system and hence from testing; as a result, the percentage of Shanghai's 15-year-olds tested by PISA was 73%, lower than the 89% tested in the US.[39] Following the 2015 testing, the OECD published in-depth studies on the education systems of a selected few countries, including China.[40]

In 2014, Liz Truss, the British Parliamentary Under-Secretary of State at the Department for Education, led a fact-finding visit to schools and teacher-training centres in Shanghai.[41] Britain increased exchanges with Chinese teachers and schools to find out how to improve quality. In 2014, 60 teachers from Shanghai were invited to the UK to help share their teaching methods, support pupils who are struggling, and help to train other teachers.[42] In 2016, Britain invited 120 Chinese teachers, planning to adopt Chinese styles of teaching in 8,000 aided schools.[43] By 2019, approximately 5,000 of Britain's 16,000 primary schools had adopted Shanghai's teaching methods.[44] The performance of British schools in PISA improved after adopting China's teaching styles.[45][46]
Finland

Finland, which received several top positions in the first tests, fell in all three subjects in 2012 but remained the best-performing country overall in Europe, achieving its best result in science with 545 points (5th) and its worst in mathematics with 519 (12th), where it was outperformed by four other European countries. The drop in mathematics was 25 points since 2003, the last time mathematics was the focus of the tests. For the first time, Finnish girls narrowly outperformed boys in mathematics. It was also the first time pupils in Finnish-speaking schools did not perform better than pupils in Swedish-speaking schools. Minister of Education and Science Krista Kiuru expressed concern over the overall drop, as well as the fact that the share of low performers had increased from 7% to 12%.[47]

India

India participated in the 2009 round of testing but pulled out of the 2012 PISA testing, with the Indian government attributing its decision to the unfairness of PISA testing to Indian students.[48] India had ranked 72nd out of the 73 countries tested in 2009.[49] The Indian Express reported: "The ministry of education has concluded that there was a socio-cultural disconnect between the questions and Indian students. The ministry will write to the OECD and drive home the need to factor in India's socio-cultural milieu. India's participation in the next PISA cycle will hinge on this."[50] The Indian Express also noted that, considering that over 70 nations participate in PISA, it was uncertain whether an exception would be made for India. India did not participate in the 2012, 2015 and 2018 PISA rounds.[51]

A Kendriya Vidyalaya Sangathan (KVS) committee, as well as a group of secretaries on education constituted by the Prime Minister of India, Narendra Modi, recommended that India should participate in PISA. Accordingly, in February 2017 the Ministry of Human Resource Development, under Prakash Javadekar, decided to end the boycott and participate in PISA from 2020. To address the socio-cultural disconnect between the test questions and students, it was reported that the OECD would update some questions; for example, the word avocado in a question might be replaced with a more popular Indian fruit, such as mango.[52]

Malaysia

In 2015, the results from Malaysia were found by the OECD not to have met the maximum response rate.[53] Opposition politician Ong Kian Ming said the education ministry had tried to oversample high-performing students in rich schools.[54][55]

Sweden

Sweden's results dropped in all three subjects in the 2012 test, continuing a trend from 2006 and 2009. The sharpest fall was in mathematics, where the score dropped from 509 in 2003 to 478 in 2012; the reading score dropped from 516 in 2000 to 483 in 2012. The country performed below the OECD average in all three subjects.[56] The leader of the opposition Social Democrats, Stefan Löfven, described the situation as a national crisis.[57] Along with the party's spokesperson on education, Ibrahim Baylan, he pointed to the downward trend in reading as the most severe.[57] In 2020, the Swedish newspaper Expressen revealed that Sweden had inflated its score in PISA 2018 by not conforming to OECD standards; according to Professor Magnus Henrekson, a large number of foreign-born students had not been tested.[58] According to an article by Sveriges Radio, poor immigrant children's scores are a significant cause of the recent decrease in Swedish PISA scores.

United Kingdom
In the 2012 test, as in 2009, the United Kingdom's result was slightly above average, with the science ranking being the highest (20th).[59] England, Wales, Scotland and Northern Ireland also participated as separate entities, with the worst result recorded for Wales, which in mathematics was 43rd of the 65 participating countries and economies. The Minister of Education in Wales, Huw Lewis, expressed disappointment in the results, said that there were no quick fixes, but hoped that several educational reforms implemented over the previous few years would give better results in the next round of tests.[60] The United Kingdom had a larger gap between high- and low-scoring students than the average. There was little difference between public and private schools when adjusted for the socio-economic background of students. The gender difference in favour of girls was smaller than in most other countries, as was the difference between natives and immigrants.[59]

Writing in the Daily Telegraph, Ambrose Evans-Pritchard warned against putting too much emphasis on the UK's international ranking, arguing that an overfocus on scholastic performance in East Asia might have contributed to the region's low birth rate, which he argued could harm future economic performance more than a good PISA score would outweigh.[61]

In 2013, the Times Educational Supplement (TES) published an article, "Is PISA Fundamentally Flawed?", by William Stewart, detailing serious critiques of PISA's conceptual foundations and methods advanced by statisticians at major universities.[62] In the article, Professor Harvey Goldstein of the University of Bristol was quoted as saying that when the OECD tries to rule out questions suspected of bias, it can have the effect of smoothing out key differences between countries. "That is leaving out many of the important things," he warned. "They simply don't get commented on. What you are looking at is something that happens to be common. But is it worth looking at? PISA results are taken at face value as providing some sort of common standard across countries. But as soon as you begin to unpick it, I think that all falls apart."

Queen's University Belfast mathematician Dr Hugh Morrison stated that he found the statistical model underlying PISA to contain a fundamental, insoluble mathematical error that renders PISA rankings valueless.[63] Goldstein remarked that Dr Morrison's objection highlights an important technical issue, if not a profound conceptual error. However, Goldstein cautioned that PISA has been used inappropriately, contending that some of the blame for this lies with PISA itself: "I think it tends to say too much for what it can do and it tends not to publicise the negative or the weaker aspects." Professors Morrison and Goldstein expressed dismay at the OECD's response to criticism. Morrison said that when he first published his criticisms of PISA in 2004, and also personally queried several of the OECD's senior people about them, his points were met with absolute silence and have yet to be addressed. "I was amazed at how unforthcoming they were," he told TES. "That makes me suspicious." "Pisa steadfastly ignored many of these issues," he says. "I am still concerned."[64] Professor Svend Kreiner of the University of Copenhagen agreed: "One of the problems that everybody has with PISA is that they don't want to discuss things with people criticising or asking questions concerning the results. They didn't want to talk to me at all. I am sure it is because they can't defend themselves."[64]

United States

Since 2012, a few US states have participated in the PISA tests as separate entities. Only the 2012 and 2015 results are available on a state basis.
Puerto Rico participated in 2015 as a separate US entity as well.

2012 US state results:
Mathematics: Massachusetts 514, Connecticut 506, US average 481, Florida 467
Science: Massachusetts 527, Connecticut 521, US average 497, Florida 485
Reading: Massachusetts 527, Connecticut 521, US average 498, Florida 492

2015 US state results: Mathematics, Science, Reading
