Programme for International Student Assessment
From Wikipedia, the free encyclopedia
The Programme for International Student Assessment (PISA) is a triennial worldwide test of 15-year-old schoolchildren's scholastic performance, coordinated by the Organisation for Economic Co-operation and Development (OECD).
The aim of the PISA study is to test and compare schoolchildren's performance across the world, with a view to improving educational methods and outcomes.
Development and implementation
Developed from 1997, the first PISA assessment was carried out in 2000. The tests are taken every three years. Every period of assessment specialises in one particular subject, but also tests the other main areas studied. The subject specialisation is rotated through each PISA cycle.
In 2000, 265,000 students from 32 countries took part in PISA; 28 of them were OECD member countries. In 2002 the same tests were taken by 11 more "partner" countries (i.e. non-OECD members). The main focus of the 2000 tests was reading literacy, with two-thirds of the questions devoted to that subject.
PISA's debut round in 2000 was delivered on the OECD's behalf by an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER continued to lead the design and implementation of subsequent rounds of PISA for the OECD.
Over 275,000 students took part in PISA 2003, which was conducted in 41 countries, including all 30 OECD countries. (Britain's data collection, however, failed to meet PISA's quality standards, so the UK was not included in the international comparisons.) The focus was mathematics literacy, testing real-life situations in which mathematics is useful. Problem solving was also tested for the first time.
In 2006, 57 countries participated, and the main focus of PISA 2006 was science literacy. Results are due out in late 2007. Researchers have begun preparing for 2009, in which reading literacy will again be the main focus, giving the first opportunity to measure improvements in that domain. At last count (end of March 2007), some 63 countries were set to participate in PISA 2009, and more are expected to join before then.
Development of the methodology and procedures required to implement the PISA survey in all participating countries is led by ACER. ACER also leads the development and implementation of sampling procedures and assists with monitoring sampling outcomes across these countries. It likewise constructs and refines the assessment instruments at the heart of PISA: the reading, mathematics, science, problem-solving and computer-based tests, and the background and contextual questionnaires. In addition, ACER develops purpose-built software to assist in sampling and data capture, and analyses all the data.
Seeing a single PISA cycle through, start to finish, takes over four years.
Comparison with TIMSS and PIRLS
Another international mathematics assessment is the Trends in International Mathematics and Science Study (TIMSS), undertaken by the International Association for the Evaluation of Educational Achievement (IEA). The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems, students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content, such as an understanding of fractions and decimals and the relationship between them. It divides mathematics into two dimensions: the applied-knowledge "cognitive domains" and the more traditional "content domains". The cognitive domains it covers are "Knowing Facts and Procedures, Using Concepts, Solving Routine Problems and Reasoning", and the content domains are "Number, Algebra, Measurement, Geometry and Data". The latter reflect "the importance of being able to continue comparisons of achievement with previous assessments in these content domains" (TIMSS Assessment Framework 2003, pdf). PISA argues that international assessment should not be restricted to a set body of knowledge; instead, it addresses the application of education to real-life problems and life-long learning. While what PISA claims to measure (workforce knowledge) differs from what TIMSS claims to measure (curriculum attainment), research has shown that achievement on the two assessments is closely linked (Wu, 2008).
In reading literacy, the equivalent of TIMSS is the Progress in International Reading Literacy Study (PIRLS). According to the OECD, "OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling". Instead, students should be able to "construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts" (Chapter 2 of the publication "PISA 2003 Assessment Framework", pdf). PIRLS, on the other hand, describes reading literacy as "the ability to understand and use those written language forms required by society and/or valued by the individual" (Chapter 1 of the PIRLS 2006 Assessment Framework, pdf); PIRLS thus includes the use of written language forms in its definition of reading literacy. However, according to the IEA, in scoring the PIRLS tests "the focus is solely on students' understanding of the text, not on their ability to write well" (Chapter 4 of the PIRLS 2006 Assessment Framework, pdf).
Method of testing
The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year the pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.
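The age window described above can be sketched as a small date check. Note the inclusive bounds and the whole-month age calculation are assumptions for illustration; PISA's technical standards define the exact sampling rule.

```python
from datetime import date

def age_in_whole_months(birth: date, on: date) -> int:
    """Age in completed months on a given date."""
    months = (on.year - birth.year) * 12 + (on.month - birth.month)
    if on.day < birth.day:  # current month not yet completed
        months -= 1
    return months

def pisa_eligible(birth: date, assessment_start: date) -> bool:
    """PISA samples students aged between 15 years 3 months and
    16 years 2 months at the beginning of the assessment period
    (bounds assumed inclusive here)."""
    age = age_in_whole_months(birth, assessment_start)
    return 15 * 12 + 3 <= age <= 16 * 12 + 2  # i.e. 183..194 months

# A student born 10 May 2000 is 183 whole months old on 1 Sept 2015
print(pisa_eligible(date(2000, 5, 10), date(2015, 9, 1)))  # True
print(pisa_eligible(date(1999, 1, 1), date(2015, 9, 1)))   # False: too old
```

The dates used are arbitrary examples; the point is that eligibility depends only on age at the start of the assessment period, not on the student's school year.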
Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but no individual student is tested on all of it. Participating students also answer a questionnaire on their background, including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding, etc.
In Luxembourg, which scored quite low, criticism ensued over the method used in its PISA test: although the country is trilingual, the test could not be taken in Luxembourgish, the mother tongue of a majority of students.
Results
The results of each period of assessment normally take at least a year to be analysed. The first results for PISA 2000 came out in 2001 (OECD, 2001a) and 2003 (OECD, 2003c), and were followed by thematic reports studying particular aspects of the results. The evaluation of PISA 2003 was published in two volumes: Learning for Tomorrow's World: First Results from PISA 2003 (OECD, 2004) and Problem Solving for Tomorrow's World – First Measures of Cross-Curricular Competencies from PISA 2003 (OECD, 2004d).
In 2003, the top six scores were reported in four domains: mathematics, reading literacy, science and problem solving.
Professor Jouni Välijärvi, who was in charge of the Finnish PISA study, attributed the high Finnish score both to excellent Finnish teachers and to Finland's 1990s LUMA programme, which was developed to improve children's skills in mathematics and the natural sciences. He also drew attention to the Finnish school system, which teaches the same curriculum to all pupils. Indeed, individual Finnish students' results did not vary a great deal, and all schools had similar scores.
An evaluation of the 2003 results showed that the countries which spent more on education did not necessarily do better than those which spent less. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, Korea and the Netherlands spent less but did relatively well, whereas the United States spent much more but was below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States did, for example, but the USA came 24th out of 29 countries compared.
Compared with 2000, Poland, Belgium, the Czech Republic and Germany all improved their results. Indeed, apparently owing to the changes introduced in the educational reform of 1999, Polish students showed above-average reading skills in PISA 2003, whereas in PISA 2000 they had been near the bottom of the list.
Another point made in the evaluation was that students with higher-earning parents tend to be better educated and to achieve higher results. This was true in all the countries tested, although it was more pronounced in some, such as Germany.
2006 survey
Here is an overview of the 20 places with the highest scores in 2006:
| Rank | Mathematics | Science | Reading |
|---|---|---|---|
| 1. | Taiwan | Finland | South Korea |
| 2. | Finland | Hong Kong | Finland |
| 3. | Hong Kong | Canada | Hong Kong |
| 4. | South Korea | Taiwan | Canada |
| 5. | Netherlands | Estonia | New Zealand |
| 6. | Switzerland | Japan | Ireland |
| 7. | Canada | New Zealand | Australia |
| 8. | Macau | Australia | Liechtenstein |
| 9. | Liechtenstein | Netherlands | Poland |
| 10. | Japan | Liechtenstein | Sweden |
| 11. | New Zealand | South Korea | Netherlands |
| 12. | Belgium | Slovenia | Belgium |
| 13. | Australia | Germany | Macau |
| 14. | Estonia | United Kingdom | Switzerland |
| 15. | Denmark | Czech Republic | Japan |
| 16. | Czech Republic | Switzerland | Taiwan |
| 17. | Iceland | Macau | United Kingdom |
| 18. | Austria | Austria | Germany |
| 19. | Slovenia | Belgium | Denmark |
| 20. | Germany | Ireland | Slovenia |
Reactions to the results
For many countries, the first PISA results were a nasty surprise; in Germany, for example, the comparatively low scores triggered a heated debate about how the school system should be changed. Other countries were pleasantly surprised. Some headlines in national newspapers included:
- "La France, élève moyen de la classe OCDE" (France, average student of the OECD class) Le Monde, December 5, 2001
- "Miserable Noten für deutsche Schüler" (Abysmal marks for German students) Frankfurter Allgemeine Zeitung, December 4, 2001
- "Are we not such dunces after all?" The Times, England, December 6, 2001
- "Economic Time Bomb: U.S. Teens Are Among Worst at Math" Wall Street Journal December 7, 2004
- "Preocupe-se. Seu filho é mal educado." (Be worried. Your child is badly educated.) Veja November 7, 2007
- "La educación española retrocede" (Spanish education moving backwards) El País December 5, 2007
- "Finnish teens score high marks in latest PISA study" Helsingin Sanomat November 30, 2007
See also
- Education Index
- Testing students
- Education in Japan
- Education in Taiwan
- Education in Finland
- Education in South Korea
- Education in Hong Kong
- Education in Germany
- Education in the Netherlands
References
- Rindermann, Heiner (2007). The g-factor of international cognitive ability comparisons: the homogeneity of results in PISA, TIMSS, PIRLS and IQ-tests across nations. European Journal of Personality, 21, 667–706.
- Wu, M.L. (2008). A comparison of PISA and TIMSS 2003 achievement results in mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
Further information
Official websites and reports
International: