WLD_2003_PISA_v01_M
Programme for International Student Assessment 2003
Name | Country code |
---|---|
Australia | AUS |
Austria | AUT |
Belgium | BEL |
Brazil | BRA |
Canada | CAN |
Switzerland | CHE |
Czech Republic | CZE |
Germany | DEU |
Denmark | DNK |
Spain | ESP |
Finland | FIN |
France | FRA |
United Kingdom | GBR |
Greece | GRC |
Hungary | HUN |
Indonesia | IDN |
Ireland | IRL |
Iceland | ISL |
Italy | ITA |
Japan | JPN |
Korea, Rep. | KOR |
Liechtenstein | LIE |
Luxembourg | LUX |
Latvia | LVA |
Mexico | MEX |
Netherlands | NLD |
Norway | NOR |
New Zealand | NZL |
Poland | POL |
Portugal | PRT |
Russian Federation | RUS |
Slovak Republic | SVK |
Sweden | SWE |
Thailand | THA |
Tunisia | TUN |
Turkiye | TUR |
Uruguay | URY |
United States | USA |
Serbia and Montenegro | SCG |
The first PISA survey was conducted in 2000 in 32 countries (including 28 OECD member countries) using written tasks answered in schools under independently supervised test conditions. Another 11 countries completed the same assessment in 2002. PISA 2000 surveyed reading, mathematical and scientific literacy, with a primary focus on reading. The second PISA survey, which covered reading, mathematical and scientific literacy, and problem solving, with a primary focus on mathematical literacy, was conducted in 2003 in 41 countries.
The OECD’s Programme for International Student Assessment (PISA) is a collaborative effort among OECD member countries to measure how well 15-year-old young adults approaching the end of compulsory schooling are prepared to meet the challenges of today’s knowledge societies. The assessment is forward-looking: rather than focusing on the extent to which these students have mastered a specific school curriculum, it looks at their ability to use their knowledge and skills to meet real-life challenges. This orientation reflects a change in curricular goals and objectives, which are increasingly concerned with what students can do with what they learn at school.
In addition to the assessments, PISA 2003 included Student and School Questionnaires to collect data that could be used in constructing indicators pointing to social, cultural, economic and educational factors that are associated with student performance. Using the data taken from these two questionnaires, analyses linking context information with student achievement could address a range of policy-relevant questions.
Through the collection of such information at the student and school level on a cross-nationally comparable basis, PISA adds significantly to the knowledge base that was previously available from national official statistics, such as aggregate national statistics on the educational programs completed and the qualifications obtained by individuals.
Sample survey data [ssd]
The international target population is defined as all students aged from 15 years and 3 (completed) months to 16 years and 2 (completed) months at the beginning of the assessment period. The students had to be attending educational institutions located within the country, in grades 7 and higher. This meant that countries were to include 15-year-olds enrolled full-time in educational institutions, 15-year-olds enrolled in educational institutions who attended on only a part-time basis, students in vocational training types of programmes, or any other related type of educational programmes, and students attending foreign schools within the country (as well as students from other countries attending any of the programmes in the first three categories). It was recognised that no testing of persons schooled in the home, workplace or out of the country would occur and therefore these students were not included in the international target population.
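The age window above (15 years and 3 completed months to 16 years and 2 completed months at the start of the assessment period) can be expressed as a simple eligibility check. This is an illustrative sketch, not part of the official sampling software; the assessment start date used below is hypothetical.

```python
from datetime import date

def completed_months(birth: date, reference: date) -> int:
    """Whole months of age completed at the reference date."""
    months = (reference.year - birth.year) * 12 + (reference.month - birth.month)
    if reference.day < birth.day:
        months -= 1  # the current month is not yet completed
    return months

def is_eligible(birth: date, assessment_start: date) -> bool:
    """Eligible if aged 15y3m to 16y2m (completed months) at the start of testing."""
    age = completed_months(birth, assessment_start)
    return 15 * 12 + 3 <= age <= 16 * 12 + 2

# Hypothetical assessment start date, for illustration only.
start = date(2003, 3, 1)
```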
The study covered reading, mathematical and scientific literacy, and problem solving, with a primary focus on mathematical literacy.
The second PISA survey was conducted in 41 countries: Australia, Austria, Belgium, Canada, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovak Republic, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States, Brazil, Hong Kong-China, Indonesia, Latvia, Liechtenstein, Macao-China, Russian Federation, Serbia and Montenegro, Thailand, Tunisia, Uruguay.
Name |
---|
Organisation for Economic Co-operation and Development |
Name |
---|
Australian Council for Educational Research |
Netherlands National Institute for Educational Measurement |
Westat (USA) |
Educational Testing Service (USA) |
National Institute for Educational Research (Japan) |
Name |
---|
Organisation for Economic Co-operation and Development |
More than a quarter of a million students, representing almost 30 million 15-year-olds enrolled in the schools of the 41 participating countries, were assessed in 2003.
The sampling design used for the PISA assessment was a two-stage stratified sample in most countries. The first-stage sampling units consisted of individual schools having 15-year-old students. In all but a few countries, schools were sampled systematically from a comprehensive national list of all eligible schools with probabilities that were proportional to a measure of size. This is referred to as probability proportional to size (PPS) sampling. The measure of size was a function of the estimated number of eligible 15-year-old students enrolled. Prior to sampling, schools in the sampling frame were assigned to strata formed either explicitly or implicitly.
The second-stage sampling units in countries using the two-stage design were students within sampled schools. Once schools were selected to be in the sample, a list of each sampled school's 15-year-old students was prepared. From each list that contained more than 35 students, 35 students were selected with equal probability, and for lists of 35 or fewer, all students on the list were selected. Countries could sample a number of students per school other than 35, provided that the number sampled within each school was at least 20.
In two countries, a three-stage design was used. In such cases, geographical areas were sampled first (called first-stage units) using probability proportional to size sampling, and then schools (called second-stage units) were selected within sampled areas. Students were the third-stage sampling units in three-stage designs.
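The two-stage design described above (systematic PPS selection of schools, then an equal-probability sample of up to 35 students per school) can be sketched as follows. This is a simplified illustration under assumed data structures, not the consortium's actual sampling software; it ignores explicit stratification and the handling of very large schools.

```python
import random

def pps_systematic(schools, n_sample, seed=0):
    """Systematic probability-proportional-to-size sampling: step through
    the cumulative measures of size at a fixed interval from a random start.
    schools: list of dicts with a "size" key (estimated eligible enrolment);
    the frame ordering carries any implicit stratification."""
    total = sum(s["size"] for s in schools)
    interval = total / n_sample
    start = random.Random(seed).uniform(0, interval)
    points = [start + i * interval for i in range(n_sample)]
    selected, cum, idx = [], 0.0, 0
    for s in schools:
        cum += s["size"]
        while idx < len(points) and points[idx] <= cum:
            selected.append(s)  # a school larger than the interval can recur
            idx += 1
    return selected

def sample_students(roster, target=35, seed=0):
    """Within a sampled school: take everyone if the list has at most
    `target` students, otherwise an equal-probability sample of `target`."""
    if len(roster) <= target:
        return list(roster)
    return random.Random(seed).sample(roster, target)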
For additional information on sample design, refer to chapter 4 in the document "PISA 2003 Technical Report" provided as an external resource.
School response rates: A response rate of 85 percent was required for initially selected schools. If the initial school response rate fell between 65 and 85 percent, an acceptable school response rate could still be achieved through the use of replacement schools. To compensate for a sampled school that did not participate, where possible two replacement schools were identified for each sampled school. Furthermore, a school with a student participation rate between 25 and 50 percent was not considered as a participating school for the purposes of calculating and documenting response rates. However, data from such schools were included in the database and contributed to the estimates included in the initial PISA international report. Data from schools with a student participation rate of less than 25 percent were not included in the database, and such schools were also regarded as non-respondents.
Student response rates: A response rate of 80 percent of selected students in participating schools was required. A student who had participated in the original or follow-up cognitive sessions was considered to be a participant. A student response rate of 50 percent within each school was required for a school to be regarded as participating: the overall student response rate was computed using only students from schools with at least a 50 per cent response rate. Again, weighted student response rates were used for assessing this standard. Each student was weighted by the reciprocal of their sample selection probability.
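The rule above — the overall student response rate is computed only over schools with at least 50 per cent (weighted) student participation, each student weighted by the reciprocal of their selection probability — can be sketched as below. The data structure is assumed for illustration.

```python
def weighted_student_response_rate(schools):
    """Overall weighted student response rate.
    schools: list of dicts with "respondent_weights" and
    "nonrespondent_weights", each a list of per-student base weights
    (reciprocals of selection probabilities) -- an assumed layout.
    Schools below 50% weighted participation are excluded entirely."""
    num = den = 0.0
    for s in schools:
        resp = sum(s["respondent_weights"])
        total = resp + sum(s["nonrespondent_weights"])
        if total == 0 or resp / total < 0.5:
            continue  # school treated as non-participating
        num += resp
        den += total
    return num / den if den else 0.0
```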
For additional information on school and student response rates, refer to chapter 4 in the document "PISA 2003 Technical Report" provided as an external resource.
Survey weights were required to analyse PISA 2003 data, to calculate appropriate estimates of sampling error, and to make valid estimates and inferences. The consortium calculated survey weights for all assessed, ineligible and excluded students, and provided variables in the data that permit users to make approximately unbiased estimates of standard errors, to conduct significance tests and to create confidence intervals appropriately, given the sample design for PISA in each individual country. For detailed information on survey weighting, refer to chapter 8 in the document "PISA 2003 Technical Report" provided as an external resource.
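PISA data files carry a final student weight together with a set of Fay-adjusted balanced repeated replication (BRR) weights (in PISA 2003, 80 replicates with a Fay factor of 0.5, commonly named W_FSTUWT and W_FSTR1–W_FSTR80). Assuming that layout, a minimal sketch of the replication variance estimator:

```python
def weighted_mean(values, weights):
    """Weighted mean of a variable, e.g. using the final student weight."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def brr_standard_error(values, final_weights, replicate_weights, fay_k=0.5):
    """Fay's BRR: re-estimate the statistic with each replicate weight
    vector and pool the squared deviations from the full-sample estimate.
    replicate_weights: list of G weight vectors (G = 80 in PISA 2003)."""
    theta = weighted_mean(values, final_weights)
    g = len(replicate_weights)
    var = sum((weighted_mean(values, w) - theta) ** 2 for w in replicate_weights)
    var /= g * (1 - fay_k) ** 2   # with k = 0.5 and G = 80 this is var / 20
    return var ** 0.5
```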
PISA 2003 was a paper-and-pencil test. The test items were multiple choice, short answer, and extended response. Multiple choice items were either standard multiple choice with a limited number (usually four) of responses from which students were required to select the best answer, or complex multiple choice presenting several statements for each of which students were required to choose one of several possible responses (true/false, correct/incorrect, etc.). Short answer items included both closed-constructed response items that generally required students to construct a response within very limited constraints, such as mathematics items requiring a numeric answer, and items requiring a word or short phrase, etc. Short-response items were similar to closed-constructed response items, but for these a wider range of responses was possible. Open-constructed response items required more extensive writing, or showing a calculation, and frequently included some explanation or justification. Pencils, erasers, rulers, and in some cases calculators, were provided. The consortium recommended that calculators be provided in countries where they were routinely used in the classroom. National centres decided whether calculators should be provided for their students on the basis of standard national practice.
Two core questionnaires were used: a Student Questionnaire and a School Questionnaire.
As in PISA 2000, additional questionnaire material was developed and offered as international options to participating countries. In PISA 2003, two international options were available: the ICT Familiarity Questionnaire and the Educational Career Questionnaire.
National centres could decide to add national items to the international student or school questionnaire. Insertion of national items into the student questionnaire had to be agreed upon with the international study centre during the review of adaptations, due to context relatedness. Adding more than five national items was considered as a national option. National student questionnaire options, which took less than ten minutes to be completed, could be administered after the international student questionnaire and international options. If the length of the national options exceeded ten minutes, national centres were requested to administer their national questionnaire material in follow-up sessions.
Start | End |
---|---|
2003 | 2003 |
This study is the product of a concerted effort between the countries participating in PISA, the experts and institutions working within the framework of the PISA Consortium, and the OECD.
National project managers (NPMs) were required to submit their national data in KeyQuest, the generic data entry package developed by consortium staff. The data were verified at several points starting at the time of data entry. Validation rules (or range checks) were specified for each variable defined in KeyQuest, and a datum was only accepted if it satisfied that validation rule. To prevent duplicate records, a set of variables assigned to an instrument were identified as primary keys. For the student test booklets, the stratum, school and student identifications were the primary keys. After the data entry process was completed, NPMs were required to implement some of the checking procedures implemented in KeyQuest before submitting data to the consortium, and to rectify any integrity errors. For detailed information on data entry and editing, refer to chapter 11 in the document "PISA 2003 Technical Report" provided as an external resource.
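The two checks described above — per-variable validation rules (range checks) and primary keys that prevent duplicate records — can be sketched as below. This is loosely modelled on the description, not on KeyQuest itself, whose actual rule format is not documented here; field names and rule shapes are assumptions.

```python
def validate(records, rules, key_fields):
    """Range-check every rule field and flag duplicate primary keys.
    records: list of dicts (one per data record, layout assumed);
    rules: {field: (lo, hi)} inclusive validation ranges;
    key_fields: tuple of field names forming the primary key."""
    errors, seen = [], set()
    for i, rec in enumerate(records):
        for field, (lo, hi) in rules.items():
            # A missing field is treated as out of range.
            if not lo <= rec.get(field, lo - 1) <= hi:
                errors.append((i, field, "out of range"))
        key = tuple(rec[f] for f in key_fields)
        if key in seen:
            errors.append((i, key_fields, "duplicate primary key"))
        seen.add(key)
    return errors
```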
Use of the dataset must be acknowledged using a citation.
Example:
Organisation for Economic Co-operation and Development. World Programme for International Student Assessment (PISA) 2003. Ref. WLD_2003_PISA_v01_M. Dataset downloaded from [URL] on [date].
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.
Name | Email | URL |
---|---|---|
OECD PISA | edu.pisa@oecd.org | http://www.oecd.org/pisa/home/ |
DDI_WLD_2003_PISA_v01_M_WB
Name | Affiliation | Role |
---|---|---|
Development Economics Data Group | The World Bank | Documentation of the DDI |
2014-06-13
Version 01 (July 2014)