IHSN Survey Catalog

Programme for International Student Assessment 2012

Albania, United Arab Emirates, Argentina...and 57 more, 2012
Reference ID
WLD_2012_PISA_v01_M
Producer(s)
Organisation for Economic Co-operation and Development
Metadata
Documentation in PDF, DDI/XML, and JSON
Created on
Sep 05, 2014
Last modified
Jun 14, 2022
  • Identification
  • Scope
  • Coverage
  • Producers and sponsors
  • Sampling
  • Survey instrument
  • Data collection
  • Data processing
  • Data Access
  • Disclaimer and copyrights
  • Contacts
  • Metadata production
  • Identification

    Survey ID number

    WLD_2012_PISA_v01_M

    Title

    Programme for International Student Assessment 2012

    Country
    Name Country code
    Albania ALB
    United Arab Emirates ARE
    Argentina ARG
    Australia AUS
    Austria AUT
    Belgium BEL
    Bulgaria BGR
    Brazil BRA
    Canada CAN
    Switzerland CHE
    Chile CHL
    Colombia COL
    Costa Rica CRI
    Czech Republic CZE
    Germany DEU
    Denmark DNK
    Spain ESP
    Estonia EST
    Finland FIN
    France FRA
    United Kingdom GBR
    Greece GRC
    Hong Kong SAR, China HKG
    Croatia HRV
    Hungary HUN
    Indonesia IDN
    Ireland IRL
    Iceland ISL
    Israel ISR
    Italy ITA
    Jordan JOR
    Japan JPN
    Kazakhstan KAZ
    Liechtenstein LIE
    Lithuania LTU
    Luxembourg LUX
    Latvia LVA
    Macao SAR, China MAC
    Mexico MEX
    Montenegro MNE
    Malaysia MYS
    Netherlands NLD
    Norway NOR
    New Zealand NZL
    Peru PER
    Poland POL
    Portugal PRT
    Qatar QAT
    Singapore SGP
    Serbia SRB
    Slovak Republic SVK
    Slovenia SVN
    Sweden SWE
    Thailand THA
    Tunisia TUN
    Türkiye TUR
    Taiwan, China TWN
    Uruguay URY
    United States USA
    Vietnam VNM
    Series Information

    The 2012 survey is the fifth round of assessments since PISA began in 2000, and the second, after the 2003 survey, to focus on mathematics. As such, PISA 2012 provides an opportunity to evaluate changes in student performance in mathematics since 2003, and to view those changes in the context of policies and other factors. For the first time, PISA 2012 also included two optional assessments: a computer-based assessment of mathematics and an assessment of the financial literacy of young people.

    Abstract

    “What is important for citizens to know and be able to do?” That is the question that underlies the triennial survey of 15-year-old students around the world known as the Programme for International Student Assessment (PISA). PISA assesses the extent to which students near the end of compulsory education have acquired key knowledge and skills that are essential for full participation in modern societies. The assessment, which focuses on reading, mathematics, science and problem solving, does not just ascertain whether students can reproduce knowledge; it also examines how well students can extrapolate from what they have learned and apply that knowledge in unfamiliar settings, both in and outside of school. This approach reflects the fact that modern economies reward individuals not for what they know, but for what they can do with what they know. All 34 OECD member countries and 31 partner countries and economies participated in PISA 2012, representing more than 80% of the world economy.

    With mathematics as its primary focus, the PISA 2012 assessment measured 15-year-olds’ capacity to reason mathematically and use mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena, and to make the well-founded judgements and decisions needed by constructive, engaged and reflective citizens. Literacy in mathematics defined this way is not an attribute that an individual has or does not have; rather, it is a skill that can be acquired and used, to a greater or lesser extent, throughout a lifetime.

    The PISA assessment provides three main types of outcomes:

    • basic indicators that provide a baseline profile of students’ knowledge and skills;
    • indicators that show how skills relate to important demographic, social, economic and educational variables; and
    • indicators on trends that show changes in student performance and in the relationships between student-level and school-level variables and outcomes.
    Kind of Data

    Sample survey data [ssd]

    Unit of Analysis

    To better compare student performance internationally, PISA targets a specific age of students. PISA students are aged between 15 years 3 months and 16 years 2 months at the time of the assessment, and have completed at least 6 years of formal schooling. They can be enrolled in any type of institution, participate in full-time or part-time education, in academic or vocational programmes, and attend public or private schools or foreign schools within the country. Using this age across countries and over time allows PISA to compare consistently the knowledge and skills of individuals born in the same year who are still in school at age 15, despite the diversity of their education histories in and outside of school.

    Scope

    Notes

    The scope of the PISA 2012 study includes the following:

    • Students' family background
    • Learning mathematics
    • Problem solving experiences
    • General computer use
    • Use of Information and Communication Technology (ICT)
    • Structure and organisation of the school
    • School resources
    • School instruction curriculum and assessment
    • School climate
    • Financial education at school

    Coverage

    Geographic Coverage

    PISA 2012 covered 34 OECD countries and 31 partner countries and economies. All countries attempted to maximise the coverage of 15-year-olds enrolled in education in their national samples, including students enrolled in special educational institutions.

    Producers and sponsors

    Primary investigators
    Name
    Organisation for Economic Co-operation and Development
    Producers
    Name
    Australian Council for Educational Research
    Netherlands National Institute for Educational Measurement
    Service de Pédagogie Expérimentale at Université de Liège
    Westat (USA)
    Educational Testing Service (USA)
    National Institute for Educational Research (Japan)
    Funding Agency/Sponsor
    Name
    Organisation for Economic Co-operation and Development

    Sampling

    Sampling Procedure

    The accuracy of any survey results depends on the quality of the information on which national samples are based as well as on the sampling procedures. Quality standards, procedures, instruments and verification mechanisms were developed for PISA that ensured that national samples yielded comparable data and that the results could be compared with confidence.

    Most PISA samples were designed as two-stage stratified samples, although some countries applied different sampling designs. The first stage consisted of sampling individual schools in which 15-year-old students could be enrolled. Schools were sampled systematically with probabilities proportional to size, the measure of size being a function of the estimated number of eligible (15-year-old) students enrolled. A minimum of 150 schools were selected in each country (where this number existed), although the requirements for national analyses often called for a somewhat larger sample. As the schools were sampled, replacement schools were simultaneously identified, in case a sampled school chose not to participate in PISA 2012.

    Experts from the PISA Consortium performed the sample selection process for most participating countries and monitored it closely in those countries that selected their own samples. The second stage of the selection process sampled students within sampled schools. Once schools were selected, a list of each sampled school's 15-year-old students was prepared. From this list, 35 students were then selected with equal probability (all 15-year-old students were selected if fewer than 35 were enrolled). The number of students to be sampled per school could deviate from 35, but could not be less than 20.

    Around 510 000 students between the ages of 15 years 3 months and 16 years 2 months completed the assessment in 2012, representing about 28 million 15-year-olds in the schools of the 65 participating countries and economies.
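    The two-stage design described above (systematic probability-proportional-to-size selection of schools, then an equal-probability sample of up to 35 students per school) can be sketched as follows. This is an illustrative simplification, not the PISA Consortium's actual sampling software; the function names and the flat, unstratified school frame are assumptions.

```python
import random

def systematic_pps(schools, n_schools):
    """Stage 1: systematic selection of schools with probability
    proportional to size (PPS). `schools` is a list of
    (school_id, measure_of_size) pairs, the measure of size being the
    estimated number of eligible 15-year-old students enrolled.
    Real PISA frames are stratified and sorted first; that step is
    omitted here for brevity."""
    total = sum(size for _, size in schools)
    interval = total / n_schools
    start = random.uniform(0, interval)
    targets = [start + i * interval for i in range(n_schools)]
    selected, cumulative, t = [], 0.0, 0
    for school_id, size in schools:
        cumulative += size
        while t < len(targets) and cumulative >= targets[t]:
            selected.append(school_id)  # a very large school can be hit twice
            t += 1
    return selected

def sample_students(enrolled, target=35, minimum=20):
    """Stage 2: equal-probability sample of students within a school.
    All 15-year-olds are taken if fewer than `target` are enrolled;
    per-country deviations from 35 could not go below `minimum`."""
    assert target >= minimum
    if len(enrolled) <= target:
        return list(enrolled)
    return random.sample(enrolled, target)
```

    Systematic PPS selection gives each school a selection probability proportional to its estimated enrolment, which, combined with equal-probability student sampling within schools, yields roughly equal overall selection probabilities for students.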

    Response Rate

    Data-quality standards in PISA required minimum participation rates for schools as well as for students. These standards were established to minimise the potential for response biases. For countries meeting these standards, any bias resulting from non-response was likely to be negligible, i.e. typically smaller than the sampling error.

    A minimum response rate of 85% was required for the schools initially selected. Where the initial response rate of schools was between 65% and 85%, however, an acceptable school response rate could still be achieved through the use of replacement schools. This procedure brought with it a risk of increased response bias. Participating countries were, therefore, encouraged to persuade as many of the schools in the original sample as possible to participate. Schools with a student participation rate between 25% and 50% were not regarded as participating schools, but data from these schools were included in the database and contributed to the various estimations. Data from schools with a student participation rate of less than 25% were excluded from the database.

    PISA 2012 also required a minimum participation rate of 80% of students within participating schools. This minimum participation rate had to be met at the national level, not necessarily by each participating school. Follow-up sessions were required in schools in which too few students had participated in the original assessment sessions. Student participation rates were calculated over all original schools, and also over all schools, whether original sample or replacement schools, and from the participation of students in both the original assessment and any follow-up sessions. A student who participated in the original or follow-up cognitive sessions was regarded as a participant. Those who attended only the questionnaire session were included in the international database and contributed to the statistics presented in this publication if they provided at least a description of their father’s or mother’s occupation.
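    The within-school participation thresholds above amount to a simple three-way classification of schools. The sketch below encodes that reading of the rules; function and category names are illustrative, and the treatment of a rate of exactly 50% is not specified in the text, so it is assumed here to count as participating.

```python
def classify_school(student_participation):
    """Classify a school by its student participation rate (a fraction
    in [0, 1]), following the PISA 2012 data-quality rules:
      < 25%        -> data excluded from the database
      25% to < 50% -> data kept, but school not counted as participating
      >= 50%       -> participating school (the 50% boundary is assumed)"""
    if student_participation < 0.25:
        return "excluded"
    if student_participation < 0.50:
        return "included-not-participating"
    return "participating"

def national_student_rate_ok(student_rate):
    """The 80% minimum student participation rate applies at the
    national level, not school by school."""
    return student_rate >= 0.80
```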

    Survey instrument

    Questionnaires

    Paper-based tests were used, with assessments lasting two hours. In a range of countries and economies, an additional 40 minutes were devoted to the computer-based assessment of mathematics, reading and problem solving.

    Test items were a mixture of questions requiring students to construct their own responses and multiple-choice items. The items were organised in groups based on a passage setting out a real-life situation. A total of about 390 minutes of test items were covered, with different students taking different combinations of test items.

    Students answered a background questionnaire, which took 30 minutes to complete, that sought information about themselves, their homes and their school and learning experiences. School principals were given a questionnaire, to complete in 30 minutes, that covered the school system and the learning environment. In some countries and economies, optional questionnaires were distributed to parents, who were asked to provide information on their perceptions of and involvement in their child’s school, their support for learning in the home, and their child’s career expectations, particularly in mathematics. Countries could choose two other optional questionnaires for students: one asked students about their familiarity with and use of information and communication technologies, and the second sought information about their education to date, including any interruptions in their schooling and whether and how they are preparing for a future career.

    Data collection

    Dates of Data Collection
    Start End
    2012 2012
    Data Collection Notes

    This study is the product of a concerted effort between the countries participating in PISA, the experts and institutions working within the framework of the PISA Consortium, and the OECD.

    Data processing

    Data Editing

    Software specially designed for PISA facilitated data entry, detected common errors during data entry, and facilitated the process of data cleaning. Training sessions familiarised National Project Managers with these procedures.

    Data Access

    Citation requirements

    Use of the dataset must be acknowledged with a citation that includes:

    • the Identification of the Primary Investigator
    • the title of the survey (including country, acronym and year of implementation)
    • the survey reference number
    • the source and date of download

    Example:
    Organisation for Economic Co-operation and Development. World Programme for International Student Assessment (PISA) 2012. Ref. WLD_2012_PISA_v01_M. Dataset downloaded from [URL] on [date].

    Disclaimer and copyrights

    Disclaimer

    The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.

    Contacts

    Contacts
    Name Email URL
    OECD PISA edu.pisa@oecd.org http://www.oecd.org/pisa/home/

    Metadata production

    DDI Document ID

    DDI_WLD_2012_PISA_v02_M_WB

    Producers
    Name Affiliation Role
    Development Economics Data Group The World Bank Documentation of the DDI
    Date of Metadata Production

    2014-07-20

    Metadata version

    DDI Document version

    DDI Document - Version 02 - (04/21/21)
    This version is identical to DDI_WLD_2012_PISA_v01_M_WB, but the country field has been updated to capture all the countries covered by the survey.

    Version 01 (June 2014)
