WLD_2012_PISA-CBA_v01_M
Programme for International Student Assessment, Computer-Based Assessment 2012
Name | Country code |
---|---|
Albania | ALB |
United Arab Emirates | ARE |
Argentina | ARG |
Australia | AUS |
Austria | AUT |
Belgium | BEL |
Bulgaria | BGR |
Brazil | BRA |
Canada | CAN |
Switzerland | CHE |
Chile | CHL |
China | CHN |
Colombia | COL |
Costa Rica | CRI |
Czech Republic | CZE |
Germany | DEU |
Denmark | DNK |
Spain | ESP |
Estonia | EST |
Finland | FIN |
France | FRA |
United Kingdom | GBR |
Greece | GRC |
Hong Kong SAR, China | HKG |
Croatia | HRV |
Hungary | HUN |
Indonesia | IDN |
Ireland | IRL |
Iceland | ISL |
Israel | ISR |
Italy | ITA |
Jordan | JOR |
Japan | JPN |
Kazakhstan | KAZ |
Liechtenstein | LIE |
Lithuania | LTU |
Luxembourg | LUX |
Latvia | LVA |
Macao SAR, China | MAC |
Mexico | MEX |
Montenegro | MNE |
Malaysia | MYS |
Netherlands | NLD |
Norway | NOR |
New Zealand | NZL |
Peru | PER |
Poland | POL |
Portugal | PRT |
Qatar | QAT |
Romania | ROU |
Russian Federation | RUS |
Singapore | SGP |
Serbia | SRB |
Slovak Republic | SVK |
Slovenia | SVN |
Sweden | SWE |
Thailand | THA |
Tunisia | TUN |
Turkiye | TUR |
Taiwan, China | TWN |
Uruguay | URY |
United States | USA |
Vietnam | VNM |
PISA 2012 supplemented the paper-based assessment with an optional computer-based assessment of mathematics and reading, in which 32 of the 65 participating countries and economies took part. In addition, PISA 2012 included an optional computer-based assessment of problem solving, in which 44 countries and economies participated. Forty-one specially designed computer-based items were developed for the assessment. Future PISA surveys will feature more sophisticated computer-based items as developers and item writers become more fully immersed in computer-based assessment and as its delivery becomes more sophisticated.
There were two reasons for including a computer-based mathematics assessment in PISA 2012. First, computer-based items can be more interactive, authentic and engaging than paper-based items. They can be presented in new formats (e.g. drag-and-drop), include real-world data (such as a large, sortable dataset), and use colour, graphics and movement to aid comprehension. Students may be presented with a moving stimulus or representations of three-dimensional objects that can be rotated, or have more flexible access to relevant information. New item formats can expand response types beyond verbal and written, giving a more rounded picture of mathematical literacy. Second, computers have become essential tools for representing, visualising, exploring, and experimenting with all kinds of mathematical objects, phenomena and processes, not to mention for realising all types of computations – at home, at school, and at work. In the workplace, mathematical literacy and the use of computer technology are inextricably linked.
Sample survey data [ssd]
To better compare student performance internationally, PISA targets a specific age of students. PISA students are aged between 15 years 3 months and 16 years 2 months at the time of the assessment, and have completed at least 6 years of formal schooling. They can be enrolled in any type of institution, participate in full-time or part-time education, in academic or vocational programmes, and attend public or private schools or foreign schools within the country. Using this age across countries and over time allows PISA to compare consistently the knowledge and skills of individuals born in the same year who are still in school at age 15, despite the diversity of their education histories in and outside of school.
The mathematical competencies being tested: These comprise aspects of mathematical literacy that apply in any environment, not just computer environments, and are assessed in every computer-based assessment item.
Competencies that cover aspects of mathematics and ICT: These require knowledge of doing mathematics with the assistance of a computer or handheld device. These are being tested in some – but not all – computer-based assessment items. The computer-based test may include assessments of the following competencies:
• making a chart from data, including from a table of values (e.g. pie chart, bar chart, line graph) using simple ‘wizards’;
• producing graphs of functions and using the graphs to answer questions about the functions;
• sorting information and planning efficient sorting strategies;
• using hand-held or on-screen calculators;
• using virtual instruments such as an on-screen ruler or protractor; and
• transforming images using a dialog box or mouse to rotate, reflect or translate the image.
ICT skills: Just as pencil and paper assessments rely on a set of fundamental skills for working with printed materials, computer-based assessments rely on a set of fundamental skills for using computers. These include knowledge of basic hardware (e.g. keyboard and mouse) and basic conventions (e.g. arrows to move forward and specific buttons to press to execute commands). The intention is to keep such skills to a minimal core level in every computer-based assessment item.
44 countries and economies participated in a computer-based assessment of problem solving, and among them, 32 participated in a computer-based assessment of reading and mathematics.
Name |
---|
Organisation for Economic Co-operation and Development |
Name |
---|
Australian Council for Educational Research |
Netherlands National Institute for Educational Measurement |
Service de Pédagogie Expérimentale at Université de Liège |
Westat (USA) |
Educational Testing Service (USA) |
National Institute for Educational Research (Japan) |
Name |
---|
Organisation for Economic Co-operation and Development |
Out of the 65 countries and economies that participated in PISA 2012, 44 also implemented the computer-based assessment (CBA) of problem solving. Of these, 12 countries and economies assessed only problem solving, while 32 also assessed mathematics and (digital) reading on computers.
In all 44 countries/economies, only a random sub-sample of the students who took the paper-based assessment (PBA) of mathematics was selected to sit the assessment of problem solving. However, as long as at least one student in a participating school was sampled for the computer-based assessment, all students in the PISA sample from that school received multiple imputations (plausible values) of performance in problem solving. This mirrors the procedure used to impute plausible values for minor domains in PISA (for instance, not all test booklets in 2012 included reading questions, but all students received imputed values for reading performance).
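How plausible values are used in analysis can be illustrated with a small sketch. The scores below are hypothetical (PISA datasets store five plausible values per domain, e.g. PV1CPRO to PV5CPRO for problem solving); the point is that any statistic is computed once per plausible value, and the five estimates are then averaged, with the variation across them contributing to the measurement error.

```python
import statistics

# Hypothetical data: three students, each with five plausible values
# of problem-solving performance (mimicking PV1CPRO..PV5CPRO).
students = [
    [512.3, 498.7, 505.1, 520.0, 509.4],
    [430.2, 441.8, 425.6, 438.9, 433.0],
    [605.7, 598.2, 610.4, 601.1, 607.9],
]

# Compute the statistic of interest (here, the mean score) once per
# plausible value, then average the five estimates to obtain the
# point estimate (Rubin's combining rule for multiple imputations).
per_pv_means = [statistics.mean(s[i] for s in students) for i in range(5)]
point_estimate = statistics.mean(per_pv_means)

# The spread of the five estimates measures the imputation variance,
# which is added to the sampling variance when building standard errors.
imputation_variance = statistics.variance(per_pv_means)
```

In real analyses the per-plausible-value estimates would also use the student and replicate weights supplied with the dataset; this sketch omits weighting for brevity.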
In all but four of the 44 countries/economies that assessed problem solving, the school samples for the CBA and the PBA coincided. As a consequence, in 40 countries/economies the main student dataset, containing the results of the paper-based assessments, and the CBA dataset have the same number of observations.
For more on sampling, refer to the report titled "PISA 2012 Results: Volume V," provided as an external resource.
The duration of the PISA 2012 computer-delivered assessment was 40 minutes. A total of 80 minutes of problem-solving material was organised into four 20-minute clusters. Students from countries not participating in the optional computer-based assessment of mathematics and digital reading did two of the clusters according to a balanced rotation design. Students from countries also participating in the optional computer-based assessment of mathematics and digital reading did two, one or none of the four problem-solving clusters according to a separate balanced rotation design. The optional computer-based component contained a total of 80 minutes of mathematics material and 80 minutes of reading material. The material for each domain was arranged in four clusters of items, with each cluster representing 20 minutes of testing time. All material for computer delivery was arranged in a number of rotated test forms, with each form containing two clusters. Each student did one form, representing a total testing time of 40 minutes.
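The construction of two-cluster forms described above can be sketched as follows. This is an illustrative enumeration, not the actual PISA 2012 rotation scheme (which is specified in the PISA 2012 Technical Report): with four 20-minute clusters, all ordered pairs of distinct clusters yield twelve forms in which each cluster appears equally often in each position.

```python
from itertools import permutations

# Hypothetical cluster labels for the four 20-minute problem-solving
# clusters; each test form combines two clusters (40 minutes in total).
clusters = ["C1", "C2", "C3", "C4"]

# All ordered pairs of distinct clusters: 12 candidate forms.
forms = [(first, second) for first, second in permutations(clusters, 2)]

# Balance check: each cluster appears in 3 forms in the first position
# (and, symmetrically, in 3 forms in the second position).
first_counts = {c: sum(1 for f in forms if f[0] == c) for c in clusters}
```

Randomly assigning sampled students across such a set of forms ensures that every cluster is administered equally often early and late in the session, controlling for position effects.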
The computer-based assessment included a variety of question types. Some required students to select or produce simple responses that can be compared directly with a single correct answer, such as multiple-choice or closed-constructed response items; these have a correct or incorrect answer and often assess lower-order skills. Others were more constructive, requiring students to develop their own responses; these were designed to measure broader constructs than those captured by more traditional surveys, allowing a wider range of acceptable responses and more complex marking that can include partially correct responses.
Start | End |
---|---|
2012 | 2012 |
This study is the product of a concerted effort between the countries participating in PISA, the experts and institutions working within the framework of the PISA Consortium, and the OECD.
Use of the dataset must be acknowledged with a citation. Example:
Organisation for Economic Co-operation and Development. World Programme for International Student Assessment, Computer-Based Assessment (PISA-CBA) 2012. Ref. WLD_2012_PISA-CBA_v01_M. Dataset downloaded from [URL] on [date].
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.
Name | Email | URL |
---|---|---|
OECD PISA | edu.pisa@oecd.org | http://www.oecd.org/pisa/home/ |
DDI_WLD_2012_PISA-CBA_v02_M_WB
Name | Affiliation | Role |
---|---|---|
Development Economics Data Group | The World Bank | Documentation of the DDI |
2014-07-20
DDI Document - Version 02 - (04/21/21)
This version is identical to DDI_WLD_2012_PISA-CBA_v01_M_WB, except that the country field has been updated to capture all the countries covered by the survey.
Version 01 (June 2014)