On April 1st, 2014, the official results from PISA 2012 Problem Solving were published! Wow! About 85,000 students from 44 countries and economies participated in our computer-based assessment! Now everyone can see the results of what has been prepared, discussed, pretested, and refined for many years. A 250-page volume presents the main results:
- OECD (2014). PISA 2012 Results: Creative Problem Solving: Students’ Skills in Tackling Real-Life Problems (Volume V). Paris: OECD Publishing. doi:10.1787/9789264208070-en
What are the central results? In the words of the OECD (2014, Problem Solving Results, p. 48):
- “Students in Singapore and Korea, followed by students in Japan, score higher in problem solving than students in all other participating countries and economies.
- On average across OECD countries, about one in five students is only able to solve very straightforward problems – if any – provided that they refer to familiar situations. By contrast, fewer than one in ten students in Japan, Korea, Macao-China and Singapore are low-achievers in problem solving.
- Across OECD countries, 11.4% of 15-year-old students are top performers in problem solving, meaning that they can systematically explore a complex problem scenario, devise multi-step solutions that take into account all constraints, and adjust their plans in light of the feedback received.
- Problem-solving performance is positively related to performance in other assessed subjects, but the relationship is weaker than that observed between performance in mathematics and reading or between performance in mathematics and science.
- In Australia, Brazil, Italy, Japan, Korea, Macao-China, Serbia, England (United Kingdom) and the United States, students perform significantly better in problem solving, on average, than students in other countries who show similar performance in mathematics, reading and science. In Australia, England (United Kingdom) and the United States, this is particularly true among strong and top performers in mathematics; in Italy, Japan and Korea, it is particularly true among moderate and low performers in mathematics.”
The problem-solving score for Germany is 509, significantly higher than the OECD average of 500 (standard deviation: 100). This score is statistically indistinguishable from the scores of England (UK), Estonia, France, the Netherlands, Italy, the Czech Republic, the United States, Belgium, Austria, and Norway: all these countries perform at the same level.
Countries significantly above OECD average are Singapore, Korea, Japan, Macao-China, Hong Kong-China, Shanghai-China, Chinese Taipei, Canada, Australia, Finland, England (UK), Estonia, France, Netherlands, Italy, Czech Republic, Germany, United States, and Belgium.
Countries significantly below OECD average are Sweden, Russian Federation, Slovak Republic, Poland, Spain, Slovenia, Serbia, Croatia, Hungary, Turkey, Israel, Chile, Cyprus, Brazil, Malaysia, United Arab Emirates, Montenegro, Uruguay, Bulgaria, and Colombia.
Some Background on Interactive Problem Solving
Since I became Chairman of the International Expert Group on Problem Solving in 2010, my international colleagues (Beno Csapo, Hungary; John Dossey, USA; Art Graesser, USA; Detlev Leutner, Germany; Romain Martin, Luxembourg; Richard Mayer, USA; Min Min Tan, Singapore) and I have changed the way the OECD sees problem solving. ACER (Melbourne, Australia), with Barry McCrae, Ray Philpot, and Dara Ramalingam, helped to realize our ideas. Whereas problem solving was treated as a primarily analytical competence in the PISA 2003 wave, PISA 2012 made the shift to interactive problem solving (see the Problem Solving Framework).
What does that mean? Analytical problem solving requires a person to make use of given information and to combine all of it under the given restrictions (see, for example, the item “Birthday Party”, available from http://cbasq.acer.edu.au, Username: public, Password: access, in the section “Problem Solving”). Interactive problem solving, on the other hand, requires a person to interact with the environment, to gather additional information about it during this interaction, and to learn how to use this knowledge for reaching a given goal (see, for example, the item “Climate Control” on the same site).
Interactive problem solving can only be assessed if the student can interact with the environment. PISA 2012 Problem Solving used computer-based assessment, making that interaction possible. According to our framework, problem-solving competence is defined as “an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen” (OECD, 2013, Framework on Problem Solving).
“In the 28 OECD countries and 16 partner countries and economies that participated in the assessment of problem solving, the survey was conducted after the paper-based assessment of mathematics, reading and science. In countries that also assessed mathematics and reading on computers, these computer-based tests were administered at the same time as the problem-solving assessment. The 16 units of the problem-solving assessment were grouped into four clusters, each of which was designed to be completed in 20 minutes. Each student assessed was given either one or two clusters, depending on whether the student was also participating in the computer-based assessment of mathematics or reading. In all cases, the total time allocated to computer-based tests was 40 minutes.” (OECD, 2014, Problem Solving Results, p. 32).
“Most interactive units included in the PISA 2012 assessment of problem solving belong to one of two classes of problems studied in the literature, “MicroDYN” systems and “finite-state automata”. In both cases, exploration and control of an unknown system are the two main tasks for the student. The single exception is a resource-allocation problem, in which experimental interaction with the test scenario is needed to uncover important information about the available resources.” (OECD, 2014, Problem Solving Results, p. 34).
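The finite-state-automaton tasks described above can be sketched in a few lines of code: the student faces a device with hidden internal states and must explore it by pressing buttons. The state names, buttons, and transition table below are invented for illustration only; they do not reproduce an actual PISA item.

```python
# A minimal sketch of a finite-state-automaton task. The device's states,
# inputs, and transitions are hypothetical, not taken from a real PISA unit.

class FiniteStateDevice:
    """A device whose internal state changes in response to button presses."""

    def __init__(self):
        # Transition table: (current_state, input) -> next_state
        self.transitions = {
            ("off", "power"): "standby",
            ("standby", "power"): "off",
            ("standby", "mode"): "cooling",
            ("cooling", "mode"): "heating",
            ("heating", "mode"): "standby",
        }
        self.state = "off"

    def press(self, button):
        """Apply an input; undefined combinations leave the state unchanged."""
        self.state = self.transitions.get((self.state, button), self.state)
        return self.state

# Exploration: the transition table is unknown to the student, who must
# infer it by pressing buttons and observing the resulting states.
device = FiniteStateDevice()
for button in ["power", "mode", "mode"]:
    device.press(button)
print(device.state)  # -> heating
```

The student's task in such a unit is first to uncover the transition structure through exploration, and then to use it to steer the device into a given target state.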
The MicroDYN and MicroFIN tasks are a result of our Heidelberg research, funded by the German Research Foundation (DFG) from 2007 to 2014 (thanks to the Special Priority Program SPP 1293; see my older blog entry on that program [in German]). The idea is to use a number of minimally complex items with varying content and varying difficulty. In each item, the participant first has to find out (e.g., within 3 minutes) how the input variables are related to the output variables; second, the participant is given a set of goal values for the output variables that have to be reached (e.g., within another 2 minutes) by properly controlling the input variables.
What has to be done next? From a basic-research perspective, we are looking for a deeper understanding of the relationship between problem solving and other cognitive abilities such as intelligence. The extension of individual problem solving to collaborative problem solving (CoPS) is also an issue at hand, because the OECD wants to assess this dimension in PISA 2015 (see the draft framework; Art Graesser, Memphis, will be responsible for the CoPS framework). From the perspective of applied psychological research, we are interested in training programs: how to improve problem-solving competencies by means of teaching and learning.
–> here is the German site of the PISA study
–> here is an interview with Andreas Schleicher about PISA 2012 Problem Solving (from YouTube)
–> see a short description of the results by Francesco Avvisati here
–> PISA 2012 Problem Solving via TUM (in German)