Long title, sorry. But Volume 46, Issue 8 (October 2009) of the Journal of Research in Science Teaching was devoted entirely to Scientific Literacy and Context in PISA Science. In one of the articles in this volume (Scientific Literacy, PISA, and Socioscientific Discourse: Assessment for Progressive Aims of Science Education), the authors used the term progressive science education in the way George DeBoer used it many years ago to summarize movements in science teaching that included public understanding of science, humanistic science education, context-based science teaching, STS, and socioscientific issues science education.
How can the aims of progressive science education be measured?
According to some, the Programme for International Student Assessment (PISA), which is coordinated by the Organization for Economic Co-operation and Development (OECD), can do so with its test of 15-year-old students in nearly 60 countries. The PISA assessment in science (there are also PISA tests in Reading and Mathematics), which is described in the first article in this volume, purports to assess:
- scientific knowledge and use of that knowledge
- understanding of the characteristic features of science
- awareness of how science and technology shape our world
- willingness to engage in science-related issues
The last administration of the science test was in 2006, when more than 40 countries participated. You can see sample test items here to get a feel for the nature of the questions used on the PISA science test.
I was happy to see a little bit of criticism in two of the articles in the journal, but overall I felt as if the journal was endorsing the PISA assessment. It's the criticism that I was interested in exploring, especially since science education around the world is very much structured around standards and the resultant high-stakes science achievement tests, which are used to measure student progress and, now, to assess teachers.
Some of the authors pointed to an article written by Douglas Roberts, in which he differentiates between two different kinds or visions of science teaching. Vision I emphasizes subject matter itself; Vision II emphasizes science in life situations in which science plays a key role.
According to many of the authors of this NARST issue, PISA has developed an assessment system that aligns "very well" with Vision II. And indeed, if you look at the sample of test items from the last PISA science test, there is an air of application and use of science. In this view, Vision II can be seen as progressive science education, and if PISA indeed claims to be able to "measure" these kinds of outcomes, then it would be an attractive instrument for science education.
But in my view, it's simply another large-scale test that does not really assess how students use science in lived experiences. Svein Sjøberg challenges the wisdom of PISA's claim to measure students' real-life experiences with science. Sjøberg is a professor of science education at the University of Oslo and director of another large-scale project that assesses students' attitudes about science. In his research on PISA, he points out that:
The main point of view is that the PISA ambitions of testing "real-life skills and competencies in authentic contexts" are by definition alone impossible to achieve. A test is never better than the items that constitute the test. Hence, a critique of PISA should not mainly address the official rationale, ambitions and definitions, but should scrutinize the test items and the realities around the data collection. The secrecy over PISA items makes detailed critique difficult, but I will illustrate the nature of the items with two examples from the released texts.