PISA: Can this test measure the outcomes of progressive science education?

Written by Jack Hassard

On November 6, 2009

Long title, sorry.  But Volume 46, Issue 8 (October 2009) of the Journal of Research in Science Teaching was devoted entirely to Scientific Literacy and Context in PISA Science.  In one of the articles in this issue (Scientific Literacy, PISA, and Socioscientific Discourse: Assessment for Progressive Aims of Science Education), the authors used the term progressive science education in the way George DeBoer used it many years ago to summarize movements in science teaching that included public understanding of science, humanistic science education, context-based science teaching, STS, and socioscientific issues science education.

How can the aims of progressive science education be measured?

According to some, the Programme for International Student Assessment (PISA), which is coordinated by the Organization for Economic Co-operation and Development (OECD) and assesses 15-year-old students in nearly 60 countries, can do just that.  The PISA assessment in science (there are also PISA tests in Reading and Mathematics), which is described in the first article in this volume, purports to assess

  • scientific knowledge and use of that knowledge
  • understanding of the characteristic features of science
  • awareness of how science and technology shape our world
  • willingness to engage in science-related issues

The last administration of the science test was in 2006, when more than 40 countries participated. You can see sample test items here to get a feel for the nature of the questions used on the PISA science test.

I was happy to see a little bit of criticism in two of the articles in the research journal, but overall I felt as if the journal was endorsing the PISA assessment.  It’s the criticism that I was interested in exploring, especially since science education around the world is very much structured around standards and the resultant high-stakes science achievement tests, which are used to measure student progress and, now, to assess teachers.

Some of the authors pointed to an article written by Douglas Roberts, in which he differentiates between two visions of science teaching.  Vision I emphasizes subject matter itself; Vision II emphasizes science in life situations in which science plays a key role.

According to many of the authors of this NARST issue, PISA has developed an assessment system that aligns “very well” with Vision II.  And indeed, if you look at the sample test items from the last PISA science test, there is an air of application and use of science.   In this view, Vision II can be seen as progressive science education, and if PISA can indeed “measure” these kinds of outcomes, then it would be an attractive instrument for science education.

But in my view, it’s simply another large-scale test that really does not assess how students use science in lived experiences.  Svein Sjøberg challenges the wisdom of PISA’s claim to measure students’ real-life experiences with science.  Sjøberg is a professor of science education at the University of Oslo and director of another large-scale project that assesses students’ attitudes about science.  In his research on PISA, he points out that:

The main point of view is that the PISA ambitions of testing “real-life skills and competencies in authentic contexts” are by definition alone impossible to achieve. A test is never better than the items that constitute the test. Hence, a critique of PISA should not mainly address the official rationale, ambitions and definitions, but should scrutinize the test items and the realities around the data collection. The secrecy over PISA items makes detailed critique difficult, but I will illustrate the nature of the items with two examples from the released texts.

Sjøberg provides more details.   I’ll talk more about this in the days ahead.  I’ll also come back to some of the criticism that was included in two of the research papers in the Journal of Research in Science Teaching.  In the meantime, you might enjoy reading some of the abstracts in the JRST volume, and Sjøberg’s article.
