Students Lag in Science, So Says the National Center for Education Statistics

Written by Jack Hassard

On August 25, 2009

There was a story on cnn.com today that caught my attention, entitled U.S. students behind in math, science, analysis says.  The analysis was written by the National Center for Education Statistics and was a summary of several international assessments, including the Trends in International Mathematics and Science Study (TIMSS) and the Program for International Student Assessment (PISA, 2006 results).

The story was a report of a brief talk given by the U.S. Secretary of Education, Arne Duncan, in which he drew on the results of the “Condition of Education” report issued by the National Center for Education Statistics.  You can see the full report by clicking on the previous link.  The basic question in the report was: How do U.S. students compare with their peers in other countries?  For all the detail you can examine, the analysis comes down to this:

The performance of U.S. students neither leads nor trails the world in reading, mathematics, or science at any grade or age (quote from report’s summary).

The Secretary of Education uses the results of the analysis to say that “we are lagging the rest of the world, and we are lagging it in pretty substantial ways.  I think we have become complacent. We’ve sort of lost our way.”  Unfortunately, politicians believe that the data represent an accurate picture of student learning and use them to drum up support for their policies. Yet U.S. scores have not changed since 2000.

If you look at the PISA results, which are for the year 2006, U.S. 15-year-old students scored higher than some peers and lower than others in the major areas tested by PISA: overall scientific literacy, identifying scientific issues, explaining phenomena scientifically, and using scientific evidence.  Rank ordering the countries by score (similar to the way we rank order competitive sports), in overall scientific literacy Finland leads the way with a score of 563 (500 is the average), the U.S. scores 489 (21st), and Mexico scores 410 (30th).
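To see just how little a league table of this kind contains, here is a minimal sketch in Python. It is not anything PISA publishes; the three scores are the ones quoted above, so the printed ranks reflect only this illustrative three-country subset, not the official 21st and 30th positions.

```python
# A minimal sketch of the kind of league table PISA reporting produces.
# Only the three scores quoted above are included; the full 2006 table
# covers 30 OECD countries, so the ranks printed here are positions
# within this illustrative subset, not the official rankings.

scores = {
    "Finland": 563,        # top of the 2006 overall scientific literacy scale
    "United States": 489,  # below the scale average of 500
    "Mexico": 410,         # lowest-scoring OECD country that year
}

# Rank ordering by score, highest first -- just like a sports standings table.
for rank, (country, score) in enumerate(
        sorted(scores.items(), key=lambda kv: kv[1], reverse=True), start=1):
    print(f"{rank}. {country}: {score} ({score - 500:+d} vs. the 500 average)")
```

The sketch underscores the point that follows: a ranking reduces each country to a single number, with none of the variation within its schools visible.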

Is the sky falling?  Have we lost our way?  Should we pay math and science teachers more?  Can we educate our way to a better economy?

I’ve written before that the results of international comparisons and other large-scale assessments need to be carefully scrutinized before anyone makes sweeping generalizations about the fitness of a country’s or state’s educational system.  For example, the U.S. has more than 15,000 independent school systems; an average score based on a sit-down test of 48 to 60 items cannot describe the qualities or inequalities inherent in any country’s schools.

Results as reported by PISA and TIMSS help shape the public image of science education (and mathematics education), and it is unfortunate that educators allow this to happen.  Dr. Svein Sjøberg of the University of Oslo, in a publication entitled PISA and Real Life Challenges: Mission Impossible, questions the use of tests such as PISA and TIMSS.  He informs us that:

The PISA project sets the educational agenda internationally as well as within the participating countries. PISA results and advice are often considered as objective and value-free scientific truths, while they are, in fact embedded in the overall political and economic aims and priorities of the OECD. Through media coverage PISA results create the public perception of the quality of a country’s overall school system. The lack of critical voices from academics as well as from media gives authority to the images that are presented.

PISA measures only three areas of the curriculum (math, science, reading), according to Dr. Sjøberg, and the implication is that these are the most important areas, while areas such as history, geography, social science, ethics, foreign language, practical skills, arts and aesthetics are not as important to the goals of PISA.  TIMSS, according to his analysis (and I would agree), is based on a science curriculum that many science educators want to replace, yet it uses test items that could have been used 50 years ago.  In general, the public is convinced that these international tests are valid ways of measuring learning and that the results can be used to draw significant conclusions about the effectiveness of teaching and learning.

If you live in the world of psychometrics and modeling, the results gathered by these international testing bodies are a dream come true. Sjøberg puts it this way:

PISA (and even more so TIMSS) is dominated and driven by psychometric concerns, and much less by educational. The data that emerge from these studies provides a fantastic pool of social and educational data, collected under strictly controlled conditions – a playground for psychometricians and their models. In fact, the rather complicated statistical design of the studies decreases the intelligibility of the studies. It is, even for experts, rather difficult to understand the statistical and sampling procedures, the rationale and the models that underlie the emergence of even test scores. In practice, one has to take the results at face value and on trust, given that some of our best statisticians are involved. But the advanced statistics certainly reduce the transparency of the study and hinder publicly informed debate.
