On Sunday, the Atlanta Journal-Constitution, a Cox newspaper, published the results of its investigation into “cheating” in American schools. The article was entitled Suspicious Scores Across the Nation, and you can read it by following the link. The article was subtitled “Cheating Our Children.”
I was immediately suspicious of the report that the Journal-Constitution published. The paper has put into place an aggressive team of watchdog journalists, database specialists, and investigative reporters. Whose interests motivate this team and this newspaper? After reading the report, my suspicions were reaffirmed. Let me explain.
The data (a collection of facts, observations, and measurements) available to this team is a database specialist’s fantasy. Fifty states provided standardized testing data from 69,000 schools and 14,743 districts, covering 13 million students and representing 1.6 million records. Most states provided the data immediately (public records law), though a few departments of education hedged a bit before eventually sending their data.
According to the researchers,
For each state, grade, cohort and year, we created a linear regression model, weighted by the number of students in a class, and compared the average score for a class with the score predicted by the model based on the previous year’s average score. We then calculated a p-value — an estimated probability that such a difference would occur by chance — using standardized residuals and the “T” probability distribution, which adjusts the probability upward for classes with fewer students. (Links Mine)
What the researchers looked for were scores rising or falling with probabilities of less than 0.05. These were flagged. Maybe the bubble sheets were erased and correct answers added? That would mean that someone cheated. Wouldn’t it? Or could there be other explanations?
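Before getting to that, it helps to make the procedure concrete. Here is a minimal sketch, in Python, of the kind of screen the quoted passage describes: a weighted regression of this year’s class average on last year’s, standardized residuals, and a two-tailed p-value from a t distribution. The column names, the exact standardization, and the degrees-of-freedom choice are my assumptions; the newspaper did not publish its code.

```python
# A rough sketch of the screen the AJC describes, not their actual code.
# Column names and statistical details are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

def flag_improbable_classes(classes: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    """classes: one row per class/cohort with columns
       'prev_avg'   : previous year's average score for the cohort
       'curr_avg'   : current year's average score
       'n_students' : number of students tested this year."""
    X = sm.add_constant(classes["prev_avg"])
    # Weighted regression: predict this year's average from last year's,
    # giving larger classes more weight, as the article describes.
    fit = sm.WLS(classes["curr_avg"], X, weights=classes["n_students"]).fit()
    resid = classes["curr_avg"] - fit.predict(X)
    std_resid = resid / resid.std(ddof=2)  # standardized residuals
    # Two-tailed p-value from a t distribution whose degrees of freedom shrink
    # with class size, so the same gap counts as less improbable in a small class.
    dof = classes["n_students"].clip(lower=2) - 1
    p_value = 2 * stats.t.sf(np.abs(std_resid), df=dof)
    out = classes.copy()
    out["std_resid"] = std_resid
    out["p_value"] = p_value
    out["flagged"] = out["p_value"] < alpha  # the 0.05 threshold in the article
    return out
```

A flag produced this way is only a statement that the change was unlikely under the model’s assumptions (stable cohorts, a linear relationship between years); it says nothing about why the change happened.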
Is cheating what caused some scores to change at a probability level that you wouldn’t expect? According to the Journal-Constitution,
A statistical analysis cannot prove cheating. It can only identify improbable events that can be caused by cheating and should be investigated.
If it smells like cheating, it must be cheating.
But if you read the Journal-Constitution’s initial article, and the one they published today (Cheating our children: AJC’s testing investigation spurs action), the only suggestion that the Journal makes is that the flagged scores might indicate cheating. And indeed, in the latter article, Georgia U.S. Senator Johnny Isakson said that this report is troubling, and that if the districts don’t do something about it, then the governors should. If they don’t, then “I don’t think Congress should look the other way.” I am sorry, but Congress has been looking the other way since the NCLB Act was signed into law ten years ago.
Here is what the Atlanta Journal staff had to say about their research model:
A statistical analysis cannot prove cheating. It can only identify improbable events that can be caused by cheating and should be investigated.
Ideally we would look at how individual student test scores change from year to year, but federal privacy regulations precluded access to that data. The approximate cohorts we used were the only available substitute. It is unlikely that two groups of students in a cohort are perfectly identical. Urban districts in particular have high student mobility.
In the model that the data analysts used, average scores from one year to the next were not necessarily based on the same population of students. They didn’t have student-level data; they only had average data for a school by subject. So it is possible that they are making predictions based on unreliable data.
High student mobility might point to other explanations for the score changes, given the demographics of the schools and districts that were highlighted. For example, the districts named in the paper (Amarillo; Atlanta; Baltimore; Chicago; Dallas; Detroit; East St. Louis; Fresno; Gary, IN; Houston; Los Angeles; and Mobile County, AL) all reported that at least 64% of their students were eligible for free or reduced-price meals, and the poverty concentration of these schools averaged 80%. Mobility is high in these districts.
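To see why mobility matters, here is a toy simulation, using entirely hypothetical class sizes, turnover rates, and score distributions, of how student turnover alone can move a small school’s year-to-year average. It is a sketch of the concern, not a reanalysis of the AJC’s data.

```python
# Toy simulation (hypothetical numbers): how much can a class average swing
# from one year to the next when nothing changes except the students tested?
import numpy as np

rng = np.random.default_rng(0)

def year_to_year_gap(n_students, turnover, mean=50.0, ability_sd=10.0, noise_sd=4.0):
    """Return |year1_avg - year2_avg| for one class whose true achievement is
    unchanged; only test noise and student turnover differ between years."""
    ability = rng.normal(mean, ability_sd, n_students)
    year1 = ability + rng.normal(0, noise_sd, n_students)
    keep = int(n_students * (1 - turnover))  # students who stay both years
    ability2 = np.concatenate(
        [ability[:keep], rng.normal(mean, ability_sd, n_students - keep)]
    )
    year2 = ability2 + rng.normal(0, noise_sd, n_students)
    return abs(year1.mean() - year2.mean())

# A stable class versus a smaller, high-mobility class, repeated many times.
for label, n, turnover in [("low mobility, n=60", 60, 0.05),
                           ("high mobility, n=25", 25, 0.40)]:
    gaps = [year_to_year_gap(n, turnover) for _ in range(1000)]
    print(label, "median year-to-year gap:", round(float(np.median(gaps)), 2))
```

Under these assumed numbers, the smaller, high-mobility class shows noticeably larger swings, which is exactly the kind of jump a regression screen built on the assumption of stable cohorts would flag as improbable.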
Critique of the Methodology Used by the Atlanta Journal Data Analysts
Gary Miron, professor of education at Western Michigan University, was one of four academics and test specialists who advised USA Today and its affiliated Gannett newspapers on a multistate analysis of irregularities in assessment data. In his article published on The Answer Sheet, Dr. Miron writes about some of his concerns regarding the research methodology used by the Atlanta Journal-Constitution.
My review, however, yielded serious concerns about the data used, the methods of analysis employed, and the conclusions drawn. I shared these concerns with journalists at the Dayton Daily News, which is one of the Cox affiliates involved in this story.
To be clear, the Cox analysis may accurately detect large variations in assessment results from year to year. But my own analysis of the data suggests that these irregularities are less likely due to actual cheating than due to mobility in student population (recall the lack of student-level data). Although the Cox news articles on this study offer a disclaimer that their analysis does not actually prove cheating, this disclaimer should be expanded considerably.
The evidence is that the Atlanta Journal may be practicing sensationalism: throwing the data up against the wall and hoping that some of it will stick. The map they published is impressive, but it raises more questions than it answers, and the writers of the article were quick to draw the “cheat” card. “Cheating” was mentioned 57 times in the first article of 3,000 words and 31 times in the second article, which was 978 words long. Mobility, or any mention of changing student populations, came up only once in both articles. One might raise the issue that the manner in which the paper published this article is self-serving. Are they really interested in uncovering serious issues facing our nation’s schools, or are they more interested in selling newspapers and receiving awards for investigative reporting? I really don’t know. But the thought crossed my mind.
Dr. Miron details his concerns about the methodology the newspaper used:
- As noted, the analysis is based on school-level data and not individual student-level data. Accordingly, it was not possible to ensure that the same students were in the group in both years.
- The analysis of irregular jumps in test scores should have been coupled with irregularities in erasure data where this data was available.
- The analysis by Cox generates predicted values for schools, but this does not incorporate demographic characteristics of the student population.
- The limited details available on the study methods made it impossible to replicate and verify what the journalists were doing. Further, the rationale was unclear for some of the steps they took.
Lisa Delpit, author of “Multiplication Is for White People”: Raising Expectations for Other People’s Children, writes:

I am angry at the machinations of those who, with so little knowledge of learning, of teachers, or of children, are twisting the life out of schools.
She adds that the current use of standardized tests, which promotes competition between schools and is used to evaluate teachers and principals and to determine their salaries, brings out the worst in adults. This should be no surprise to the Atlanta Journal staff, but they fail to pursue the data beyond blaming schools for the current state of our schools. This is not about cheating.
According to Dr. Delpit, “the problem is that the cultural framework of our country has, almost since its inception, dictated that ‘black’ is bad and less than and in all arenas ‘white’ is good and superior.”
Pointing to urban schools as the locus of cheating and surprising test results leaves us with many questions. Why don’t we talk about how we got ourselves into this mess in the first place? What has been the effect of the NCLB policy, which has turned schools into test factories interested only in finding out what low level of knowledge students might have learned? We simply cannot continue to test without really reforming education. And writing common standards (objectives) for all kids regardless of where they live, with little flexibility or input from teachers, will only reinforce the authoritarian nature of schooling. Testing students doesn’t make them smarter, any more than weighing the cows makes them heavier.
I agree with Dr. Delpit when she calls for us to create excellence in urban classrooms, and she suggests that we must do the following:
- Recognize the importance of a teacher and good teaching, especially for the “school dependent” children of low-income communities.
- Recognize the brilliance of poor, urban children and teach them more content, not less.
- Whatever methodology or instructional program is used, demand critical thinking while at the same time assuring that all children gain access to “basic skills”—the conventions and strategies that are essential to success in American society.
- Provide children with the emotional ego strength to challenge racist societal views of their own competence and worthiness and that of their families and communities.
- Recognize and build on children’s strengths.
- Use familiar metaphors and experiences from the children’s world to connect what students already know to school-taught knowledge.
- Create a sense of family and caring in the classroom.
- Monitor and assess students’ needs and then address them with a wealth of diverse strategies.
- Honor and respect the children’s home cultures.
- Foster a sense of children’s connection to community, to something greater than themselves. (Delpit, Lisa (2012-03-20). “Multiplication Is for White People”: Raising Expectations for Other People’s Children. Perseus Books Group. Kindle Edition.)
What do you think about the Atlanta Journal’s inference that cheating might be the reason for the unexpected changes in test scores? Do you think they are helping solve a problem, or are they creating problems? What can be done to improve urban schools?