
Thursday, March 29, 2012

I'm not saying the AJC's nationwide 7-month investigation into school cheating is complete bullshit — I'm just saying there are some 'irregularities'

[Image: The AJC's schools cheating story - CL File Photo]

On Sunday, the Atlanta Journal-Constitution published a series of stories suggesting that, according to its own seven-month investigation, "196 of the nation's 3,125 largest school districts had a high degree of suspicious results on standardized test scores, which could point to instances of cheating."

I'm about to tell you, in a very long and involved way, two things about that statement:

1. That's technically true, and there are probably instances of cheating flagged within that 196 number, but that number is most likely tremendously overstated and the data used to arrive at it is deeply flawed.

2. The paper knew this and decided to publish anyway, because it didn't have the time, resources, or desire to dive deeper into these numbers.

I say this based on conversations I've had with school administrators, detailed responses by the districts themselves, and an expert who advised the paper and told it specifically why these numbers were not only wrong, but irresponsible to publish. In fact, I'm more certain of my conclusions than you should be of the notion that the AJC's report indicates widespread cheating on the level the story asserts.

Let's just say I want to start a conversation about what these irregularities show. That logic and wording was good enough for the AJC to all but indict nearly 200 school districts as harboring cheaters - and to hire a marketing firm to help it do so - so it's good enough for me. Jump with me and let's discuss:

Gary Miron had seen this before. He was one of four outside experts brought in to advise USA Today as it worked on a large investigative report about cheating on school tests. ("Testing the System," which ran in March of last year.) When USAT approached him, he says, the reporters were very excited about the data they had, which showed jumps in test scores across the nation that were statistically suspect, and they were anxious to publish.

Calm down, he told them. You've just taken the first step: You've analyzed group data. Yes, the irregularities you've found could suggest cheating. But, he said, it could be explained away legitimately, in multiple ways. One obvious way is student migration: In large school districts with 20 to 40 percent yearly change, this can explain the large increases (or decreases) in scores. But there are many other possibilities, like teacher change in small schools, or the unique student migration of special needs schools, or rezoning taking effect. He said the paper needed to take three more steps beyond the first step (analyzing group data):

2nd step: It must obtain individual data, to see abnormal gains in individual performance year-to-year. Then you can match up with group gains and see if it's just one student copying from another, or entire classrooms/schools showing these abnormal gains.

3rd step: It must obtain erasure data. Private testing companies keep this to show, say, an abnormal number of wrong answers erased and made correct (in instances, for example, where the teacher or tester was correcting student work).

4th step: Go to the districts with your findings. District by district, school by school, they may have logical answers to explain the irregularities.

Each step, he said, will greatly reduce your number of suspected cheaters, but by the end, you can be extraordinarily confident in your findings.
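Miron's "first step" - screening group data for statistical outliers - can be sketched in a few lines. Everything below is hypothetical: the district names and scores are made up, and a simple leave-one-out z-score stands in for the far more involved regression analysis the papers actually ran. But it shows how a group-data screen flags a district without, by itself, proving anything about cheating.

```python
# Sketch of a "step 1" group-data screen: flag districts whose
# year-over-year average-score gains are statistical outliers.
# All numbers are hypothetical; a real analysis would control for
# demographics, school size, migration, and more.
from statistics import mean, stdev

scores = {
    # district: (last year's avg scale score, this year's avg scale score)
    "District A": (640, 644),
    "District B": (655, 652),
    "District C": (620, 671),  # an implausibly large jump
    "District D": (660, 663),
    "District E": (645, 641),
}

gains = {d: new - old for d, (old, new) in scores.items()}

# Compare each district's gain against the mean and spread of the
# others (leave-one-out), so one huge outlier can't hide itself by
# inflating the overall standard deviation.
flagged = []
for d, g in gains.items():
    others = [v for k, v in gains.items() if k != d]
    mu, sigma = mean(others), stdev(others)
    if (g - mu) / sigma > 3:
        flagged.append(d)

print(flagged)  # ['District C']
```

Note that the screen only says District C's jump is unlikely by chance; student migration, rezoning, or a teacher change could still explain it, which is exactly why Miron insisted on steps 2 through 4.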

That's exactly what USAT did, and in conjunction with local papers, produced damning evidence of cheating in a handful of districts around the country. (Miron is mentioned in one of them.)

A week ago Monday, Miron was asked by the Cox newspaper in Dayton (the sister pub to the AJC) to go over the group data the AJC had compiled on 3,000-plus school districts, which alleged possible cheating at 196 of them. He reviewed it and thought, I see problems, big problems, but no big deal. This is what happens in the beginning stages of just such a long project.

He had no idea that the paper was six days from publishing the full report - and had, in fact, the day before run a huge tease on its Sunday Page 1A touting the investigation.

A day or two later (he can't remember if it was Tuesday or Wednesday of last week), he was involved in a conference call with the Dayton editor. He registered his concerns about the methodology, which were more technical (demographic data ... regression analysis ... whatever, Statistics 101 was a long time ago). He was told, OH, we've changed the data already and made some of those adjustments. Odd, he thought. In two days? Anyway, he continued with two main points: a) he still had questions about the group data and the methodology, and b) just as with USAT, he made clear that the data they had was only the beginning, saying the paper needed to take steps 2 through 4 (as outlined above) so that it could narrow this most-likely inflated figure of 196 to one the AJC could proudly stand behind.

He says he was told about each step:

Student data: Tough to get, often impossible to get, and besides, no time. This story is running Sunday.

Erasure data: Too expensive.

Contact individual school districts: We've already contacted some (Houston is the only large-district example I could find at that point), he was told, and we'll contact a few others (Nashville later in the week, Dallas late Friday afternoon, etc.), and, did you hear us? This is running Sunday.

Miron was astonished. "I don't care how they couch their explanation, these stories imply there is systematic cheating at these schools," he told me. "They've shown a good methodology for irregularity, but you can't attribute that to cheating."

No more astonished than Jon Dahlander, the public information officer for the Dallas Independent School District. On Friday afternoon, he received a call from the AJC, alerting him that Dallas' school system was flagged in the AJC report. He was not asked to respond, but was told the report was going up over the weekend, so he was asked for his cellphone number in case the paper wanted to follow up on Sunday. (I guess to, I dunno, ask "How did THAT nutkick feel?") He did find an email address for a reporter at the paper so he could quickly give some sort of response. He says two hours later he received a call from NBC Nightly News, asking him to respond to the report.

"Respond?" he said. "I haven't even seen it."

How did NBC Nightly News see it? Because the paper had apparently hired a TV marketing firm, A-1 Broadcast, to pimp the project. At week's end, again before the story was close to hitting newsstands, A-1 emailed to media outlets around the country this 1998-font-tastic bit of awesome.
AJC-A1-PressRelease.pdf
As you can see, the paper was gearing up for its big PR push even as it was first notifying some of the school districts included.

Once the story came out, the AJC wasted no time in patting itself on the back. Sunday evening it posted a story saying how amazing the investigation was. It of course noted details to back up this claim, like this paragraph:

The AJC story was picked up by more than 300 news outlets around the country, including the Huffington Post, ABC News, the Washington Post, NBC Nightly News and MSNBC. Several news outlets in areas identified in the report ran their own stories, including The Tennessean and The Houston Chronicle. The story also created a social media buzz, generating several hundred tweets on Twitter and numerous mentions on Facebook.

(At the time of this posting, it was impossible to verify the claim of "numerous mentions on Facebook." If true, however, that's totally sweet.)

The next two paragraphs in that follow-up story were telling, though. They claimed the following:

Reaction by districts identified in the AJC's analysis varied widely. Some said they would initiate deeper reviews, while others challenged the findings and defended their schools' integrity. The newspaper analyzed test results for 69,000 public schools and found high concentrations of suspect math or reading scores in school systems from coast to coast.

After reviewing the AJC's findings, the superintendent of the Houston school district told the Houston Chronicle that he will review the AJC results with principals and may put in place additional test monitors.

Ehhh, not exactly. Houston, which had more time than most to respond, did produce a long and even-handed statement that basically said, look, we take this seriously, and if they've helped us identify cheating, good on 'em.

However, the district took pains to point out that it - like Dallas, like Nashville, like the Ohio schools named as part of the dreaded 196 - already does far more accurate yearly testing to find cheaters. And that, like those other districts, it has fired teachers when its findings (which use all the details Miron discussed) prove cheating occurred. Also, in its statement (you can read it here), the district pointed out problems with the AJC's methodology that mirrored what Miron told the paper.

Nashville also came out swinging, putting out a press release that called the report basically useless as well as crafting a detailed examination of the methodology problems. (The paper has already removed one school from its report based on Nashville feedback.) Also, the Nashville paper held a live chat with a district administrator which was supposed to include the AJC editor to defend the project, but the AJC never joined the chat. Miron wrote a Washington Post blog post to detail his concerns. Alabama also issued a release. (I've lost it. Trust me.) And Dallas has just issued its detailed problems with the methodology, discussing how it's been doing more accurate testing (and firing for cheating abuses) since 2005. You can read it here:
Response_to_AtlantaJournalConstitution.pdf

Since then, the AJC has largely dismissed claims the report is flawed. They've said things like, well, high migration took place in Cleveland, and they didn't get flagged as one of the 196. Which is not really the point. Looking for one example of an outcome you're defending in a data set of 3K-plus districts and saying "that proves the method is accurate" doesn't wash. (Yes, I've reached out to the project manager of the AJC piece. He hasn't gotten back to me. We'll have much to discuss when he does.)

They've also said that other experts agree that the method used can accurately show irregularities that could be explained by cheating.

As Miron notes, incredulously and loudly: "I agree! It is a good methodology for irregularity. But you can't attribute that to cheating. There are way too many other explanations."

The districts are continuing to respond, but most news agencies, given the AJC's credibility in light of the Atlanta school cheating scandal, have taken the findings nationwide without much questioning. The districts themselves are even tiring of discussing it. A Nashville school spokeswoman told me:

At this point, Nashville has made it very clear to the Atlanta Journal-Constitution and others the methodology used that resulted in our district being in their report was flawed and created results that were wrong. We actually question if any of the results they found in Tennessee are valid based on the flawed methodology that factored zeros in as scores for students who for whatever reason were not tested and their inclusion of scores from an alternative assessment that has a totally different scale lower than the TCAP. Similar concerns have been echoed by other districts and by independent statisticians since this story appeared.
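The spokeswoman's complaint about zeros is easy to demonstrate with toy numbers. In this hypothetical sketch (the scores and counts are invented for illustration), nobody's performance changes at all from one year to the next, yet recording untested students as zeros in year one manufactures a large "gain" in year two - exactly the kind of irregularity a group-data screen would then flag:

```python
# Hypothetical illustration of Nashville's complaint: averaging in
# zeros for untested students drags down one year's mean, so the next
# year's (fully tested) mean looks like a suspicious gain even though
# no student actually improved.
from statistics import mean

true_scores = [650] * 100          # 100 students, same true score both years

# Year 1: 10 students weren't tested, but are recorded as zeros.
year1 = true_scores[:90] + [0] * 10
# Year 2: everyone is tested.
year2 = true_scores

print(mean(year1))                 # 585: deflated by the zeros
print(mean(year2))                 # 650
print(mean(year2) - mean(year1))   # a 65-point "gain" from a data artifact
```

The same arithmetic works in reverse for the alternative-assessment issue Nashville raises: mixing in scores from a test with a lower scale distorts one year's mean and creates phantom movement in the other.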

To be fair, the AJC did at least say in the story "ideally we would look at how individual student test scores change from year to year, but Federal privacy regulations precluded access to that data." Which raises the question: Then why do the investigation at all if you know this will smear, but prove nothing?

Unless you're still living in the reflected glory of your Atlanta cheating scandal story, and you want to squeeze more out of your journalistic 15 minutes? I dunno. I just think most of the time was spent fighting to get the data, and then by the time everyone realized these concerns could have validity, the pressure to publish was too great. I could be wrong. That's a conversation I'd like to have with the AJC.

Because I'm not saying the AJC report is total bullshit. I'm just saying they've got some irregularities to explain.

© 2014 Creative Loafing Atlanta