This is a continuation of my last post which you can read here.
In my last post I argued that there is almost no correlation, from one year to the next, in the progress ranks that New York City uses to calculate the report card grades, which are then used to shut down schools.
One commenter noted that when you make a graph of the progress scores from 2010 to 2011 rather than the ranks, there does appear to be a correlation. I’d like to address that here and add some more analysis of the data in this context.
The commenter says that the city uses the scores and not the ranks to decide which schools to close so it is not appropriate to look at the change in ranks. I think that the unstable ranks actually are the more relevant stat since the number of ‘F’ schools is based not on an absolute scale of what progress should be, but on the preordained decision that the bottom 5% of the schools are going to get ‘F’s.
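To make that point concrete: with a preordained bottom-5% cutoff, a fixed number of schools must get 'F's no matter how much every school improves. Here is a minimal sketch of that logic (the function and variable names are mine, not the city's actual formula):

```python
import math

def assign_f_grades(scores, bottom_fraction=0.05):
    """Label the bottom `bottom_fraction` of schools 'F', whatever the scores are."""
    cutoff_count = math.ceil(len(scores) * bottom_fraction)
    # Indices of schools ordered from lowest score to highest.
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    f_set = set(ranked[:cutoff_count])
    return ["F" if i in f_set else "pass" for i in range(len(scores))]

# Even if every school's score doubles, the same number of schools get an F.
scores = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
doubled = [2 * s for s in scores]
assert assign_f_grades(scores).count("F") == assign_f_grades(doubled).count("F")
```

The cutoff is relative, so absolute improvement across the whole system can never shrink the number of 'F' schools.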
Still, the commenter makes a valid point, though not one that I think weakens my argument; in fact, it enables me to find some more weaknesses in the reformers' plan.
I did make a graph comparing the 2010 progress scores to the 2011 scores (the score is out of 60 points). As you will see in the graph below, these correlate a lot more than the ranks do. It still looks a lot like a blob of random points, but one that resembles an upward-sloping line. If you like this graph better, it shows that there is some stability in the metric from one year to the next. I still don't think the metric actually measures anything important, so it doesn't matter to me that it might be somewhat stable.
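For anyone who wants to repeat this comparison, here is a sketch of how to compute the correlation on the raw scores and on the ranks. The toy numbers are hypothetical placeholders; substitute the real 2010 and 2011 progress-score columns from the city's spreadsheet.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

def to_ranks(values):
    """Rank each value (1 = lowest), the way the city ranks progress scores."""
    return np.argsort(np.argsort(values)) + 1

# Hypothetical toy scores for illustration only.
scores_2010 = [12.3, 45.0, 30.1, 22.7, 51.2]
scores_2011 = [15.0, 40.2, 33.3, 20.1, 48.8]

r_scores = pearson(scores_2010, scores_2011)
r_ranks = pearson(to_ranks(scores_2010), to_ranks(scores_2011))
```

Comparing `r_scores` to `r_ranks` on the real data would quantify how much of the apparent stability comes from the raw scores versus the rank ordering.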
Looking at this got me thinking about what sorts of conclusions I could draw from the progress report data if I suspend disbelief and pretend that I believe the reports are reliable. Reformers might criticize me here, saying that if I don't believe in these metrics, I'm a hypocrite to use them to prove other points. I don't see it that way. I see it like the way Clarence Darrow used The Bible in his famous cross-examination in the Scopes trial. It is great to show how, even under their own concocted metrics, the reformers still aren't able to cover up how little progress they are making.
So under the assumption that this progress metric is good, I looked through the database and found that out of 1108 schools, 67 were charters and 1041 were non-charters. In the student progress category, which accounts for 60% of the report card score, there were exactly 86 F's. Looking closer, I saw that 9 of those F's went to charters and 77 went to non-charters. This means that 9 out of 67 charters got F's in this category, or about 13%, while 77 out of 1041 non-charters got F's, or about 7%.
So my first conclusion, using this metric as The Bible, is that if you go to a charter school in New York City you are nearly twice as likely to attend a school with an F in progress than if you go to a non-charter school.
If we include Ds also, there were 175 schools that got either Ds or Fs. 15 out of the 67 charters got Ds or Fs, which is 22% while 160 out of 1041 non-charters got Ds or Fs, which is 15%.
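The percentages above can be checked with a few lines. The counts come straight from the post's tallies of the city's database; the variable names are mine:

```python
# Counts tallied from the city's progress-report database.
charters, non_charters = 67, 1041

charter_f, non_charter_f = 9, 77      # F's in student progress
charter_df, non_charter_df = 15, 160  # D's or F's in student progress

def pct(part, whole):
    """Percentage, rounded to the nearest whole point."""
    return round(100 * part / whole)

print(pct(charter_f, charters), pct(non_charter_f, non_charters))    # 13 7
print(pct(charter_df, charters), pct(non_charter_df, non_charters))  # 22 15
```

The totals also reconcile with the post: 9 + 77 = 86 F's and 15 + 160 = 175 D's or F's.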
So, even by the reformers' own metric, a student would have a better chance of making progress at a non-charter school than at a charter school.
The next thing I studied was inspired by something I noticed in the progress report for one charter. I saw that its math progress score was significantly higher than its ELA progress score. I'm a math teacher and I love math, but I think its importance is definitely over-emphasized in this standardized-testing age. Reading is a much more important skill to develop. But if a school wants to maximize its test scores, it can focus on math, which is easier to test-prep for.
So what I did was sort the list by the difference between the math and ELA scores. Out of 1108 schools, there were only 126 whose math progress score was ten or more points higher than their ELA progress score. Of those 126 schools, 28 were charters and 98 were non-charters. So 28 out of 67, or 42%, of the charters had significantly higher math scores, while only 98 out of 1041, or 9%, of the non-charter schools did. On the other end of the spectrum, only 2 of the 67 charters had significantly higher ELA than math scores. This is evidence of the type of intensive (and often mindless) math test prep that happens in some schools. You can see these charter outliers on the bottom right of the blob.
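That sort is easy to reproduce. Here is a sketch, assuming the scores have been loaded into tuples of (school, is_charter, math progress, ELA progress); the records below are hypothetical placeholders, not real schools:

```python
# Hypothetical records: (school, is_charter, math_progress, ela_progress).
schools = [
    ("School A", True, 38.0, 22.0),
    ("School B", False, 30.0, 31.5),
    ("School C", True, 25.0, 24.0),
]

# Schools whose math progress beats ELA progress by ten or more points.
math_heavy = [s for s in schools if s[2] - s[3] >= 10]

# The full list sorted by that gap, largest first.
by_gap = sorted(schools, key=lambda s: s[2] - s[3], reverse=True)

# How many of the math-heavy schools are charters.
charter_math_heavy = sum(1 for s in math_heavy if s[1])
```

Running the same filter on the real spreadsheet should reproduce the 126-school count and the 28-charter share reported above.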
So charters, which represent 6% of the schools but (since charter schools are smaller) only 4% of the students, are disproportionately represented among the F's and D's in progress. They also have an unusually high number of schools whose math progress far exceeds their ELA progress.
Again, I encourage everyone to download the files (see previous post for the link) and see what sorts of things you can find in there.