The way reformers misuse data follows a simple and predictable plan: first they get some skewed data, then they pick a ‘researcher’ to interpret it. The ‘researcher’ writes a report that gets touted in The74, EduPost, and eventually even makes it into more mainstream publications like USA Today and The Wall Street Journal. Since the report is filled with nonsense and half-truths, within a few weeks the truth comes out and the report is discredited, but not before the damage is done and the spin has made it into folklore. When this happens, the reformers ‘move the goalposts,’ get some more skewed data, and start the process over again.

An example of this is the July 2017 report by Richard Whitmire called ‘The Alumni’. Whitmire has written books about both KIPP and Michelle Rhee, so I think you get the idea of what his point of view is. In this poorly researched project he concludes that “Data Show Charter School Students Graduating From College at Three to Five Times National Average.” The national average he is comparing to is the 9% of low-income students who graduate college, according to the Pell Institute.

This was probably the easiest report I ever debunked. The biggest flaw was that most of the charter schools counted only the percent of their graduating seniors who persisted in college, and then compared that percent to the overall percent of all low-income students: an apples-to-oranges comparison. Whitmire acknowledges this in another post about the methodology, in which he says that only KIPP counts students who leave the school before graduating, and that KIPP’s number is much lower, but still 38%, which is at least triple the expected graduation rate for low-income students.

A second flaw, and this one is very difficult to compensate for, is that charter school students are not a random sample of all students, since many families choose not to apply. So you get a biased sample even if you count all the students who get into the charter school and not just the ones who make it to graduation. And even though I and others have discredited his report, it still gets quoted in the mainstream media.

Just recently, however, I learned of a report generated by Mathematica and funded by the John Arnold Foundation. Mathematica is a very reputable company, and even though reformers often hire them to produce reports, sometimes those reports reach conclusions the reformers were not expecting.

In this case, the report, called “Long-Term Impacts of KIPP Middle Schools on College Enrollment and Early College Persistence,” reached a result that completely contradicts Whitmire’s claim that “Charter School Students Graduating From College at Three to Five Times National Average”.

What Mathematica did was follow 1,000 students who applied to KIPP schools that had to use a lottery because they had more applicants than open slots. 500 of them were offered spots at KIPP and 500 of them were not. Comparing the fates of lottery winners to lottery losers is more accurate than comparing KIPP students to students who never even applied, since the KIPP students are not a random sample. Among other outcomes, the study compared the college persistence rate (the percent of each group that completed four semesters of college).

Here’s the relevant summary of what they found:

This needs a lot of unpacking. 30.4% of the group who were offered a spot at KIPP completed four semesters of college, compared to 25.6% of the group who were not offered a spot. Remember that Whitmire claimed a 3x to 5x ratio for charter chains. Even if you just take these two numbers without any of the other context they provide, 30.4 is only 1.1875 times as big as 25.6.
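As a sanity check on the arithmetic, here is a quick sketch; the two percentages are the ones quoted above, and nothing else is assumed:

```python
# Four-semester college persistence rates quoted from the Mathematica report.
offered = 30.4      # % of students offered a KIPP spot
not_offered = 25.6  # % of students not offered a spot

ratio = offered / not_offered  # how many "times as big"?
diff = offered - not_offered   # gap in percentage points

print(f"ratio: {ratio:.4f}")   # → 1.1875, nowhere near 3x to 5x
print(f"difference: {diff:.1f} percentage points")
```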

The big headline here should be that in this study, in which all of the students involved entered the KIPP lottery (and only half of them actually attended a KIPP school), about 28% of them completed four semesters of college. Where is Whitmire’s post about how this shows that simply applying to KIPP (whether you get in or not) triples your chances of persisting in college? Really this shows what everyone already knows, but what anyone reporting on ‘The Alumni’ ignores: students who apply to charter lotteries are not a random sample, so any comparison of college enrollment or college persistence between charter schools and all schools is going to be flawed.

Now since 30.4 is greater than 25.6, it might seem at first that winning a spot in a KIPP lottery causes a slightly higher chance of completing four semesters of college. But the report also says that this difference “is not significantly different from zero (p-value 0.135).” What does that mean?

Most people don’t understand the difference between “large” and “statistically significant.” Here’s an analogy that demonstrates it. Suppose there were 100 students altogether, and 30 of them completed four semesters of college while 70 did not. Now split those 100 people into two groups by some arbitrary rule, say, one group is people with an even social security number and the other is people with an odd social security number, and suppose it turns out there are 50 people in each group. Since there is nothing about having an even or odd social security number that would cause one group to do better than the other, you would expect each group to have about 15 students who completed four semesters of college, or 30% for each group.

But would you be shocked if it turned out that the even group had 16 such students (32%) and the odd group had 14 (28%)? Would you feel confident saying that having an even social security number somehow causes people to be more successful in college? Of course not. It is still very close to the expected 15/15 split. That, in a nutshell, is what is meant by “not significantly different from zero (p-value 0.135).” The larger the p-value, the less significant the difference, with a conventionally ‘significant’ p-value being very small, less than 0.05. So this report cannot say with appropriate confidence that winning the KIPP lottery is associated with any increase in college persistence. This is surely not what The Arnold Foundation would have hoped for, and I seriously doubt that The74 or EduPost will write about it.
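To make the social-security-number analogy concrete, here is a small sketch. The first part computes exactly how often a purely random split of 100 people (30 completers, two groups of 50) comes out at least as uneven as 16 vs. 14. The second part runs a naive two-proportion z-test on the KIPP figures; the group sizes of 500 are my assumption for illustration, so its p-value will not exactly match the report’s 0.135, which comes from Mathematica’s own regression-adjusted model:

```python
from math import comb, erfc, sqrt

# Part 1: the even/odd social-security-number analogy.
# 100 people, 30 of whom completed four semesters of college, split at
# random into two groups of 50. The count of completers landing in the
# "even" group follows a hypergeometric distribution centered at 15.
N, K, n = 100, 30, 50

def p_exact(k):
    # P(exactly k of the 30 completers end up in the even-SSN group)
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Probability the split is at least as uneven as 16 vs 14, i.e. anything
# other than a perfect 15/15 split:
p_lopsided = 1 - p_exact(15)
print(f"P(split at least as uneven as 16/14) = {p_lopsided:.2f}")  # ~0.83

# Part 2: naive two-proportion z-test on the reported KIPP rates
# (group sizes of 500 per arm are an assumption for illustration).
p1, p2, n1, n2 = 0.304, 0.256, 500, 500
pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = erfc(z / sqrt(2))  # two-sided p-value from the normal tail
print(f"z = {z:.2f}, two-sided p = {p_value:.2f}")  # p well above 0.05
```

The first computation shows that a split at least as lopsided as 16/14 arises from pure chance more than 80% of the time; the second shows that even an unadjusted test of 30.4% vs. 25.6% leaves the difference short of conventional significance.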

The report includes three other comparisons, and those look better for KIPP, so I want to explain them too. Some of the students who won the lottery did not, for whatever reason, go to KIPP. Maybe they didn’t want to go, maybe KIPP discouraged them from accepting the offer, who knows? So in addition to the comparison I just described, there is a second comparison between students in the study who did not go to KIPP (lottery losers plus lottery winners who did not attend for whatever reason) and students who got an offer and did end up going to KIPP. For that comparison, the difference in the percent of students completing four semesters of college was 9 percentage points (24% vs. 33%). The authors of the report still did not consider this a ‘significant’ difference, and I would consider the statistic biased anyway: if you remove the lottery winners who did not go to KIPP, you would also need to remove the lottery losers who would not have gone to KIPP had they won. Since it is not possible to know which of the lottery losers would not have gone, I consider this comparison flawed.

There were two other comparisons where, instead of college persistence, they compared just college enrollment. Here there was a 7-percentage-point difference, which they did consider ‘significant,’ for lottery winners vs. lottery losers, and a 13-point difference for the version where only the students who won the lottery and actually enrolled in KIPP were in the treatment group.

Maybe there is a benefit to going to college for a few semesters and then quitting before completing the fourth. But since charter schools talk so much about getting students “to and through college,” getting into college and leaving so soon seems like something they should not be celebrating.

The big takeaway from this recent report, though, is that it is an excellent counter to the 3x to 5x propaganda claim spread by the reform blog sites. When a more valid comparison is done, the difference is “not significantly different from zero.”

“500 of them were offered spots at KIPP and 500 of them were not.” So they didn’t examine how many of the 500 students who were not offered spots at KIPP instead attended other charter schools?

About 16% of those not winning a lottery position went on to a KIPP school, while only about 68% of those winning a position ever attended a KIPP school. The treatment-on-the-treated analysis takes care of the two groups being intermingled. There isn’t anything else about where the non-winners went, but in 2008–09 I don’t believe there were many alternative charters available.

The interesting thing for me was that 29.5% of the treatment group and 23.8% of the control group had “Mother has at least a college education” (page 6), as distinct from “Mother has some college education.” Even though the former statistic is likely a mistake (19.5%, I would presume, rather than 29.5%), it’s still incredibly high and hugely influential on outcomes for the child. These kids aren’t like a random selection of other kids.

Excellent point, mjpledger, about the mothers’ education. BTW, it does seem that they intended 29.5%, since the four apparently mutually exclusive categories then add to 100%.

Are they mutually exclusive? “Mother has at least a college education” seems like a subgroup of “Mother has some college education,” which is why the former percentage should be smaller than the latter. Or maybe you’re right and the categories are just terribly specified in the table.


Note that KIPP utilizes a number of ways to impose hurdles in the admission process that keep out less able students from unsupportive families. Some years ago, when a happy KIPP parent in my community announced that his daughter had “tested into” KIPP San Francisco Bay Academy, I applied to that KIPP school for my own then-7th-grader to see if they would give her a test, and they did indeed contact us to schedule the test. (We didn’t follow through, for the record.) KIPP also uses pre-enrollment “counseling” that of course screens out less-able students and less-high-functioning families — and basically whatever other methods individual KIPP schools choose.

Also, Mathematica is NOT so pure; in fact, “tainted” wouldn’t be overstating it. Some years ago KIPP hired it to study its attrition, and Mathematica produced a report that conflated “attrition” with “mobility” and was used to deliberately obfuscate KIPP schools’ staggering attrition. That was not righteous, and it destroyed Mathematica’s image of respectability and honesty.

