The Alum-lie Spreads

Two weeks ago, The 74 published the results of The Alumni, which claims to show that the graduates of certain charter school networks go on to graduate from college at 3 to 5 times the rate of low-income students on average.

They say that only 9% of low-income students graduate college within six years, while among the nine charter networks they studied, graduates had college completion rates ranging from 25% (approximately 3 times 9%) to 50% (approximately 5 times 9%).

The problem with this calculation is that the charter schools are only counting students who completed 12th grade at that school (or, for KIPP, 8th grade).  So if a school only has 14 graduates and 7 of them graduate from college after six years, it is accurate that 50% of their graduates went on to complete college, but if that cohort of 14 students was 40 students three years earlier, then their rate is really 17.5%.  In other words, by counting only the ‘graduates’ they get an inflated college completion rate.
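
Just to make the two denominators concrete, here is a minimal Python sketch using the hypothetical numbers above:

    # Hypothetical numbers from the example above
    college_completers = 7   # alumni who finish college within six years
    hs_graduates = 14        # students who completed 12th grade at the school
    original_cohort = 40     # the same class three years earlier, before attrition

    # The rate the networks report: completers divided by 12th-grade graduates
    print(f"Graduate-based rate: {college_completers / hs_graduates:.1%}")    # 50.0%

    # The rate that counts everyone who started out in the cohort
    print(f"Cohort-based rate: {college_completers / original_cohort:.1%}")   # 17.5%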

In the original The Alumni article the author, Richard Whitmire, admitted as much.  He even included a comment from the KIPP network about how the other schools should use 8th grade as the cutoff so they don’t get unfairly inflated percentages.  I argued in my first post about this that 5th grade would be an even more accurate cutoff.

In a follow-up article on The 74 called The Data Behind The Alumni, the case is made even stronger:

The one network that insists on including students who leave the system is KIPP, which reports its college success data starting in ninth grade for students new to the KIPP system and at the end of eighth grade for existing KIPP students. YES Prep, part of the United for College Success Coalition in Texas, has promised to start calculating its college success data from ninth grade, but no figures are yet available.

All the other networks start their data set in 12th grade — and say they don’t have data that begins in ninth grade. KIPP takes a principled stand on that issue, refusing to release any results that start the tracking in 12th grade, despite the fact that it would boost its college success rate.

Within the charter community, this is turning into a hot-button issue. KIPP feels very strongly that the only honest method for reporting graduation is to start in ninth grade. In theory, a charter network could increase its college success numbers by pushing or counseling out weak students before their senior year. That would apply to any high school, not just charter high schools.

You’ve got to love the part about how the other networks “say they don’t have data that begins in ninth grade.”

Also on The 74, and again to their credit, they published something by the chief executive officer of YES Prep.  He brags first that the class of 2010 had a 54% college completion rate, but then, a few paragraphs later, admits the following about that same cohort:

When I was principal of YES Prep North Central in 2010, only 34 percent of our founding sixth-grade class went on to graduate from our campus. This unacceptably low persistence rate, a symptom of a “no-excuses” culture, needed to be addressed to align with our mission of increasing the number of students from underserved communities who graduate prepared to lead.

Suddenly the 54% turns into 54% of 34%, which is just 18% college completion.
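
The same adjustment is a one-line calculation; here is a quick sketch using the figures from the quote above:

    reported_rate = 0.54     # college completion rate reported for the class of 2010
    persistence_rate = 0.34  # share of the founding 6th-grade class that graduated from the campus

    # Completion rate measured against the original 6th-grade cohort
    print(f"Cohort-based completion rate: {reported_rate * persistence_rate:.0%}")   # 18%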

He says that from now on they are going to use the 8th grade cutoff like KIPP, though of course he should be pushing it back to 6th grade based on what he just said about their huge attrition numbers.

Another interesting point in that follow-up article is that a fairer comparison would be not to the 9% number but to the percentage of low-income high school graduates who go on to graduate college within six years.  To their credit, The 74 does say that this figure is actually 15%, to which they then add:

Even if as many as 15 percent of low-income minority students who make it through high school earn college degrees, that means these top charter networks are still doing three and a half times better.

So suddenly it goes from 3 to 5 times better to roughly 1.7 to 3.3 times better.  This is important since surely the 3-to-5 number is what will be remembered by people who read just the first article or one of the editorials Whitmire has published in The New York Daily News, The Wall Street Journal, and, most recently, The Hill.  There was also a report about The Alumni by another writer in The Houston Chronicle.
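
For what it’s worth, here is a small sketch of where those smaller multipliers come from, dividing the networks’ reported range by the 15% baseline:

    # Reported completion rates across the nine networks, compared with the
    # 15% baseline for low-income high school graduates
    low_end, high_end = 0.25, 0.50
    baseline = 0.15

    print(f"{low_end / baseline:.1f}x to {high_end / baseline:.1f}x")   # roughly 1.7x to 3.3x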

So after reading Whitmire’s piece in The Hill, which makes no mention at all of the issues with the calculations, I reached out to him on Twitter and we had this amusing exchange.

[Screenshots of the Twitter exchange, August 8, 2017]

And that was it.  I have no idea what his last tweet means.  I think Trump might have written it.  Why would I back off if I found flawed data?  I just don’t get it.  This is why most reformers don’t engage with me: in any kind of debate they have with me on equal terms, I absolutely embarrass them.


21 Responses to The Alum-lie Spreads

  1. John says:

    This is another strong post. Thanks for sharing this. With respect to the final tweet, I can’t be sure but I think that he is trying to counter your point by saying: “Anyone who is worried about misleading numbers being reported in these situations should relax. If you find out that a charter network you are working with has flawed data, then don’t promote/work with them. No big deal!”

    If this is what he means, then it is a weak counter. Parallel lines of logic would be these:
    “Fake news that misleads readers is not a big deal. Readers have access to other news sources and should just not believe a fake news story if they find a credible news source that contradicts it.”
    OR
“There is no harm in the president writing tweets that are patently false. Given that there are fact checkers out there, just look for a fact-checking site after you read a tweet that seems to be inaccurate.”

  2. Stephen B Ronan says:

    To the extent I was able to grasp it (been a long time since I took the Regents), I was enormously impressed by your series on those exams. I truly hope that the powers that be take you up on your offers of assistance.

Here, though, it seems to me that rather than offering helpful clarification, you’re muddling together different techniques for making these calculations… and encouraging use of a method which is not in regular use by any reputable bureaucratic or scholarly authority that I’m aware of. And for good reason.

    John, Joe, Mary, and Kate are 9th graders together at a H.S.
    John and Joe graduate, Joe completes college but not John
    Mary and Kate, during their junior year move to another town where Mary graduates HS and completes college, Kate drops out of college.

By KIPP’s method, which keeps tracking the students’ outcomes wherever they go, college completion is 50%.

    Using your prescribed method for other schools in that circumstance that don’t track students who transfer out, we’d get 25%, i.e., don’t count Mary.

    What’s particularly odd is that you then seem to want to use results derived from that latter method to compare to an entirely different method. I’m asking you once again, what do you think the 9% (or the 15%) rate for low-income students would be if you ignored the college completion of any who had transferred out of one school and into another after your preferred cutoff year (5th grade? 8th grade?)?

    By the way, here’s a little more info that you may find of interest concerning some activities of the Academy of the Pacific Rim, where The 74 article had cited a 70% college completion rate for graduates (I alluded to that Boston school in my comment on your previous article on this subject):
    http://www.bulletinnewspapers.com/23469/270622/a/apr-gets-wild-card-invite-for-national-civics-competition.

  3. mjpledger says:

KIPP shouldn’t count all of Mary’s success. They only taught her between 1/2 and 3/4 of her years in high school, so they should only claim that share on their graduation rates, e.g. their rate would be somewhere between 37.5% and 43.75% depending on when she left.

Also, kids who are expelled, asked to leave, etc. shouldn’t be counted in the numerator but should be counted in the denominator – schools shouldn’t be able to count the success of kids they gave up on.

    • garyrubinstein says:

      I agree. For sure they can’t just shed kids and get rewarded for it otherwise they will have the incentive to do just that.

    • Stephen B Ronan says:

      If I understand your notion properly, they would be responsible for 100% of the inability to complete college of anyone who transferred out to another school system, but only for a fraction of the success of those who did complete college? That seems unreasonable. Perhaps you could elaborate if I am misunderstanding your suggestion.

      • garyrubinstein says:

        Well, keeping in mind that the number they publish is inflated in so many different ways, I’m comfortable with my ‘lower’ bound where I don’t give them credit for the graduation of the kids that they kicked out. I would be interested in tracking the kids who did not graduate from the charter school. If they had a very low college graduation rate, it would show how inflated the alumni numbers are. Then again, even if they had a high college graduation rate, it could be because the charter schools have the more motivated students and it could prove, again, that it isn’t fair to compare the alumni numbers to the 9 (or 15) percent.

        I agree that this is a tricky calculation, but these are the sorts of questions that journalists don’t think to ask when they hear things like “this school had 100% of their graduating seniors admitted to college” and most of the public thinks that it means 100% of the cohort when it is actually just 50%. I’m just trying to point things like that out and why I believe the alumni numbers are inflated. Without more data it is true that it is speculation, but I try my best.

      • Stephen B Ronan says:

        If an appropriate “lower bound” is to assume that nobody who left the school succeeded in some endeavor, wouldn’t the equivalent “upper bound” be to assume that each and every outgoing transfer did succeed?

I don’t see evidence of anyone attempting an “upper bound” analysis like that.

Instead, for those attempting an analysis with a starting point prior to HS graduation, some don’t include in the numerator or denominator anyone who transfers out (standard Ed bureaucracy methodology), while others track the student after departure and include their results, favorable or unfavorable, with the originating school. Neither method is perfect; no method is perfect for all purposes. But certainly virtually all scholars and bureaucrats consider either of those methods preferable to the “lower bound” method you offer as an alternative.

        I certainly appreciate your attempts to shed needed light on statements like: “had 100% of their graduating seniors admitted to college.” Keep it up (using, I hope, increasingly valid methodology)!

        Naturally, I thought immediately of you when, a week or two ago, I heard Randi Weingarten say this on our local NPR station:
        “But Margaret, you’re talking to the head of the AFT. I run a charter school in New York that had a 100% graduation rate.”
        http://news.wgbh.org/2017/07/28/bpr-0728-full-show-post (42:00)

      • Stephen B Ronan says:

        By the way, I think you’d find of interest the analysis on pp 32-33 of this document comparing those who stay and those who leave the Brooke charter schools in Boston.
        http://www.doe.mass.edu/bese/docs/FY2016/2016-02/item3-tabA1-1.pdf
        It’s a shame that an independent authority doesn’t ensure that such data is carefully gathered and made publicly available for each school.
        As you likely know, Whitmire hailed Brooke as America’s best in the charter sector (presumably supplementing his visit there with a look at CREDO’s 2017 analysis of CMOs).
        https://www.the74million.org/article/whitmire-americas-best-charter-school-doesnt-look-anything-like-top-charters-is-that-bad

      • Steve M says:

        The official college completion rate for public high schools includes all students who begin in a school’s 9th grade but who do not officially transfer to other institutions (i.e., receive transcript requests from other institutions and are accounted for). For various reasons (kids move away and their new schools don’t bother requesting transcripts, undocumented kids leave the country, students drop out, and so forth…), the persistence rates/official graduation rates for inner city 9th graders across the nation typically run about 50-60%.

        The persistence rates for charter schools are much, much higher, for reasons that you well know and understand (charters cherry-pick kids, students driven out of charters are always accounted for, and so forth…), but pretend not to.

        You are being damned disingenuous.

      • Stephen B Ronan says:

Here in Boston, there’s quite a bit of selectivity among major district schools, e.g. tests (Boston Latin, Boston Latin Academy, John O’Bryant), auditions (Boston Arts Academy), essay (Fenway High), interviews (Boston Community Leadership Academy). By contrast, for charter schools you can basically fill out name, address, phone, email, and grade level, submit once, and thereby complete the application for virtually all of the charter schools in the city that serve the appropriate grade.

        That said, I rarely put much if any stock in attempts to contrast overall district and charter school results, tend to rely heavily instead on results achieved by comparing charter lottery winner/loser info or well-matched virtual pairs.

        As you should see in this thread, I’ve mainly been pointing out flaws in Rubinstein’s approach rather than offering an alternative that I think ideal for all purposes.

By the way, if you like the idea of including in the denominator those who transfer out and graduate, while removing them from the numerator, for charter schools, you’ll presumably adore the Mass Teachers Association method that, to favor district schools, adds incoming transfers who graduate to the numerator but doesn’t add them at all to the cohort/denominator.

        Combining those techniques together, and ignoring front-loaded attrition, led to their knowingly trumpeting wildly false material in their campaign literature last fall like this:

        “A study of charter high schools in Boston showed that only 40 percent of those enrolled as freshmen made it to graduation, compared to 80 percent of those enrolled in the Boston Public Schools.” (they left unsaid that the bizarrely defective “study” was of their own creation in 2009).

        FWIW, I have a message stuck in moderation here that highlights one celebrated Boston charter school system’s comparison of students who stay with those who transfer out. Will be interested in your thoughts on that if it ever escapes moderation.

  4. Jack Covey says:

    Excuse my vulgarity, but Richard Whitmire wouldn’t know The Truth if it went down on him.

In the past, Whitmire has shown himself to have the credibility of O.J. Simpson, or of Melvin Dummar (the Howard Hughes will forger and scam artist, for those unfamiliar).

    For example, Whitmire is a man who actually believes that Michelle Rhee, during her 2-year TFA stint in Baltimore, took her students from the 13th to the 90th percentile in those students’ Reading and Math scores. Such an accomplishment would be the pedagogical equivalent of the moon landing, or splitting the atom.

    Here’s an excerpt from Whitmire’s book

    “THE BEE EATER: Michelle Rhee Takes on the Nation’s Worst School District”:

    x x x x x x x x x x x x x x x x
    RICHARD WHITMIRE:

    “According to Rhee, all this (hard work of Rhee & her students) paid off in startling test score gains. According to the resume turned over to the Washington, D.C.’s City Council in her 2007 confirmation hearing, 90% of her students were scoring at the 90th percentile on national reading and math tests. Only two years earlier, they were scoring on average at the 13th percentile.”
    x x x x x x x x x x x x x x x x

Sweet Jesus! Every teacher to whom I’ve ever repeated this claim burst out laughing. “She really said that?” “Seriously?” “You can’t be serious!” are typical responses.

The Whitmires and Rhees of the corporate ed reform world are so ignorant of the realities of the classroom, and so reckless in their desire to promote themselves and their nonsensical ideas, that they continuously vomit up one ludicrous assertion after another while not even comprehending the sheer idiocy of what they are claiming.

    When asked for documentation of her 13th-to-90th-percentile miracle, Rhee said, “You want proof? Well, I don’t have it.” She claimed simultaneously that since no one could access data DISPROVING the claim, everyone would and should just have to take her word for it. Alas, they did.

    However, unfortunately for both Rhee and Whitmire, math teacher Guy Brandenburg eventually obtained the actual data definitively proving that all of this was false:

    https://gfbrandenburg.wordpress.com/2011/01/31/the-rhee-miracle-examined-again-by-cohort/

Faced with Brandenburg’s research, even Rhee supporter Jay Mathews at THE WASHINGTON POST had to concede that the claim was bogus — again, a falsehood later repeated by Whitmire in his book:

    http://voices.washingtonpost.com/class-struggle/2011/02/michelle_rhees_early_test_scor.html

    • Stephen B Ronan says:

      I’m presuming he relied largely on the 2017 CREDO analysis of CMOs when lauding Brooke’s students’ academic advances in comparison to other charter networks, in conjunction with what he saw on a visit there. The portrayal of what he saw and heard on his visit seems particularly plausible to me because it is consistent with what others, from Boston Globe reporters to Jennifer (“Edushyster”) Berkshire have observed on their visits. And I spend a little time with kids who are students there.

      I think you’d find of interest the discussion in the comments section if you search for:
      edushyster, brooke, “interesting and amazing things”
      The quoted part is from charter skeptic Berkshire/Edushyster after a visit.

      If you don’t have time for all of it, I’d especially highlight the first comment by mathteacher as worth a read.

  5. Steve M says:

    Gary, you need to fix this:

    “So if a school only has 14 graduates and 7 of them graduate after six years, it is accurate that 50% of their graduates went on to complete college, but if that cohort of 14 students was 40 students three years earlier, then their rate is really 25%.”

    7 out of 40 is 17.5%, not 25%.

  6. Pingback: Gary Rubinstein: The Alum-Lie Spreads About College Completion Rates for Charter Students | Diane Ravitch's blog

  7. carolinesf says:

    In earlier times, KIPP used to lie outrageously about this. Maybe the lying got so obvious that it started to backfire and they have since decided to change that tactic.

Candidly, there is a question to be asked here. If a school weeds out lower-functioning/challenged students, does that mean some at-risk students have a better chance of success, surrounded only by other motivated, compliant students? It seems like a valid question, though it’s hard to know what the ideal route for creating that climate would be.

    But the question isn’t asked because the charter sector lies and lies and lies about it, and the lies are overall not challenged in the mainstream world.

    • Stephen B Ronan says:

      That is a good question, one that is often asked. It appears that in respect to High Schools, at least, the reality may be the opposite of what you’re guessing. Instead of charter schools getting a higher performing bunch of students through a “weeds out” technique, it may instead be the case that charter schools tend to more successfully retain their most struggling students while such students are more likely to drop out of traditional public schools.

      Angrist et al examined this question in “Stand and Deliver: Effects of Boston’s Charter High Schools on College Preparation, Entry, and Choice” and concluded:

      “Charter schools are sometimes said to generate gains by the selective retention of higher-performing students — see, e.g., Skinner 2009. In this view, charter effectiveness is at least partly attributed to a tendency to eject trouble-makers and stragglers, leaving a student population that is easier to teach.”
      […]
      “These results suggest that positive charter effects cannot be attributed to low-quality peers leaving charter schools. If anything, selective exit of low achievers is more pronounced at Boston’s traditional public schools.”
      http://economics.mit.edu/files/9799

In respect to middle schools, it’s not much more than anecdotal evidence and could well be atypical, but it may be worth looking at the material I referenced above about Brooke charter schools, particularly since they’re so successful. It surely appears that their huge successes are not predicated on “weeding”.

    • Steve M says:

      Inner city school teachers will tell you [the phenomenon occurs everywhere, it’s just much more pronounced in the inner city schools] that when you hit a critical mass of problematic kids in a class (somewhere between 10-20%), the class becomes too unruly and suffers tremendously. At that point the battle of trying to have a highly pronounced effect on the students in the class is essentially lost. If the number of problematic students is kept below 10% an effective teacher can produce good results.

      Conversely, when “marginal” students (C students who are well behaved) are placed into academically challenging situations with higher achieving peers (and make up to perhaps 40-50% of the class population) they can grow tremendously. That can be a very painful process, as it requires a great deal of pushing, prodding, cajoling, separation of the marginal students from their less academically achieving friends, tutoring, support and so forth. It can definitely be done, and this is what the most successful charters are accomplishing…but it absolutely requires the separation of the marginal students from their lower achieving friends.

      There’s nothing magical about it, and it is well documented: overwhelmingly, the number one factor in student achievement is the quality of the friends/peers that the student surrounds him/herself with. Charters affect this predominant factor through a process of artificial selection, separation and culling.

      • Stephen B Ronan says:

Your argument makes a lot of intuitive sense. But the research I’ve seen provides it far less support than I might have anticipated.

        I’ve highlighted Brooke charter schools in Boston. They admit students as kindergarteners and also do some back-filling, all via an open lottery, plus sibling preference. Culling? They’re apparently thoroughly inept at that. You saw the data I referenced before?

        Brooke students develop academic skills to a degree enormously greater than peers/”twins” matched on a whole bunch of different measures who are educated in the local public district schools.

        By contrast, there are prestigious exam schools like Boston Latin, and Stuyvesant High school. Selectivity to the max!

        In “The Elite Illusion: Achievement Effects at Boston and New York Exam Schools” by Abdulkadiroglu, et al., we see: “Our strategy in a nutshell is to compare the scores of exam school applicants who barely clear the admissions cutoff to the scores of those who fall just below.” “Our estimates show little effect of exam school offers on most students’ achievement.”
        https://economics.mit.edu/files/6931

        BTW, I’m curious at your opinion… Imagine you have two groups of middle school kids attending school together in the same traditional public school. And these two particular groups have virtually identical academic test scores…. But you identify one of the groups as having grown up with parents substantially more motivated to be engaged in their children’s education and to promote academic success. Which of the two groups would you consider more likely to be academically successful in High School? And college? And why?

      • Steve M says:

        You’ve already explained how the Boston charter situation is entirely different from that of Chicago, Los Angeles and most other regions, so I’m not surprised to see research that looks into Boston’s charters be at odds with what anti-charter commentators state (and is the norm nearly everywhere else).

        I am not particularly surprised by this: “Our strategy in a nutshell is to compare the scores of exam school applicants who barely clear the admissions cutoff to the scores of those who fall just below … Our estimates show little effect of exam school offers on most students’ achievement.”

        Some of the top kids from low performing, inner-city schools move into elite high schools, while comparable peers stay in their local school. The kids going into Brooklyn Tech and Stuyvesant under such circumstances are literally the weakest there, and struggle to adapt. And, although they do prosper and grow, they often feel alienated and stigmatized, as opposed to the kids that stayed in their neighborhood. The ones that stayed end up being the highest achieving there, and then are given the benefits of affirmative action that accrue from being at the top of an inner-city school. The two groups obtain similar outcomes…they end up going to selective universities and being fairly successful.

        The phenomenon that I was speaking of described what happens when you remove average-to-slightly-above-average kids from an academically impoverished situation and place them with higher achieving peers.

        As to this: “BTW, I’m curious at your opinion… Imagine you have two groups of middle school kids attending school together in the same traditional public school. And these two particular groups have virtually identical academic test scores…. But you identify one of the groups as having grown up with parents substantially more motivated to be engaged in their children’s education and to promote academic success. Which of the two groups would you consider more likely to be academically successful in High School? And college? And why?”

        For the most part, you don’t get what you describe. Children of driven parents begin to show increased achievement early on. Your situation is asking what happens if intrinsically bright kids hailing from bad environments will achieve at higher levels than more average kids hailing from better home environments. You’re conflating two distinct groups.

      • Stephen B Ronan says:

        Steve M: “You’ve already explained how the Boston charter situation is entirely different from that of Chicago, Los Angeles and most other regions,”

        “Entirely different” rather overstates the case. There’s significant overlap… I would expect that the Kipp school here is more similar to the KIPP schools in LA and Chicago than it is to other schools in Boston like Brooke or MATCH. And the Uncommon School here quite likely has considerable resemblance to those in NYC, Newark, Camden, etc.

        But if Boston schools are relatively unlikely to adhere to your model for how charter schools achieve success, and at the same time achieve particularly great success, does that help call into question the validity of your analysis of how charter success is achieved?

        Steve M: “I’m not surprised to see research that looks into Boston’s charters be at odds with what anti-charter commentators state (and is the norm nearly everywhere else)”

        What “anti-charter commentators state” is frequently demonstrably false no matter where they or the schools they’re referring to are located.

        Steve M: “I am not particularly surprised by this: ‘Our strategy in a nutshell is to compare the scores of exam school applicants who barely clear the admissions cutoff to the scores of those who fall just below … Our estimates show little effect of exam school offers on most students’ achievement.'”

        I find your explanation interesting, helpful, plausible. I wonder whether NY’s Comparison Group data should be considered a useful supplement to the research I cited?
        http://schools.nyc.gov/OA/SchoolReports/2015-16/School_Quality_Snapshot_2016_HS_M475.pdf

        “Comparison Group is made up of students from other schools across the city who were the most similar to the students at this school, based on their incoming test scores, disability status, economic need, and over-age status. The ‘comparison group’ result is an estimate of how the students at this school would have performed if they had attended other schools throughout the city.”

That appears to show 84% of Stuyvesant students as having “graduated from high school and enrolled in college or other postsecondary program within 6 months,” and 95% with that result among the comparison group from other schools throughout the city. That stands out as a bit of a sore thumb. I wonder if the comparison there has defects that anyone here is aware of?

Steve M: “For the most part, you don’t get what you describe. Children of driven parents begin to show increased achievement early on. Your situation is asking what happens if intrinsically bright kids hailing from bad environments will achieve at higher levels than more average kids hailing from better home environments. You’re conflating two distinct groups.”

        I’m not conflating two groups, and don’t understand your response at all. Perhaps if I rephrase the question, you’ll understand better what I’m getting at.

        Again, I am asking you to consider two groups of middle school students where the test scores of each group are the same, but one group has parent(s) who are particularly highly motivated to facilitate the students’ academic success and the other doesn’t.

        I can’t help but imagine that the kids with the less motivated parents may have more academic aptitude and/or self-motivation if they are obtaining test scores that are just as good as the comparison group.

        If you operated a school, which group would you hope to attend your school? Which group is likely to achieve greater academic progress in High School? The kids with greater academic aptitude and self-starter capacity, or the kids with the more motivated parents? I’m not sure there’s a clear answer.

        In case it helps you understand the question I am posing… anti-charter commentators look at, for example, CREDO results for Boston and are commonly inclined to suggest that charter lottery participants have more motivated parents, and that is why they are more successful than their “twins”. But conveniently overlook the fact that the CREDO matching process includes test scores as one of the elements for constructing the virtual control record.

        As a particularly egregious example of similarly defective analysis, funded by teachers unions, and apparently attempting to undercut the CREDO results that had demonstrated mildly positive results for Chicago charter schools, Orfield and Luce wrote:
        “However, there are reasons to worry about the CREDO results. The studies match charter students with traditional school students based on race/ethnicity, gender, English proficiency, free/reduced price lunch status, special education status and grade level. The method is designed to control for selection bias by creating a control group like those used in randomized experiments, but the list of matching variables does not include anything that reliably captures parental engagement, a primary source of selection bias in charter studies (Maul, 2013; Miron & Applegate, 2009).” Orfield, M.,& Luce, T. (2016). An analysis of student performance in Chicago’s charter schools. Education Policy Analysis Archives, 24(111).
        http://dx.doi.org/10.14507/epaa.24.2203

They mention six of the CREDO variables. But conveniently omit the seventh (though Maul had certainly acknowledged it). The seventh variable that they omitted? Baseline test scores. If test scores don’t “reliably capture parental engagement,” their argument falls apart. And if they do, their argument falls apart.

  8. Richard Whitmire does not like to be challenged.
    When challenged on Michelle Rhee and her story, he started to compare critics of Rhee (and him) to the birthers on Obama.
