Reformers love A-F school ratings. They believe that giving a school a single letter grade will help parents make the best ‘choice’ of where to send their child. The idea is that these ratings will improve schools in two ways: 1) When a school gets an F rating, the staff will stop being so lazy and negligent so they can raise their rating to passing, and 2) Parents will ‘vote with their feet’ to ‘escape’ the ‘failing’ school, which will cause that school to be closed down for under-enrollment.
New York City used to have such a system, and when the new mayor was elected I had a meeting with the accountability team at the NYC Department of Education to share my feelings about the A-F ratings. I said that I did not like them because they were based on some very sketchy statistics, and that these crude calculations could improperly label a school as failing, which would then become a self-fulfilling prophecy as the most motivated families fled that school. I was pleased to see, a year later, that New York City did abandon the A-F rating system.
But throughout the country, the A-F system is spreading. Under the Every Student Succeeds Act (ESSA), states are required to identify the bottom 5% of schools. ESSA doesn’t demand an A-F system, but there definitely does have to be some sort of ranking to determine which schools are in the bottom 5%, which is effectively the F of an A-F rating.
In 2015 Texas passed something called HB 2804, which said that by 2018 all Texas schools and districts would get A-F ratings in several different categories as well as a single letter grade overall. The state recently did a ‘test run’ to show what grades all the schools and districts in Texas would get right now under this system.
Each school and district was graded on four domains, all based on standardized tests. The first domain is ‘Student Achievement,’ which is basically percent proficient on the state tests. Reformers always talk about how the most important measure of a school is not so much the test scores as the ‘growth.’ So the second domain is called ‘Student Progress,’ and this is the mythical one that tries to isolate the school’s impact on its students. I think it isn’t a bad idea to try to find a way to calculate this accurately. I’m not convinced, though, that they have found a reliable way to do it. Still, reformers often say that though these measures are not perfect, we shouldn’t let the perfect be the enemy of the good, or something like that, and they have been using these metrics to shut down schools and fire teachers with abandon.
Reformers always talk about expanding ‘high quality’ charters. And one of the most famous examples of such a high quality charter chain is the KIPP network of schools. There are about 200 KIPP schools around the country, and surely there will be many more now that a very charter-friendly president has been elected. KIPP was founded by two Teach For America alumni, and its schools are staffed by a large number of TFA teachers and alumni.
When I saw that Texas had released the trial run of its new A-F system, I checked to see how the KIPP schools did, particularly on the ‘Student Progress’ domain. What I found, and you can double-check this here, is that out of the 37 KIPP schools in Texas that received a grade in this domain, 9 of them, or about 25%, received an ‘F’ in student progress.
I thought that maybe this was one of those things where a lot of schools got an F in this domain, so I looked at the 280 Houston Independent School District schools and found that only 34, or about 12%, got an F in ‘Student Progress.’ So the percentage of ‘failing’ KIPP schools is roughly double the percentage of ‘failing’ schools in the biggest district in Texas.
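The comparison is easy to verify. Here is a minimal Python sketch using only the counts reported above (9 of 37 KIPP schools, 34 of 280 HISD schools); the variable names are my own:

```python
# F rates in the 'Student Progress' domain, from the counts in the post.
kipp_f, kipp_total = 9, 37      # KIPP schools in Texas with an F / graded
hisd_f, hisd_total = 34, 280    # Houston ISD schools with an F / graded

kipp_rate = kipp_f / kipp_total  # ~24.3%, i.e. about 25%
hisd_rate = hisd_f / hisd_total  # ~12.1%, i.e. about 12%

# KIPP's F rate is roughly twice HISD's.
print(f"KIPP: {kipp_rate:.1%}, HISD: {hisd_rate:.1%}, "
      f"ratio: {kipp_rate / hisd_rate:.2f}")
```

Note that this is a comparison of rates, not raw counts: HISD, being far larger, has more F-rated schools in absolute terms (34 versus 9), but a much smaller share of them.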
I wonder what the reformers would think of this? Would they say that the progress ratings are not accurate, thus disparaging the most powerful tool they have for closing schools and firing teachers? Or would they say that KIPP is not a ‘high quality’ option that deserves to be expanded? Likely they will just ignore it. Things like A-F ratings are good for labeling district schools as failures, not major charter networks.
Thanks for this good work. I am always eager for more information that might explain the difference in the so-called “growth scores” of the schools rated A or B versus those rated F. This is the frustrating part of the outcomes-only rating games and the demands for versions of value-added scores and ratings, ratings that are not even relevant to the many, many teachers whose students do not generate scores on state-wide tests.
What’s the percent statewide, Gary?