Why does the ‘worst’ NYC high school have better SATs than the ‘best’ NYC high school

My father, the occasionally high-profile defense attorney Ronald Rubinstein, graduated from Samuel J. Tilden High School in Brooklyn in 1956.  Fifty years later, the NYC DOE began the process of phasing out the school and replacing it with three small schools on what is now the Tilden Educational Campus.

The NYC DOE recently released the 2012 high school progress reports, and my father was proud when I told him that one of the schools in his old building was rated as the ‘best’ high school in all of New York City, a school called ‘It Takes A Village Academy.’  Then I had to tell him the rest of the story.

There are now around 450 high schools in the five boroughs of New York.  About 100 of them didn’t get progress reports at all, for one reason or another.  Of the remaining 350, I was disappointed, but not surprised, that the school I teach at, Stuyvesant High School — often on lists of the top schools in the entire country — was ranked as the 59th best school in the city.  This, despite the fact that nearly 25% of our students get accepted to Ivy League schools, that we had the most Intel semi-finalists in the state, and that our average SAT scores are around 2100.  Our rivals, Bronx Science and Brooklyn Tech, were ranked 31st and 139th respectively.

I decided to take a close look at the progress report for the top-rated ‘It Takes A Village Academy’ and found that the progress reports are even more of a sham than I had determined from my earlier analysis of them.  This year, to the credit of the DOE, the progress reports have much more detailed information about how students actually did on different tests.  Page 8 of the progress report shows this:

So this school, which got an A not just in ‘progress’ but in ‘performance’ too, has horrific Regents scores.  And as far as ‘rigor’ goes, notice that while 139 students took the Algebra Regents, a test that 8th and 9th graders generally take, only 80 took Geometry, and 24 took Algebra II / Trigonometry.  Their ‘college readiness’ stats show that few students got over 75% on even the highly curved Algebra test.  Also notice how few students even take any science Regents.

But the really amazing numbers here are their SAT scores.  Though only 55% of the students in the cohort took the test, presumably the top 55%, their scores were 325 in math, 325 in reading, and 345 in writing.  To put these scores in perspective, you get about 300 in each section for writing your name.  According to the College Board, the scores for these students are in about the 5th percentile.
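To see why a nearly blank paper still yields about 300, here is a minimal sketch of the old (pre-2016) SAT raw-score arithmetic: +1 point per correct answer, −1/4 per wrong answer, nothing for omits.  The quarter-point guessing penalty is real; the conversion from raw scores to the 200–800 scale varied by test date, so no scaling table is assumed here.

```python
def raw_score(correct: int, wrong: int) -> float:
    """Old SAT raw score: +1 per correct answer, -1/4 per wrong answer.
    Omitted questions contribute nothing either way."""
    return correct - wrong / 4.0

# A blank section scores a raw 0, which mapped to roughly 300 scaled:
print(raw_score(0, 0))    # 0.0

# Answering only a handful of questions, and getting them right,
# is enough to move a very low scorer well off the floor:
print(raw_score(5, 0))    # 5.0

# Guessing wildly can push the raw score below a blank section:
print(raw_score(3, 30))   # -4.5
```

This is why, for a very low-scoring student, attempting a few questions carefully beats guessing across the whole section.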

I also noticed that the Harlem Children’s Zone Promise Academy I High School was ranked in the 99th percentile as the 5th best school in the city.  HCZ is known for being not just a school, but an ‘anti-poverty’ program where millions of dollars are spent providing health clinics and other ‘wraparound’ services.  I definitely approve of this comprehensive approach, but based on their scores, I do not think we can declare them a ‘miracle school’ just yet.

Their Regents scores are better, but notice that they have 0 students taking Geometry, and hardly any taking any science.  They don’t have their students take the SAT, but their ACT scores are in the 52nd percentile for math, 24th percentile for English, 19th percentile for reading, and 24th percentile for science.  Definitely better than the #1 rated ‘It Takes A Village,’ but I really wonder if these scores help to support the case that ‘poverty is not destiny.’

I also thought it would be interesting to see what the statistics were for the lowest rated high school, the ironically F-rated ‘Academy For Social Action:  A College Board School.’  This school is one of the 24 schools now slated for closure.  While their Regents scores were extremely low, I noticed that they actually had higher SAT scores than the top rated school.  They had a 373 in math, a 362 in reading, and a 356 in writing, which are about the 10th percentile.  Only 37% of their students took the test, so it is hard to make an exact comparison to the 55% at the other school, but this is still pretty strange.

The ‘Harlem Village Academy’ High School run by Deborah Kenny actually got a C rating though, for unknown reasons, their progress report is not available online.  The KIPP high school has not gotten a rating yet since they are just having their first graduating class this year.

Now of course I don’t take these low test scores to mean that these are necessarily ‘bad’ schools or that they should be closed and their teachers fired.  But still it is pretty amazing that schools with such low achievement can be ranked so high on any metric.  And for other schools to get closed on the same crude metric is shameful.

This entry was posted in Research.

22 Responses to Why does the ‘worst’ NYC high school have better SATs than the ‘best’ NYC high school

  1. Wayne Gersen says:

    My hunch: progress is measured by change from one year to the next. If your HS has lots of high scoring students, there isn’t much “headroom” for them to show “improvement” (i.e. if you have kids scoring 700 on SATs it’s far more difficult for them to “improve” than kids scoring 400). This is one of many flaws with VAM: it overvalues improvements toward the mean and undervalues improvements toward the top of the scale. My guess, without looking at the data, is that schools rated near the top of the NYC metric were ranked low the prior year and vice versa.

  2. How can anyone have faith in a system where the correlations between measures are so pitifully bad?

    This is a failure at the top. Tweed and the Bloomberg administration should hang their heads in shame at their miserable performance. Unfortunately, mayoral control all but assures no one will be held to account.

    As always, Gary, great job.

  3. I’d love to see Harlem Village Academy data; any way you can get it from the NYC charter office, Gary?

  4. Pingback: NYC’s Meaningless High School Report Cards « Diane Ravitch's blog

  5. Doug says:

    Why don’t more people compare regents scores to the state average? You have % passing, but what does that mean? How hard are the exams?

  6. Gideon Stein says:

    Gary, the system is, as you know, weighted towards rewarding growth. If it were about absolute scores, of course places like Stuyvesant, Bronx Science, etc. would be at the top.

    • Gary Rubinstein says:

      Well, 60% is based on ‘growth,’ I know that. Still, what does it mean when the high school with the most ‘progress’ still turns out kids in the bottom 5%? Do you think if these two schools swapped kids, the results would be much different?

      • Tami C. Elton says:

        Hello, Mr. Rubinstein~
        I do not teach in NY, but in FL. However, I am so disenchanted and dispirited with, not to mention disenfranchised from, the education system within which I continue to plod that I am keeping tabs on happenings in other states, as well as my own.
        That being said, it would be a great service to all education services if your suggestion of swapping kids could be achieved. Of course, that would never happen, but it is worth pondering.

        Tami C. Elton
        Palm Bay, FL

  7. Gideon Stein says:

    Would need to look at last year’s data to have even a slightly informed guess. That said, growth is at the top of what I’m focused on with my turnaround schools (especially since we start with the whole school, not just with ninth grade).

  8. Michael Fiorillo says:

    “But still it is pretty amazing that schools with such low achievement can be ranked so high on any metric. And for other schools to get closed on the same crude metric is shameful.”
    – Gary Rubinstein

    “Tweed and the Bloomberg administration should hang their heads in shame at their miserable performance.”
    – Jersey Jazzman

    Would that were so, but their behavior since Day 1 strongly suggests that they are incapable of feeling shame. This is class and institutional malice on a pathological level.

    Too many people are still looking at the wrong measurements for the success of corporate education reform. If you’re looking for the educational, emotional and social needs of all children to be met, you’re doomed to bitter disappointment, because that’s not what these people are about.

    But if you see their true “success” as measured by how effectively they can destabilize, dismantle and auction off a democratically-run education system, they’re succeeding all too well.

    The problem is, their “success” means misery and repression for students and teachers, tremendous spoils for politically juiced insiders, and long-term social devolution for the country.

    If only shaming these people was enough.

  9. Michael Paul Goldenberg says:

    Come on, Gary, not the old saw about getting points for signing your name. The SAT is on a 600 point scale for each section that runs from 200 to 800. So no one gets 300 “just for signing his name.” Of course, the scores you cite are awful, but that’s a different issue. No need to look at the scaled scores to see that: just look at the percentile rankings they represent and leave the old wives’ tales about getting points for signing your name to the old wives and their old husbands.

    • Gary Rubinstein says:

      Most people don’t realize that the lowest possible score on a section is 200, that you get about 300 for ‘writing your name,’ and that your score can actually drop lower if you answer questions and get too many wrong, since each wrong answer carries a quarter-point guessing penalty. It is important to point out, since some people could think “well, at least they didn’t get a 0.” Others might think that 300 is better than half of 500, which is a bit better than average, and conclude that 300 is around the 25th percentile. But it is accurate to say you get 300 for signing your name. I’ve tutored very low scoring students who wanted to get 400s, and I’ve given them strategies for answering just 5 questions per section (and getting them right) to achieve that.

  10. Paul Heymont says:

    It seems to me that one of the big problems with this kind of discussion is this: The widespread, and rightful, distrust of everything that comes from Tweed and the like bends discussion into pretzels.

    Take the discussion here about the Progress Reports. DOE has never asserted (although the News and the Post have) that they are a ranking of best school/worst school.

    They are a ranking of how well a school did on certain measures, compared with others in its peer group…and each school has its own peer group of 40 schools whose students have similar profiles when they enter. Two-thirds of the school’s comparison comes from that group, one third from a comparison with city-wide results.

    For that reason, the entire media-driven comparison of who got the most points is meaningless. ITAVA got its A by showing progress on the numbers DOE values…not by “beating” Stuyvesant.

    Whether or not you think the right measures have been chosen (I don’t) and whether or not you feel the entire Progress Report hoopla is worthwhile or valid (I don’t), please don’t distort what it is, or turn it into a debate on whether students who enter high school (shame on the system!) reading at 4th or 5th grade level should be taking as many science Regents or having SAT scores as high as those who test into Stuy/Tech/Sci.

    Let’s concentrate our fire on the non-accountable honchos and the profit system behind them that love it when we attack each other instead of fighting the daily attacks on students and teachers.

    • Gary Rubinstein says:

      True, these reports are supposed to measure ‘progress’ but what I question is whether this school has truly ‘progressed’ its students so much better than another school would have if after four years they can only get 5th percentile on the SAT. Now I’m not one to read too much into test scores, but to me this isn’t a lot of progress no matter what the starting point was.

  11. A. Evans says:

    It’s interesting that edu-crats are clamoring for more standardized testing when they blatantly seem to disregard the actual results of the tests they already administer.

    Thank you for such an incisive and compelling analysis of the inequalities and arbitrary judgment based on these findings.

    So what conclusion can we draw from this? That businessmen in charge of schools disregard facts and statistics if it doesn’t fit their political agendas?

  12. Paul Heymont says:

    “…what I question is whether this school has truly ‘progressed’ its students so much better than another school would have if after four years they can only get 5th percentile on the SAT.” Well, in fact, that IS what it’s supposed to measure…the “another school” being the peer group of schools whose students, by DOE stats, start out in the same place as theirs.

    That’s why the peer group rankings are likely more useful than the overall hash, which stirs in 1/3 comparison with city-wide numbers. There would be a lot less confusion and heat–although possibly not much more value–if the PR were reduced to ONLY reporting how each school compared to its peers. If that were the case, we’d have no more cases of widespread belief that it’s a city-wide top-to-bottom ranking.

  13. Paul Heymont says:

    I notice that in my two posts above I have referred to a 2/3 v 1/3 split between peer and city measures, but that’s actually from long ago. The split is 75%/25%…but still not a useful construct.
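The 75/25 peer-versus-citywide split described in the comment above can be sketched in a few lines. The weights come from the comment; the percentile inputs below are invented purely for illustration.

```python
def progress_score(peer_pct: float, citywide_pct: float,
                   peer_weight: float = 0.75) -> float:
    """Blend a school's peer-group percentile with its citywide
    percentile, weighted 75/25 as the comment describes."""
    return peer_weight * peer_pct + (1 - peer_weight) * citywide_pct

# A school that dominates its peer group of similar schools...
improving_school = progress_score(peer_pct=95, citywide_pct=10)

# ...can outscore a school with stellar citywide numbers that sits
# mid-pack among its own (also high-performing) peers.
elite_school = progress_score(peer_pct=40, citywide_pct=99)

print(improving_school)  # 73.75
print(elite_school)      # 54.75
```

Under a weighting like this, a school with 5th-percentile SAT scores can plausibly outrank Stuyvesant without ever “beating” it on any absolute measure.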

  14. David Shulman says:

    My mother had a high school diploma. When I was in Biology class I told her about a lab I had just taken. Her words then are so prophetic now: “When you slice up crap, no matter how you cut it, you still have crap.”

    These scores are like my lab, and the results are as my mother described. It’s the exercise itself that MUST be stopped. These “measures” are killing public schools, as they were intended to. Shame on all the psychometricians and educational statisticians who didn’t call out the Bush Administration when this “crap” started—-SHAME!! And more shame on the Obama Administration for continuing this destruction! In NYC, the MOST shame (award) goes to Bloomberg for putting clueless, inexperienced, unqualified people in charge of schools, from Tweed to the Principals.

  15. Alan Bergstein says:

    Obviously, PC has to be involved in these “magical” ratings. In order to get into (my alma mater) Stuyvesant, a test has to be passed. What entrance requirements are there for, and what is the great demand to enter, these other “schools”? A sham of a system!

  16. Ken Bernstein says:

    There is an old trick of ranking. First determine what you want the results to be. Then find or invent a metric that demonstrates those results. Any ranking that does not have all three of Brooklyn Tech, Stuyvesant and Bronx Science at least in the top 10 should be viewed with a great deal of skepticism. It seems like the Shock Doctrine in action – persuade the public to devalue what they have long thought to be excellent schools as a means of totally dismantling public education and turning it over to others. And those of you in Scarsdale and Mamaroneck (from which I graduated in 1963), don’t be smug or smirk – you are next in their sights.

  17. Are the amazing ratings of not so amazing schools coming from the same people who base their measures/ratings largely on how swiftly and completely the schools/districts are submitting themselves to ridiculous and onerous reform and evaluation (as opposed to measures primarily of programs/practice/pedagogy/performance)? In other words, does the big prison bull give protection to his most favorite soapy shower friends, punishing those who resist?
