The Alum-lie Spreads

Two weeks ago, The 74 published the results of The Alumni, a report that claims to show that graduates of certain charter school networks go on to complete college at 3 to 5 times the rate of low-income students overall.

They say that only 9% of low-income students graduate college within six years, while among the nine charter networks they studied, graduates had college completion rates ranging from 25% (approximately 3 times 9%) to 50% (approximately 5 times 9%).

The problem with this calculation is that the charter schools are only counting students who completed 12th grade at that school (or, for KIPP, 8th grade).  So if a school has only 14 graduates and 7 of them finish college within six years, it is accurate that 50% of their graduates went on to complete college.  But if that cohort of 14 students was 40 students three years earlier, then their rate is really 7/40, or 17.5%.  In other words, by counting only the 'graduates' they get an inflated college completion rate.
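To make the arithmetic concrete, here is a minimal sketch in Python using the hypothetical numbers from the example above:

```python
# Hypothetical numbers from the example above
college_grads = 7
hs_graduates = 14      # students who finished 12th grade at the school
entering_cohort = 40   # the same cohort three years earlier

reported = college_grads / hs_graduates     # 0.50  -> "50% of our graduates"
adjusted = college_grads / entering_cohort  # 0.175 -> 17.5% of the entering cohort
print(f"reported: {reported:.1%}, attrition-adjusted: {adjusted:.1%}")
```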

In the original The Alumni article, the author, Richard Whitmire, admitted as much.  He even included a comment from the KIPP network about how the other schools should use 8th grade as the cutoff so they don't get unfairly inflated percentages.  I argued in my first post about this that 5th grade would be an even more accurate cutoff.

In a follow-up article on The 74 called The Data Behind The Alumni, the case is made even more strongly:

The one network that insists on including students who leave the system is KIPP, which reports its college success data starting in ninth grade for students new to the KIPP system and at the end of eighth grade for existing KIPP students. YES Prep, part of the United for College Success Coalition in Texas, has promised to start calculating its college success data from ninth grade, but no figures are yet available.

All the other networks start their data set in 12th grade — and say they don’t have data that begins in ninth grade. KIPP takes a principled stand on that issue, refusing to release any results that start the tracking in 12th grade, despite the fact that it would boost its college success rate.

Within the charter community, this is turning into a hot-button issue. KIPP feels very strongly that the only honest method for reporting graduation is to start in ninth grade. In theory, a charter network could increase its college success numbers by pushing or counseling out weak students before their senior year. That would apply to any high school, not just charter high schools.

You’ve got to love the part about how the other networks “say they don’t have data that begins in ninth grade.”

Also on The 74, and again to their credit, they published something by the chief executive officer of YES Prep.  He first brags that the class of 2010 had a 54% college completion rate, but then, a few paragraphs later, admits this about that same cohort:

When I was principal of YES Prep North Central in 2010, only 34 percent of our founding sixth-grade class went on to graduate from our campus. This unacceptably low persistence rate, a symptom of a “no-excuses” culture, needed to be addressed to align with our mission of increasing the number of students from underserved communities who graduate prepared to lead.

Suddenly the 54% turns into 54% of 34%, which is just 18% college completion.

He says that from now on they are going to use the 8th grade cutoff like KIPP, though of course he should be pushing it to 6th grade based on what he just said about their huge attrition numbers.

Another interesting point in that follow-up article is that a fairer comparison would use not the 9% figure but the percentage of low-income high school graduates who go on to complete college within six years.  To their credit, The 74 does say that this is actually 15%, to which they then say:

Even if as many as 15 percent of low-income minority students who make it through high school earn college degrees, that means these top charter networks are still doing three and a half times better.

So suddenly it goes from 3 to 5 times better to 1.7 to 3.3 times better.  This matters, since surely the 3-to-5 number is the one that will be remembered by people who read just the first article or one of the editorials Whitmire has published in The New York Daily News, The Wall Street Journal, and, most recently, The Hill.  There was also a report about The Alumni by another writer in The Houston Chronicle.

So after reading Whitmire’s piece in The Hill, which makes no mention at all of the issues with the calculations, I reached out to him on Twitter and we had this amusing exchange.

[Screenshots: Twitter exchange with Richard Whitmire]

And that was it.  I have no idea what his last tweet means.  I think Trump might have written it.  Why would I back off if I found flawed data?  I just don’t get it.  This is why most reformers don’t engage with me.  In any kind of debate they have with me on equal terms, I absolutely embarrass them.


What Happened To The Math Regents? Part IV

Surely not every question on a New York State math Regents should be completely straightforward.  You want a test with a mix that includes some challenging questions so that students prepare for a test that will require a lot of thinking.  But a challenging question should not be challenging just because it is poorly worded or ambiguous.  In examining the recent Algebra II Regents, I have found way too many examples of ‘bad’ questions.

What is the appropriate number of ‘bad’ questions?  If by ‘bad’ you mean poorly worded, confusing, ambiguous, or mathematically inaccurate, then there should be none.  How many ‘challenging’ questions there should be, I suppose, is a matter of taste.  For me, these extra ‘challenging’ questions belong at the end of a unit test on a topic, not as 2-point multiple-choice questions on the Regents, though I suppose I should expect one or two questions that, in my opinion, are too difficult, if the goal is to accurately assess how well students in New York State understand Algebra II.

But I’m absolutely sure that when a curve is set EVEN BEFORE THE TEST IS ADMINISTERED, declaring that a 30% will be curved up to a 65, that is a serious issue.  Making a test too hard, whether it is the good kind of hard that requires a lot of genuine mathematical thinking or the bad kind that comes from poorly worded, ambiguous, or otherwise ‘bad’ questions, is a mistake for a Regents.  The Regents should be reasonable and should need no curve whatsoever.  That should be one of the first goals in the creation of the test.

This is my last post about this particular exam, though I could do a similar thing for the Algebra I test and others, particularly Patrick Honner, have done a great job on the Geometry test.

[Screenshot: Regents multiple-choice question discussed below]

One problem with multiple-choice questions is that there is no partial credit for getting the ‘second best’ answer.  So especially for a Regents test, you want to make the questions pretty clear; otherwise you can’t distinguish a student who knows most of the material from a student who knows little of it.  For this question, the part about “If she plans to run the unit for 3 months out of the year” is an unnecessary twist.  I know that test makers sometimes include extraneous information to check whether a student can distinguish the relevant from the irrelevant, but in this case I could see a student thinking this is relevant information, using it to create an equation (which already has an extra twist, since the question asks for cost per year instead of the typical total cost), and still getting no credit.

For a unit test on just this topic, I wouldn’t have a problem with this being one question out of 15 or so, just to make it tricky to get a 100.  But for a Regents exam, it is not good test making to throw in this extra piece of irrelevant information with no opportunity for partial credit for a student who otherwise works through the question correctly.

[Screenshot: Regents question on the savings account / finite geometric series]

This is an example of how a question can be simultaneously ‘interesting’ and also a ‘bad’ Regents question.  This is a great application of the finite geometric series problem, and it is something that I do with my classes.  But it is something that works well as a twenty minute activity, not as something that you try to do during a Regents in the multiple choice section.  If students have not been exposed to this exact question, this would be a big challenge to decipher in the moment during the test.

And even if they really want to include this question, because it is ‘rigorous’ for students to apply the formula to an unfamiliar situation like this, the wording is terrible.  Earlier in the test there is a probability question that takes an entire page and is still confusing.  In this one, the explanation is way too short.  It is also ambiguous, since the answer depends on when she puts the money in and when she eventually takes it out.  It seems that she deposits the money at the beginning of each month, but she must withdraw it immediately after her last deposit, since if she waited until the end of the year there would be more money in the account.  If I needed to give this question, I would say:

“Jasmine decides to open a savings account.  On January 1st 2017 she puts $100 into the account.  The account earns 3% a month compounded monthly.  On February 1st 2017 she puts another $100 into the account.  She continues this until December 1st 2017.  Right after her final deposit, she withdraws all the money in the account.  Which expression is equivalent to the amount of money she withdraws?”

Again, this would still not be a great question, since it has way too much going on in it.  It could be made more reasonable by having her make a deposit once a year for 12 years instead of once a month for 12 months, still compounding, but without having to convert an annual interest rate to a monthly one.
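Under the assumptions in my rewrite ($100 deposited at the start of each month, 3% interest per month, withdrawal right after the December deposit), a quick sketch confirms what the sum of the finite geometric series gives:

```python
rate, deposit, months = 0.03, 100, 12

# Direct simulation: interest accrues on the balance, then the next
# deposit is made; the withdrawal happens right after the 12th deposit.
balance = 0.0
for _ in range(months):
    balance = balance * (1 + rate) + deposit

# Sum of the finite geometric series: 100 * (1.03**12 - 1) / 0.03
formula = deposit * ((1 + rate) ** months - 1) / rate

print(round(balance, 2), round(formula, 2))  # both 1419.2
```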

Ironically, since her deposits alone total $1,200, if students were to just type the four answer choices into the calculator, they would find that only choices (3) and (4) come out to more than $1,200, so those are the only reasonable answers here anyway.

So it’s an interesting problem that makes for a good classroom activity, but not good for testing the sum of a finite geometric series formula on the Regents.

[Screenshot: open-ended Regents question]

This is one of the open-ended questions, and it basically re-tests what was already tested in multiple-choice question 16.  I think I’d rather have seen question 24, the compound interest one, made into an open-ended question, with a better multiple-choice question in its place.

[Screenshot: 4-point Regents question on the mortgage formula]

The 4-point questions are an opportunity to ask something more interesting, where students can demonstrate their thinking.  This question is just an exercise in typing very carefully into a calculator.  The bigger problem is with the second part, since the thing it asks for, the down payment, is not part of the formula at all.  Students would have to be familiar with the idea that when you take out a mortgage, you sometimes make a down payment first to cut money off the principal and lower the mortgage payment.  This is a pretty sophisticated idea, and one that is not explained very clearly in the statement of the problem.  The idea of a ‘down payment’ is just slipped in there and very easy to overlook.  Personally, I read this question too fast the first time, didn’t even notice the ‘down payment’ part, and answered the question I thought they were asking, which was what principal would lead to an $1,100 payment, so I would have lost at least a point there.
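The question itself isn’t reproduced here, but problems of this type typically use the standard fixed-rate amortization formula.  A minimal sketch of the down-payment idea, with made-up numbers (the figures in the actual question surely differ):

```python
def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate mortgage (amortization) formula."""
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

price, down_payment = 180_000, 20_000   # hypothetical numbers
print(round(monthly_payment(price, 0.045, 30), 2))                 # no down payment
print(round(monthly_payment(price - down_payment, 0.045, 30), 2))  # down payment cuts the principal
```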

I don’t know who the team is that made this test.  I’d hope that whoever assembled the team and gave them direction is held ‘accountable’ and not permitted to do this for future exams.

Making a good Regents exam is not a task to be taken lightly; too many people are depending on you to do it right.  I don’t know if New York State really cares that much about how good the test is.  The curve makes it so that roughly the same percentage of students pass this ‘more rigorous’ test, so in that way it keeps politicians happy even if it is really frustrating for the students and teachers who worked hard all year expecting a fair test.

I know a lot of people, including myself, who could single-handedly make better math Regents exams than we are currently seeing.  If whoever is responsible for putting together the team that created this round of Regents exams ever wants my advice on defining a better philosophy of what the tests should be like, or on the selection process for the team that writes the questions, or wants to let me critique a draft of the test like this before it is finalized, I’d be willing to participate.

 


What Happened To The Math Regents? Part III

The New York Regents exams are the standardized final exams for high school students in New York State.  For many years, the math Regents were very well-made tests that most teachers I know seemed to like.  But lately the math Regents have been poorly constructed tests.  As much as I’d like to blame this on the Common Core, it actually has nothing to do with the Common Core and everything to do with whoever has been tasked with writing these tests.  In two recent posts I’ve analyzed some questions on the June 2017 Algebra II Regents, and I will continue with more examples here.

[Screenshot: Regents question on an exponential population model]

Modeling real-world scenarios with exponential functions is a very interesting topic that takes several weeks of the course.  There are so many good ways to assess whether students understand what the different numbers in an exponential equation mean.  This Roman numeral format, with two Roman numerals and the four choices (I), (II), Both, Neither, is very lazy test making.  Basically they have turned this into two True/False questions for which you either get full credit for getting both or no credit even if you know one of the two.

Roman numeral I is unnecessarily confusing.  When there is a number like 110 in front of the e, it is generally the ‘starting point.’  So in this case it means that when t=0, that is, 0 years after 2010, the population was 110 million.  But look at Roman numeral I.  Rather than saying the population in 2010 was 110 million, it says ‘The current population,’ which can throw people off, since ‘current’ could just as easily be read as 2017, when the test was given, as 2010, the model’s starting year.  It is poor wording for a question that still wouldn’t be great even if it were worded correctly.

For Roman numeral II, I think the ‘approximately’ can lead to extra confusion as well.  Had they made the exponent 0.039t, they wouldn’t have needed the ‘approximately’ at all.  But if they want to use 0.03922, they should say what it should be rounded to; otherwise it becomes a matter of opinion whether 3.922% is approximately 3.9%.

A better question would be to have the students select from different functions to say which makes the best model after being given the starting population and the growth rate.  Or they could make four choices and ask something like “What does the 0.03922 represent?” or “What does the 110 represent?”.
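For reference, here is the model as I read it from the question, with t in years after 2010 (the exact constants are in the screenshot, so treat these as assumptions):

```python
from math import exp

def population(t):                 # t = years after 2010; result in millions
    return 110 * exp(0.03922 * t)

print(population(0))   # 110.0  -> the 110 is the 2010 population
print(population(7))   # ~144.8 -> what the model says about 2017, the "current" year
```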

[Screenshot: Regents question on a rational equation]

I know that many people reading this are not math teachers, but I can tell you from a math teacher’s point of view that this is an awful way to test this topic, rational equations.  It is the only question on the entire test about rational equations, a topic that must take two weeks or more when you include the various word problems that go along with it (like the ones where two people paint a house together).

The equation itself is fine for this.  But instead of just asking for the solution set, which requires understanding that one of the two apparent solutions must be rejected, they do something extremely odd: they tell the students what the first step of the solution is.  There are actually two ways to solve this problem (you could combine the fractions on the left and ‘cross multiply’), and it shouldn’t matter which way a student chooses, since either way leads to one actual solution and one extraneous solution.

Choice (3) is what’s known as a ‘distractor,’ something that looks tempting to someone with a partial understanding of the topic.  They might assume that just because 0 makes one of the terms undefined, it must be an extraneous solution, even though 0 never comes up as a candidate solution in the solving process.

I think in all my years of teaching this topic, and in making and seeing tests on it, I have never seen it tested in just this way.  Perhaps if you have a 10-question test on just this topic, most of it pretty straightforward, and you want one question like this so that getting a 100 is tough, then I don’t have a problem with it.  But when, on a statewide Regents exam, this is your only question about the unit and it is an all-or-nothing multiple-choice question, that is bad test making.
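For readers who haven’t taught this: the standard routine is to clear the denominators, solve the resulting polynomial, and then reject any candidate that makes a denominator zero.  A minimal sketch with a hypothetical equation (the actual Regents equation is in the screenshot):

```python
from sympy import symbols, Eq, solve

x = symbols('x')

# Hypothetical rational equation: x**2/(x - 3) = 9/(x - 3).
# Clearing the denominator (x - 3) leaves x**2 = 9.
candidates = solve(Eq(x**2, 9), x)          # [-3, 3]

# Reject any candidate that makes a denominator zero:
solutions = [c for c in candidates if c != 3]
print(solutions)                            # [-3]; x = 3 is extraneous
```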

[Screenshot: Regents question on an arithmetic sequence]

There is no reason to give the f(9)=-2 unless you are trying to force students to calculate the d value by doing -2 = -8 + d(9-1), but why would anyone do that when it is clear that d=0.75 just by looking at the sequence itself?

Using the formula, you would get f(n) = -8 + 0.75(n-1), equivalently f(n) = -8.75 + 0.75n, which looks a lot like choice (2), though choice (2) is not correct since there is a minus sign instead of a +.  Personally, I would have written choice (3) in the f(n) = -8.75 + 0.75n form instead of the equivalent form that they have.

For a Regents (and maybe some people will say this is too straightforward, but I think it is the right way to test this topic), ask: what is the 1000th term of the sequence -8, -7.25, -6.5, -5.75, …?  Or, if you want to make it more difficult: in the sequence -8, -7.25, -6.5, …, which term number equals 315.25?  Since this is pretty much the only question on a topic that takes four or five days, you’ve got to keep it straightforward on the Regents.  Being able to answer the two questions I propose is a true indication that you understand this formula and how to apply it.  As this question stands, it is quite possible for a student who would get something like an 80% on a full test of this topic to get zero out of two points, the same as a student who would have gotten a 0%.
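For the record, both of my proposed questions come straight out of the explicit formula:

```python
# f(n) = -8 + 0.75*(n - 1) models the sequence -8, -7.25, -6.5, ...
def f(n):
    return -8 + 0.75 * (n - 1)

print(f(1000))                    # 741.25 -> the 1000th term

# Which term equals 315.25?  Solve 315.25 = -8 + 0.75*(n - 1) for n:
n = (315.25 + 8) / 0.75 + 1
print(n)                          # 432.0 -> the 432nd term
```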

[Screenshot: Regents question on average rate of change]

They should change the function to something like f(x)=x^2+2x, or something of that complexity.  The point here is to see whether students know how to calculate the average rate of change, which is something they should know how to do.  Instead, they chose an unnecessarily complicated function that even requires putting the calculator into radian mode (and I could argue that even this is ambiguous, since you could have degrees with a pi in them).

This turns the question into a test of how well students can manipulate the calculator, and a student who understands the concept of average rate of change but presses one wrong button out of the 40 or so that need to be pushed will get the same amount of credit, none, as the student who doesn’t know what average rate of change is.
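The concept being tested is just the slope of a secant line, which the simpler function I suggested would test directly.  A sketch (the interval [1, 4] is my own hypothetical):

```python
def avg_rate_of_change(f, a, b):
    """Slope of the secant line from (a, f(a)) to (b, f(b))."""
    return (f(b) - f(a)) / (b - a)

f = lambda x: x**2 + 2 * x                 # the simpler function suggested above
print(avg_rate_of_change(f, 1, 4))         # (24 - 3) / 3 = 7.0
```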

I think this question, more than any so far, reveals how this test making team just did not ‘get’ the idea of what a good test question is and why.

These four questions were actually consecutive questions on this test.  Each could have been improved or edited into a fine question.  If they had someone in charge who is an expert on making good test questions, maybe they wouldn’t have had to curve this test so that a 30% became a 65.


The Alum-lie

On the heels of the latest call by the NAACP for a charter school moratorium, there has been a media blitz, started by The 74, about a report called ‘The Alumni’ which claims that charter school graduates go on to graduate college at three to five times the rate of low-income students who do not attend charter schools.

Besides being reported in The 74, it has also appeared in The Wall Street Journal and The New York Daily News.

The 74 article is written by Richard Whitmire (as are the Daily News and WSJ op-eds), who is known for his biography of Michelle Rhee (haven’t heard much about her lately) and another about Rocketship charters (haven’t heard much about them lately).

The summary of the report says that they tracked students at nine charter networks and found that between 25% and 50% of those networks’ graduates also graduate college.  Since a commonly quoted statistic is that only 9% of low-income students graduate college, these networks seem to be getting three to five times that rate of college completion.

The major flaw in this report — and they admit this in The 74, but not in The Daily News (The WSJ is behind a paywall; if someone can read it, let me know if they address it there) — is that while the 9% statistic covers ALL students who enter school, the 25% to 50% numbers cover only the students who complete 12th grade at these schools (KIPP is an exception; they use data from students who complete 8th grade — I’ll get to that later).

In The 74 they actually ran an entire separate article explaining all the issues with the data that could make the numbers inaccurate, which I appreciated, though of course the big takeaway will still be the three-to-five-times figure, which will be quoted, I’m sure, for the next decade by charter zealots.

Now, there is really no way to verify these claims.  We don’t have a list of students who graduated and another list of which ones finished college.  But there are, thankfully, a few numbers in the report that can be analyzed.  I’ve done one network so far and hope to do more another time.

Here is a quote from The Alumni about the Uncommon Schools network:

Uncommon Schools: For the New York–based network, the only alumni who have reached the six-year mark graduated from North Star Academy Charter School in Newark. (The alumni from its Brooklyn high school just reached the four-year mark.) Of the 142 North Star students who reached the six-year mark, 71 earned four-year degrees: a 50 percent success rate.

I went to the New Jersey data page, where they have databases of enrollment at these schools over the years, and downloaded all the data from the 2010-2011 school year back to 1998-1999.

According to their data, there were eight graduating classes that have been out of high school for at least 6 years, the classes of 2011, 2010, 2009, 2008, 2007, 2006, 2005, and 2004.

The sizes of their graduating classes were, respectively, 24, 14, 36, 19, 27, 20, 23, and 19, which is a total of 182 students, not 142.  More importantly, going back more years, I found that these 12th-grade classes had lost about half their students since entering the school as 5th graders.  The 5th-grade classes for those cohorts were about 40 students each, or roughly 320 in total.  So if they really got 71 college graduates out of those 8 cohorts, the rate is not 71/142 = 50% but instead 71/320 = 22%.
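A quick sketch with the enrollment numbers above (the 40-per-cohort figure is my rough read of the state data):

```python
grad_classes = [24, 14, 36, 19, 27, 20, 23, 19]    # classes of 2004-2011
print(sum(grad_classes))           # 182 -- not the 142 the report uses

college_grads = 71                 # taking the report at its word
entering = 8 * 40                  # ~40 fifth graders per cohort = 320
print(college_grads / 142)         # 0.50  -- the reported "success rate"
print(college_grads / entering)    # ~0.22 -- the attrition-adjusted rate
```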

Yes, I know that 22% is still more than double the expected 9%, but there are other factors that might give students who apply to charters a higher rate.  Really, we can’t be sure there were 71 college graduates at all, but even taking them at their word on that, it is still dishonest to claim a 50% college graduation rate.

Now, the KIPP network claimed a 38% college graduation rate, but they argued that the other schools had inflated numbers because the other eight networks counted only students who completed 12th grade at their schools, while KIPP counted students who completed 8th grade.

Here is a quote from the article about this:

KIPP is a fervent believer that college graduation cohort data should be tracked from ninth grade — not 12th grade, the starting point that the other charter networks included in this study use.

For students who attend KIPP middle schools, KIPP tracks them when they graduate from eighth grade to ensure they are kept track of, regardless of whether they go to a KIPP high school.

For students who go to non-KIPP middle schools and start attending KIPP as high schoolers, they track them when they start ninth grade.

The problem with starting in 12th grade, argues KIPP, is that it could tempt schools to push out weaker students during high school years, thus allowing the stronger students to boost the schools’ college-going and college-completion rates.

KIPP may be right. But in The Alumni, where KIPP is the only network that is currently tracking students from ninth grade, we have decided it is important to share cohort graduation rates that start in 12th grade. What’s key to this series is learning what works in boosting that college graduation rate — lessons that could be passed along to all schools, not just charters. Moving everyone to the gold standard is the next step.

KIPP is correct that schools that count only students who complete 12th grade will have inflated numbers compared to KIPP, which counts students who complete 8th grade.  But what KIPP doesn’t mention is that the fairest way to make a comparison to the 9% number is to start counting at 5th grade.  KIPP actually has pretty big attrition between 5th and 8th grade, so the true ‘gold standard’ is really not used by anyone.  All the numbers are inflated.  KIPP’s may be inflated less than the others’, but it is still inflated.  They can complain that the others are gaming this statistic worse than they are, but they should admit that they are doing it too, just to a lesser degree.

There isn’t a lot of detailed data included in the report, just a summary of the main findings.  And I’m not so confident we can trust the data in there anyway, but it still is interesting to see how the numbers are inflated by ignoring attrition.


What Happened To The Math Regents? Part II

For the past few years in New York, the standardized math final exams for high school, known as ‘the Regents,’ have been curved so that a 30% was a passing score.  It didn’t used to be this way.  The relevant question is: are the uncurved scores so low because 1) the students don’t know the math, 2) the test was flawed, or 3) both?

We can’t get a definitive answer to that question because the tests were, in my opinion, very flawed.  If future tests are improved, maybe we will learn that the students didn’t know the math very well either, but until the tests are fixed, we won’t know.

Last post, I went through three questions that I felt could have been improved in various ways, and in this one I’ll examine some more.

[Screenshot: Regents question on the spinner simulation]

I think, as a rule of thumb, a 2-point multiple-choice question should not take an entire page.  This is one of the new ‘common core’ topics borrowed from AP Statistics.  The idea is that the results of a computer simulation of a fair spinner are compared to what happened with the actual spinner, and the comparison is used to determine whether it is likely that the actual spinner was fair.

The wording of the answer choices could cause someone who understands this topic to still get this question wrong and receive 0 out of 2 points, since there is no partial credit.  The double negative ‘The spinner was likely not unfair’ is one source of confusion.  Another issue is that both choices (3) and (4) reach the correct conclusion that the spinner is likely not unfair (or, put better, likely fair), yet a student would get no credit for choosing (4).  This would be better as a long-answer question where a student could get 2 points for the correct conclusion and 2 points for correct reasoning written out in their own words.

This is an example of trying to force something that should not be a multiple-choice question into that format.
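For readers unfamiliar with the topic, the simulation idea itself is simple.  A sketch with entirely hypothetical numbers (say a 4-section spinner landed on section 1 in 33 of 100 spins):

```python
import random

observed, spins, trials = 33, 100, 10_000
expected = spins / 4                  # a fair 4-section spinner: 25 of 100

extreme = 0
for _ in range(trials):
    hits = sum(1 for _ in range(spins) if random.randrange(4) == 0)
    if abs(hits - expected) >= abs(observed - expected):
        extreme += 1

# A large proportion means results like the observed one are common
# for a fair spinner: "the spinner was likely not unfair."
print(extreme / trials)
```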

Looking over this test, I get the sense that the test makers are worried about ‘regular’ multiple-choice questions, ones that may have been fine in the days when students were not permitted to use calculators, since students can now legally ‘cheat’ by testing the answer choices when they don’t know how to solve with algebra.  Maybe there should be a ‘no calculator’ section, like on the SAT, to preserve those ‘regular’ questions.  Or maybe there should be more free-response questions and fewer multiple-choice ones.

[Screenshot: Regents question involving a trig identity]

It’s not that this question is ‘unfair’ or ‘too hard,’ just that it is clearly written by someone who has no feel for math.  There is no reason to give the trig identity: there are two ways to solve this, and a student who does not have the identity memorized can use the other method.

Also, this is not a good question because it isn’t clear what the point of it is.  The question has trig functions in it, but it actually has nothing to do with trigonometry, since the only fact from trigonometry that it uses is given in the problem; it is more of an algebra problem.

It’s hard to explain to non-math teachers, but this question is kind of like mathematical gibberish.  It doesn’t really serve any purpose.

[Screenshot: Regents question on independent events]

This sort of question, where you have to check whether P(A)*P(B) = P(A and B), is a new topic added to the Common Core Regents.  It was always an application we could have taught, but it seemed so contrived that I’m sure few people did.  Knowing it can be on the test, it is a pretty easy thing to spend a period on, especially since it has come up on every test for at least 2 points.

The issue here is that, like the spinner question, there are really only two choices: either the events are independent or they aren’t.  But you can’t have just two choices, so they threw in another concept, mutually exclusive, which some students may not have been aware of.  So a student who knows how to check for independence might still get this question wrong, picking choice (4) instead of choice (2) on the theory that this mysterious ‘mutually exclusive’ thing is probably part of the answer.  I think a question like this should be a short-answer question where students show their calculation, say whether they think the events are independent, and leave it at that.
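The check itself is one line of arithmetic.  A sketch with hypothetical probabilities (the actual numbers are in the screenshot):

```python
from math import isclose

p_a, p_b, p_a_and_b = 0.4, 0.5, 0.2        # hypothetical values

# Independent exactly when P(A) * P(B) = P(A and B):
print(isclose(p_a * p_b, p_a_and_b))       # True  -> independent

# Mutually exclusive would instead mean P(A and B) = 0:
print(p_a_and_b == 0)                      # False -> not mutually exclusive
```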

[Screenshot: Regents question modeling temperatures with sine curves]

Perhaps you’re thinking right now “This guy’s a grump.  He can probably find fault with everything.”  But you’d be wrong.  There were a few ‘good’ questions on this test and I’d be perfectly happy if all the questions on the test were good.  It is very possible, as I wrote in the previous post, to make a fair test with good questions on this new curriculum.  Teachers do it all the time during the school year for their tests and quizzes.

For this one, there are some mathematical inaccuracies that I don’t mind so much, though I should mention that in reality the B and C should probably be the same number, since the 12-month period of the weather and the ‘shift’ would be the same in an accurate model.  But that’s not my issue with this question.  The problem is with the answer choices.

For choice (1), it is not clear what this ‘average monthly temperature variation’ is.  I guess they are talking about the amplitude of the function, but it’s not at all clear to me (and definitely not to a high school student taking this test) that those two things are related, and how.

For choice (2), again, I have never heard the term ‘midline average monthly temperature’ before.

For choices (3) and (4), I haven’t heard of a ‘maximum average monthly temperature’ either.

This temperature application of sine curves is something that I like.  Maybe they should have just included one city and asked questions about it without using technical jargon that I really think they made up for the test.

[Screenshot: Regents question on properties of exponents]

I’m sorry, but this question is a very stupid way to test whether students know the two properties of exponents, x^a/x^b = x^(a-b) and x^(-a) = 1/x^a.  Having the expression written out in words is also unnecessarily confusing.  This test totals 86 points, so spending 2 of those points, 1/43 of the entire test, on this question makes no sense at all.  They revisit this topic in question 31 anyway, in a free response.
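The two properties are a one-line check on any instance; a symbolic sketch with a hypothetical expression:

```python
from sympy import symbols

x = symbols('x', positive=True)
print(x**2 / x**5)             # x**(-3), by x**a / x**b = x**(a-b)
print(x**2 / x**5 == 1/x**3)   # True,    by x**(-a) = 1/x**a
```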

One last note before ending this post: the big curve that turned a 30% into a 65 was published before the tests were scored.  Whenever in my career I gave a curve, it was because I had somehow messed something up: a typo, or a question that was worded badly enough to cause students to waste time, so they deserved a curve.  But in this case they just made a shoddy test and then compensated with a ridiculous curve.  It would be much more accurate to make a reasonable test with no curve; then you could see whether the students really learned the material.


What Happened To The Math Regents? Part I

When people try to defend the common core, they usually admit that there have been problems, but that the problem was “in the implementation.”  For once, they are right.

The Regents exams have been given since the 1930s and have changed over the years to reflect different priorities in what was considered important for students to know in math.  Most years there were three different math Regents, one for 9th graders — generally Algebra I, one for 10th graders — generally Geometry, and one for 11th graders — generally Algebra II / Trigonometry.  Though over the years the names of the courses have changed and there was even a failed experiment where they tried to make the three tests into two courses each spanning a year and a half, things were pretty steady from the 1930s until very recently.

I know there is a lot to criticize about the common core, especially in the elementary grades.  But the high school recommendations for the common core, at least the official ones from the people who designed them, are fairly vague and not really very different from what the expectations in math were before the common core.

Every state had to ‘interpret’ the common core recommendations, and New York did make some odd decisions about what it felt would qualify as a ‘common core’ state curriculum: which topics to eliminate and which to add.  Even with these changes, a math course post-common core is about 85% to 90% the same as it was pre-common core.

Another aspect of the common core, besides the tinkering that cut some topics and added others, was the creation of ‘common core aligned’ exams, including the new Regents.

As a teacher, I make unit tests all the time, and I take great pride in their quality.  I have a testing ‘philosophy’ when it comes to a math unit test.  I like to have a mix of questions: about 50%, I’d say, are intentionally ‘easy’ (at least for a student who has learned the material); another 35% are ‘medium,’ requiring a longer calculation, more steps, a bit more decision making; and the remaining 15% are ‘hard,’ sometimes requiring students to answer a question they haven’t exactly seen before, which they can still figure out if they really understand the material deeply.  My tests are out of 100 points, and if I make a test properly, there should be no need to ‘curve’ the results.

I’ve also made lots of ‘final’ exams.  For those, my distribution of easy, medium, and hard questions is different.  I don’t have exact numbers on this, but I’m thinking it is about 60% easy and 40% medium, with no deliberately ‘hard’ questions.  I do this because on a final students are studying a full year’s worth of material and have to constantly ‘switch gears’ from one unit to another, which is already tough, so I don’t see the need for the deliberately hard type of question, like when they have to apply something they’ve learned to a new situation.  On a regular unit test I like those in moderation, but not on the final.

Back when I took the math Regents exams as a student in 1983, 1984, and 1985, they were very ‘fair’ tests.  If you knew your math and studied, you were sure to pass, and if you were really proficient, you were likely to score between 90% and 100%.  There was no ‘curve’ on the test, though you were supposed to choose something like 30 out of 35 short-answer questions and 4 out of 7 long-answer questions, so in that way there was a small cushion, but mainly the percent you got correct was your grade on the test.

Starting in the early 2000s with the Math A / Math B experiment, there seemed to be a new philosophy of Regents tests.  For one thing, the tests would be a lot harder.  There are different ways to make a math question ‘hard’: it can have more steps, meaning more opportunity for errors; the numbers can be harder to work with because they involve fractions or decimals; the question can include extraneous, irrelevant information to throw students off; or a basic question can be made more confusing by expressing it in an unfamiliar way.  So the new questions were harder, but to make up for this the tests were ‘curved.’  As time went on, this curve got more and more extreme, until a score of about 30% was curved up to a 65.  Another relevant fact is that the curve is announced before the tests are scored.  So it is not that they thought they had made a fair test and then realized something must be wrong because the pass rate would be too low with a passing score of 65%; they knew in advance that the test would be so difficult it would require such a generous curve.

This is not a sound educational idea, to make a test much harder and then curve it.  It makes the scores less accurate, since two students can both get no credit on a question even though one of them may not know the material at all while the other knows it pretty well but fell for too many of the traps.

I suppose they would say that the harder tests mean they’ve ‘increased expectations’ or ‘increased rigor,’ but if you’re going to curve a 30% up to a 65, it really isn’t increasing expectations; it is just making the results inaccurate.  An unfair test is discouraging to the students who prepared for a test that would accurately assess their skills.  An unfair test is also frustrating for teachers who have been giving fair tests throughout the year, only to have their students do poorly on the state-made final exam.

Now I have my issues with some of the changes to the different math curricula in the transition to the common core, but as far as the Regents exams go, I am certain that it is possible to make a very fair and accurate test with those topics which would not need to be curved.

I don’t know how the Regents exams are made.  I expect there is a committee of question writers, maybe teachers and assistant principals or retired teachers.  There must also be some kind of person ‘in charge’ who takes all the proposed questions, assembles the test, and makes sure that the test ‘flows’ and that there is an appropriate mix of easy, medium, and hard questions.  This year, for the three math tests, this team and its leader failed to create appropriate tests.  In this post I’ll look at Algebra II and maybe look at the other tests another time.  My hope is that whoever is in charge of assembling these teams, especially the leaders, will choose different, more qualified people for next year’s exams.

When a teacher looks at a curriculum for a math course, he or she decides how much time should be devoted to each unit.  Some units might take 5 days to complete, like rational equations, while others take 15, like exponential equations.  A good Regents exam will reflect this by making the number of points on each unit proportional to the amount of time a teacher spends teaching that topic.  Otherwise it is pretty frustrating for students and teachers alike when they spent a month learning and mastering something and it came up in only one 2-point multiple-choice question.

On the June 2017 Algebra II Regents, I think all math teachers would agree that having just 10 points combined (out of 86 points total on the test, or about 12%) on rational equations, radical equations, systems of equations, sequences, and complex numbers was not the proper representation for topics that I’d estimate make up about 25% of the course.  On the other hand, exponential equations were overrepresented, accounting for about 25% of the test though only about 12% of the course.  So I’d say they need to do a better job of allocating points to the different topics based on how long those topics take to teach.  This was not the main problem with this test, but it is something in need of improvement.

The real issue with this test is that the majority of the questions are ‘bad’ questions for one reason or another.  I’m going to analyze some of these questions in this post and then look at the others in future posts, if readers want me to continue.

The first part of the Algebra II Regents is the multiple choice section with 24 questions worth 2 points each for 48 points (out of 86 total for the test).  Of these 24 questions, only six of them were ‘good’ (numbers 1, 6, 7, 8, 11, and 17 for those of you following along with the test at home).  The other 18 had issues with them that made them not ‘Regents worthy’ in my opinion.

[Screenshot: Regents question 2, an exponential equation]

Exponential equations are an important topic in Algebra II.  It actually takes a few weeks to teach all the different things that lead up to a question like this.  The first thing wrong with this question is that it takes too many steps for a multiple-choice question.  One way to fix this would be to remove the +3 from the exponent.  The next issue is the way the choices are phrased.  Besides knowing how to solve the equation, students are also expected to use the change-of-base formula to get something that looks like the answer choices.  They could have made choice (1) say log_2 6 - 3, which would have been a little better.  In my opinion, the answer choices should just be decimals rounded to the nearest hundredth.

Maybe they did it this way because they did not want students to be able to ‘cheat’ by just plugging the four answer choices into the equation to see which one makes the expression evaluate to 48.  But students could do that anyway by converting the four answers into decimals and then testing each one.

So question 2 was flawed because there were too many skills being tested in one 2-point multiple-choice question.  If I were giving a twenty-question test about just this topic, perhaps a question like this would be one of the ‘hard’ ones.  But to pick one question about a topic that takes many days to teach and make it this one instead of a more straightforward question is counterproductive.  You want a question that enables you to distinguish the students who don’t know the topic from those who do.  That’s why you want a less involved question than this.
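For reference, here is the solution as I read the question (assuming the equation was 8(2^(x+3)) = 48, which matches the 48 and the log_2 6 - 3 mentioned above):

```python
from math import log2

# Assuming 8 * 2**(x + 3) = 48:
#   2**(x + 3) = 6  ->  x + 3 = log2(6)  ->  x = log2(6) - 3
x = log2(6) - 3
print(round(x, 2))                # -0.42
print(round(8 * 2**(x + 3), 6))   # 48.0, confirming the solution
```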

[Screenshot: Regents question on powers of i]

Though the math involved in this question is fine, each of the steps being the distributive property followed by simplifying the powers of i, as a math teacher I find this question just ‘weird’ looking.  I’ve rarely seen x variables and the imaginary number i put together in quite this way.  My feeling is that, by convention, the i would go between the coefficient and the variable, but I’m not sure.  Maybe the x should be a z, since it is dealing with complex numbers.  There’s just something ‘off’ about this question.  It’s not that students can’t easily practice it and get it right if a teacher knows it might be on the test, but since I don’t think I’ve ever encountered an expression like this in any math that I’ve done, it would be a pretty forced ‘application’ of complex numbers.  If the goal is to show that students know the powers of i, I think a better question would be one without the x in it, maybe something like (2+3i)(5-2i).
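For what it’s worth, my suggested replacement is quick to check with Python’s built-in complex type (j plays the role of i):

```python
# (2 + 3i)(5 - 2i) = 10 - 4i + 15i - 6i**2 = 16 + 11i
print((2 + 3j) * (5 - 2j))   # (16+11j)
```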

[Screenshot: Regents question on solving f(x) = g(x) graphically]

This is a question that requires a graphing calculator.  They want to see if students know how to solve equations like this by graphing both functions and using the ‘intersect’ feature of their graphing calculators to find the solution.

When they graph the two functions, though, the only answer that seems reasonable is choice (3), since the curves do intersect at (-0.99, 1.96).  But choice (3) is actually not the correct answer here.  The solution to an equation f(x) = g(x) is not an ordered pair, just the x-value.  So if -0.99 were a choice, it would be correct, but since the choice includes the y-coordinate too, it is not considered correct.

[Graph: the two curves in the calculator's default 10-by-10 window]

So what is correct?  Well, students would have to realize that on the 10-by-10 grid the calculator defaults to, there are only two intersections.  But if they zoomed out to a 20-by-20 grid, there would be a third intersection point at (11.29, 32.87), which makes choice (4) look good.  But once again, that wouldn’t be right, so fortunately there is choice (2), which is just the 11.29.

[Graph: zoomed out to a 20-by-20 window, showing the third intersection point]

It was not a good decision to make a question like this and not have the correct answer in the basic 10 by 10 grid.  That’s the first fix I’d make to this question if I were involved.

This question could also have been improved by making all the choices single numbers, with no ordered pairs, since I don’t think teachers spend a lot of time on the philosophical discussion of whether a solution to an equation can be an ordered pair or just a number.  A discussion like that would be a bit too esoteric and boring for the vast majority of students.

Another option is to make the question ask which is not a solution and have the x-coordinate of the three intersection points plus one number that is not a solution.
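The underlying idea, that solutions to f(x) = g(x) are the x-values where the graphs cross, and that you may have to look beyond the default window to find them all, can be sketched numerically.  The functions here are stand-ins (the actual ones are in the screenshot):

```python
import numpy as np
from scipy.optimize import brentq

f = lambda x: 0.25 * x**2     # stand-in functions
g = lambda x: 3 * x + 3
h = lambda x: f(x) - g(x)     # roots of h are solutions of f(x) = g(x)

# Scan wider than the calculator's default [-10, 10] window so that
# intersections outside it aren't missed:
xs = np.linspace(-20, 20, 4001)
for a, b in zip(xs, xs[1:]):
    if h(a) * h(b) < 0:                    # sign change -> root in [a, b]
        print(round(brentq(h, a, b), 2))   # -0.93 and 12.93
```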

OK, that’s all I’ve got in me for now.  I could keep doing this kind of analysis (let me know if you think I should) and show how about two thirds of the questions on this test reveal that the team that created it lacked an understanding of what the purpose of a final exam is, what sorts of questions are appropriate for a test like this, what sorts of answer choices are appropriate, and, generally, what makes a high-quality test.

The issues with this test actually have nothing to do with the ‘common core.’  Very good tests could be created based on the common core curriculum for Algebra II.  Teachers who taught this course created good tests on it throughout the year.  The Regents should be a test that students who know their stuff can pass without a huge curve compensating for the low quality of the test questions.  Those who are paid to create these tests need to take this responsibility more seriously, or more competent people should be found to make them.


Does KIPP Deserve To Expand In Philly?

Charter schools are a large component of the modern education reform movement.  In theory, the charters will get better results because with the added autonomy they get, there is also added accountability.  A charter school can lose its charter for poor performance.  ‘High performing’ charters with a proven track record will also have the opportunity to expand while ‘low performing’ charters will not have their charters renewed.  This survival of the fittest as charter schools compete with each other and with the public schools will, supposedly, cause the school system to evolve more quickly than the incremental improvements of the traditional school system.

KIPP is one of the largest and best-known charter networks in the country.  There are currently five KIPP schools in Philadelphia: two elementary schools, two middle schools, and one high school.

An article recently ran on the Philadelphia education website The Notebook about KIPP’s applications to expand, and about how some in the community are skeptical because of the poor performance of the KIPP schools.

I looked at the recent school report card for their one and only high school, KIPP DuBois High School.  Though Philadelphia doesn’t use letter grades, it has six levels with different symbols that are essentially an A-to-F scale.  That school got the lowest possible rating, essentially an F.  Not only were its test scores low, but it also got the lowest possible rating in ‘growth’ in math and reading, in other words the value-added measure that reformers claim to take very seriously.

But of course KIPP has a response to this, from the article:

KIPP’s CEO, Marc Mannella, has acknowledged publicly that some of its academic indicators have been disappointing. But KIPP officials cite evidence that its schools have had success in steering students to college.

Specifically, they say, the first 8th-grade class from KIPP’s original middle school, which graduated in 2007, boasted 35 percent of students obtaining four-year college degrees 10 years later, compared to a 9 percent rate for low-income students nationally. The result is for a cohort of 35 graduates.

I hear this a lot from KIPP with their ‘to-and-through’ college initiative.  Years ago I saw this ad on the New York City Subway.

[Photo: KIPP subway ad]

This is quite a promise, since many students who begin at KIPP don’t even make it through 8th grade there.  But here they are making a promise ‘to see each child to and through college,’ which seems like some kind of false advertising; I don’t know exactly what the laws are about this.

KIPP often boasts that for low-income students, only 9% graduate college, while in some of their networks they have as much as 40%.  Generally this 40% does not account for students who did not complete 8th grade there, so it is not an equivalent comparison.

A few years ago I had an interesting exchange about this with Richard Barth, the CEO of the KIPP Foundation:

[Screenshot: Twitter exchange with Richard Barth]

So clearly they aren’t interested in having a true debate using data.

So when it comes to Philadelphia with all the poor results from KIPP, the officials there have come up with one statistic that proves that they succeed at getting their students to and through college:  “the first 8th-grade class from KIPP’s original middle school, which graduated in 2007, boasted 35 percent of students obtaining four-year college degrees 10 years later, compared to a 9 percent rate for low-income students nationally. The result is for a cohort of 35 graduates.”

The first thing that is unusual is that the cohort, it says, was just 35 students.  That is not the normal size of a cohort.  So 35% of those 35 students, about 12 students, have graduated from college, and since 35% is about four times 9%, that seems like an amazing thing.

But Pennsylvania has very good public data, so it did not take me long to find out that the KIPP school that they say had 35 8th graders in 2007 actually had just 33.  The year before, that cohort had 55 7th graders.  The year before that, it had 77 6th graders.  And the year before that, it had 86 5th graders.  So 86 5th graders became 33 8th graders, an attrition rate of 62%.  And those 12 students who graduated college were not 35% of 35 but just 14% of the 86 students who actually entered the school.  Suddenly their miracle statistic, the only one they could come up with to counter their poor academic results, isn’t so miraculous.
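The arithmetic, using the Pennsylvania enrollment numbers above:

```python
cohort = {"5th": 86, "6th": 77, "7th": 55, "8th": 33}   # PA enrollment data

attrition = 1 - cohort["8th"] / cohort["5th"]
print(f"attrition: {attrition:.0%}")                    # 62%

college_grads = round(0.35 * cohort["8th"])             # "35%" of 33 is ~12 students
print(f"true rate: {college_grads / cohort['5th']:.0%}")  # 14%, not 35%
```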

It sounds like KIPP wants to double its number of schools in Philadelphia from 5 to 10 and almost triple the number of students it serves.  My hope is that the charter authorizers will realize that even the one seemingly bright spot in KIPP’s data is nothing more than a lie generated by KIPP’s PR department.
