Here is a little corrective from Jerry Bracey, posted on the Assessment Reform Network listserv:
From: gerald bracey
Date: 4/28/2009 12:41:29 PM
To: Kelly Flynn
Subject: Re: NAEP?
Ms. Flynn,
You say that you are not a statistician. Clearly, neither is Mr. Kress. In fact, his own statement disproves his conclusion. He points to "big gains over the past 16 years." That cannot be a typo for "6 years" because there is no six-year period--the span from NCLB's enactment to the time of the latest testing--in the data. So he's giving NCLB credit for improving scores over a full 10 years when it did not exist.
Looking at the reading data, I see a one-point--one point!--gain for 17-year-olds between 2004 and 2008, a score that leaves this age group 4 points below its highest scores, which occurred in the late '80s and early '90s.
I see a one-point--one point!--gain for 13-year-olds, leaving them with precisely the same score as they had in 1992.
I see a one-point--one point!--gain for 9-year-olds. This is an all-time high, true. But most of the gain occurred between 1999 and 2004. As I have pointed out, NCLB came into existence only in 2002 and experienced great implementation chaos early on--most states didn't even have their plans in to the USDOE for the 2002-2003 school year, leaving only the fall of 2003 for NCLB to work its magic. Thus, most of the gains would have occurred during the Clinton administration, pre-NCLB.
Looking at math, I see a one-point DECLINE for 17-year-olds from 2004 to 2008. This leaves them two points behind their all-time high in 1999.
Thirteen-year-olds show no change from 2004. Note that, except for the extrapolated period between 1973 and 1978, scores for 13-year-olds had been rising since 1978.
Nine-year-olds show a 2-point gain. Again an all-time high, but 9-year-olds' math scores had been rising since 1982 and had risen 13 points by 1999. A jump occurs from 1999 to 2004, again mostly during the Clinton years, for the reasons given above.
Bob Linn once observed that, given the existing NAEP gains, it would take 166 years for 100% of the students to reach proficiency. We can see from the most recent data that Bob underestimated the duration needed.
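As a back-of-the-envelope sketch of the kind of straight-line extrapolation Linn was describing, here is a toy calculation. The starting percentage and annual gain below are invented placeholders chosen only so the answer lands near his 166-year figure; they are not actual NAEP numbers.

```python
# Toy straight-line extrapolation in the spirit of Linn's estimate.
# Both inputs are hypothetical placeholders, not real NAEP figures.
pct_proficient_now = 30.0   # assumed current percent at or above "proficient"
gain_per_year = 0.42        # assumed average gain, in percentage points per year

years_to_100 = (100.0 - pct_proficient_now) / gain_per_year
print(f"At {gain_per_year} points per year, 100% proficiency is about {years_to_100:.0f} years away")
```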
It must also be remembered that while it is currently de rigueur to demean state standards, states had been raising their standards for some time, beginning shortly after the appearance of A Nation At Risk. Such efforts make it tricky indeed to attribute any gains to NCLB.
My statements are all predicated on data from the original format, but the conclusions would hold even if we used the revised format--the gains are tiny over the period considered. They are especially disappointing given the attention paid to NCLB by the feds, the states, and the media. I don't know why the USDOE didn't shift to revised-format reporting in 2004. Probably because it would have completely discredited Secretary Spellings' hyperbolic claims for the efficacy of NCLB at the time. Until I see some technical reports from disinterested parties, I'm treating it as statistical sleight of hand.
Actually, the national NAEP data are not all that meaningful because of the changing demographics of the nation. In 1975, 80% of 9-year-olds taking the test were white. In 2008, it was 58%. This leaves NAEP vulnerable to Simpson's paradox, in which the whole group shows one pattern and the subgroups a different one. The only ethnic subgroup to show stability is 17-year-old whites. All the others are up, some a lot, but they all started their rise well before they had to suffer under NCLB.
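For readers who have not met Simpson's paradox, a minimal made-up example shows the mechanism: every subgroup's average can rise while the overall average stays essentially flat, simply because the mix of test takers shifts toward subgroups that score lower. The means and shares below are invented, not NAEP results.

```python
# Hypothetical illustration of Simpson's paradox under a changing demographic mix.
# Subgroup means and population shares are invented for the example, not NAEP data.
groups_1975 = {"A": (285, 0.80), "B": (250, 0.20)}  # (mean score, share of test takers)
groups_2008 = {"A": (290, 0.58), "B": (262, 0.42)}  # each subgroup's mean has risen

def overall_mean(groups):
    # Weighted average of subgroup means by population share.
    return sum(mean * share for mean, share in groups.values())

print(f"{overall_mean(groups_1975):.2f}")  # 278.00
print(f"{overall_mean(groups_2008):.2f}")  # 278.24 -- nearly flat despite gains in both subgroups
```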
Incidentally, NAEP can be gamed. The easiest way is to adjust the percentage of students excluded from the sample for a variety of reasons.
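A hypothetical sketch of that exclusion effect: drop a slice of the lowest scorers from the sample and the reported mean rises even though no student has learned anything more. The scores below are invented.

```python
import statistics

# Hypothetical scores; none of these are real NAEP results.
scores = [180, 195, 210, 225, 240, 255, 270, 285, 300, 315]

all_students = scores
after_exclusions = [s for s in scores if s >= 200]  # exclude the lowest scorers

print(statistics.mean(all_students))      # 247.5
print(statistics.mean(after_exclusions))  # 262.5 -- higher, with no real learning gain
```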
Please feel free to share this in its entirety with Mr. Kress or, if you prefer, provide me an email and I will forward it.
Jerry Bracey