The preliminary findings of a new study of KIPP schools were released this week by
Mathematica Policy Research, the company contracted by KIPP, Inc. to conduct this longitudinal examination of KIPP’s effects on student test scores.
To no one’s surprise, these new findings show KIPP students have higher test scores than students from a matched group of public schools. In fact,
. . .the black-white test score gap in math is typically estimated as approximately one standard deviation at fourth grade and eighth grade (Bloom et al. 2008). Half of these KIPP schools are producing impacts large enough to cut that gap in half within three years (p. xv).
So what does this study tell us that we did not already know, since KIPP’s high test scores have been acknowledged even by its harshest critics?
We find out that KIPP schools have higher levels of grade repetition, i.e., failures, than the public schools. In the 22 fifth grade cohorts, for instance, the average failure rate was 9.5 percent, ranging from as low as 2 percent to as high as 18 percent. In the public schools, repeaters in 5th grade ranged from 0 to 3 percent, with an average of 1.7 percent. In 6th grade KIPPs, these numbers were slightly lower but still much higher than their public counterparts.
We find out that KIPPs are more segregated than demographically matched public schools, ranging from 5 to 50 percent more segregated. Twenty of the 22 KIPPs were significantly more segregated (pp. 2-3).
We find out that 12 of 22 schools had significantly lower percentages of special education students, with only one having a significantly higher percentage.
We find out that 13 of 17 schools had significantly lower percentages of English language learners, with only two having higher percentages (pp. 12-13).
We find out, too, that this study appears to have attempted to account for high attrition rates among low-performing students, rates that have been central to the criticism of the KIPP model. This study makes the case, in fact, that KIPP attrition rates, as measured by transfers among low performers, are in line with those of the public schools:
. . . Students who transfer within-district tend to have lower baseline test scores than students who do not transfer at all. For KIPP, the baseline scores of students transferring in-district were significantly lower at 12 schools (in at least one subject); none of the KIPP schools recorded higher baseline scores for students transferring in-district. The pattern at non-KIPP schools was even more pronounced: compared to those who do not transfer, students transferring in-district had baseline scores that were significantly lower in at least one subject in all 22 sites (p. 16).
What we don’t know, however, is whether the 12 KIPP schools with significantly lower baseline scores mentioned in the above quote are some or all of the “approximately half of KIPP schools [that] attract students with significantly lower baseline test scores than the districtwide average” (p. 14). In other words, are the KIPPs with higher attrition rates the same KIPPs that enroll higher percentages of lower-performing students at baseline? This we do not know, for the researchers are mute on the question.
What we do know from this study is that KIPP is not alone in dumping its low performers. All 22 of the public schools used in the comparisons, in fact, are playing a game of swapouts, where low-performing students who might threaten the schools’ AYP scores are moved from school to school in a never-ending shell game of the left behind.
What Gary Miron astutely points out in his early critique of this study is that the KIPPs, unlike the public schools, are immune from this loser swapout game played in the publics, in that KIPPs do not accept just any student who shows up in the outer office:
The KIPP study's description of attrition only considers half the equation, when comparing KIPP schools to matched traditional public schools. The researchers looked at the attrition rates, which they found to be similar - in the sense of the number of students departing from schools. But they never considered the receiving or intake rate. Even though the researchers agree that the students who are mobile are lower performing, they do not take into account the reality that KIPP schools do not generally receive these students.
Professor Miron conducted his own quick analysis, using the Common Core of Data, and concluded that there is a 19% drop in enrollment in KIPP schools between grades 6 and 7 and a 24% drop in enrollment between grades 7 and 8. (This analysis included only KIPP schools that had enrollments in all three grades.) In comparison, traditional public schools in these grades maintain the same enrollment from year to year.
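For readers who want to see the arithmetic behind such cohort-drop figures, here is a minimal sketch, assuming enrollment counts tabulated by school, year, and grade. The school names and numbers below are hypothetical placeholders, not data from the study or from Miron’s analysis.

```python
# A minimal sketch of the cohort-drop arithmetic described above: compare a
# cohort's enrollment in one grade with that same cohort's enrollment in the
# next grade a year later. All names and counts here are hypothetical.

# enrollment[school][(year, grade)] = number of students enrolled
enrollment = {
    "KIPP School A": {(2008, 6): 90, (2009, 7): 73, (2010, 8): 55},
    "KIPP School B": {(2008, 6): 80, (2009, 7): 66, (2010, 8): 50},
}

def cohort_change(counts, year, grade):
    """Percent change in cohort size from (year, grade) to (year + 1, grade + 1)."""
    start = counts[(year, grade)]
    end = counts[(year + 1, grade + 1)]
    return 100.0 * (end - start) / start

for school, counts in enrollment.items():
    print(
        f"{school}: grade 6 to 7 change {cohort_change(counts, 2008, 6):.0f}%, "
        f"grade 7 to 8 change {cohort_change(counts, 2009, 7):.0f}%"
    )
```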
This one factor must represent an important difference when comparing the KIPPs’ performance with the publics’, but how much of a difference we don’t know. Again, the researchers are mute on this one, too.
One other potential problem that pops out in this study is the parental consent rate at the various schools. The average consent rate for all participating schools was 71 percent, but in some schools it was as low as 37 percent. Did these schools solicit participation from all parents, or were some parents more encouraged to participate than others? With the stringent contractual obligations that parents sign off on, and with the total compliance that KIPP demands of students and parents, it is, indeed, odd that consent rates vary so widely (p. 40). Again, the researchers do not speak to these variations.
So KIPP schools have high test scores, and no one in the media seems to care about anything else that happens, or doesn’t happen, at the KIPPs. Meanwhile, the corporate reformers are pouring in hundreds of millions to perfect a total compliance model that the general public will buy for urban America. And it does not matter if disenfranchised children are having their childhoods stolen away and their minds altered for the benefit of demonstrating that poor children can be manhandled into responding on a test like middle-class children.
Most everyone seems content to let these neo-eugenics methods have their way with children whose poverty is not considered in any intervention or treatment approved by KIPP, or by any of the wannabe knock-off chain gangs modeled on total compliance and offered up to replace urban public schools. The following is a short list of factors that are not mentioned in this new study, factors that no one seems to think are very important to talk about:
- KIPP students attend school approximately 50 percent longer during the school year than public school students;
- KIPP students have 9- to 10-hour days and 2 to 3 hours of homework, plus school on Saturdays;
- KIPP students become part of a harsh total compliance organization through a 3-week summer program of indoctrination referred to as KIPP-notizing;
- KIPP classrooms are without distractions, since offenders are segregated or offered the opportunity to choose another school;
- KIPP schools have an unrelenting, laser-beam focus on improving test scores;
- KIPP students are regularly subjected to a regimen of “positive psychology” that combines alternating treatments of learned optimism and learned helplessness in order to instill a sense of individual responsibility and unerring behavioral control.