Stephen Krashen
The House of Representatives has recommended cutting Striving Readers, a literacy program aimed at adolescents. I think this is a good idea. Striving Readers costs $200 million per year and has not produced impressive results.
To get an idea of the effectiveness of Striving Readers, take a look at a 239-page document, Summary of 2006 Striving Readers Projects, by Abt, submitted to the US Department of Education. A full description of the programs used is presented starting on page 224. The programs are clearly skill-based, with only a brief mention of actual reading. There is variation among the specific programs, but they all emphasize direct instruction, including phonics, vocabulary instruction, and instruction in comprehension strategies. There appears to be no awareness of the possibility that a great deal of phonics, vocabulary, and mastery of strategies emerges as a result of reading.
It is a mystery to me why this program has lasted so long with such poor results. It is also a mystery why there is not more support for libraries: the House also wants to cut the Improving Literacy Through School Libraries program, which supports libraries in high-poverty areas and costs only $19 million per year. In contrast to Striving Readers, the research showing the positive impact of libraries on literacy development is consistent and impressive.
The $200 million yearly cost of Striving Readers would be far better spent investing in libraries. Studies consistently show that children of poverty have very little access to books, and that increasing access increases reading, which in turn improves literacy development. The current level of funding, $19 million, amounts to less than $1.50 per child in poverty. Two hundred million would mean about $15 per child per year, or roughly one book per year per child. This could rapidly close the gap, giving all children a chance to do extensive reading.
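The per-child figures above imply a population of roughly 13 million children in poverty; that number is my assumption, inferred from "less than $1.50 per child," not stated in the post. A quick sketch of the arithmetic:

```python
# Back-of-envelope check of the library-funding arithmetic.
# children_in_poverty (~13 million) is an assumption consistent with
# "$19 million amounts to less than $1.50 per child in poverty."
current_funding = 19_000_000       # Improving Literacy Through School Libraries
proposed_funding = 200_000_000     # Striving Readers budget, redirected
children_in_poverty = 13_000_000   # assumed, not from the post

per_child_now = current_funding / children_in_poverty
per_child_proposed = proposed_funding / children_in_poverty

print(f"Current: ${per_child_now:.2f} per child per year")
print(f"Redirected: ${per_child_proposed:.2f} per child per year")
```

At roughly $15 per paperback, the redirected budget buys about one book per child per year.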
I present the results for Striving Readers programs in several different locations, in the report's own words. The measure used was "effect size," which quantifies the size of the impact of a treatment; an effect size of .2 or less is considered small. The results are dismal. The differences between Striving Readers and comparison groups were quite small, with effect sizes close to zero, and typically not statistically significant.
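For readers unfamiliar with the measure: the most common effect size statistic (Cohen's d) is the difference between treatment and comparison means divided by a pooled standard deviation, so it expresses the group difference in standard-deviation units. A minimal sketch, using fabricated reading scores purely for illustration (the report's own computation may differ in detail):

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Effect size: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_sd = (((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative (made-up) test scores, not data from the Abt report:
treatment = [52, 55, 49, 58, 53, 51]
control   = [50, 54, 48, 57, 52, 50]
d = cohens_d(treatment, control)  # group difference in SD units
```

By the usual convention, values of d at or below .2, like most of the results quoted below, count as small effects.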
Results for Chicago: (pp. 167-8) (tier 1 = highest readers, tier 3 = struggling readers).
"After one year of intervention, there were no significant impacts on the reading achievement of grade 6 struggling readers (those assigned to Tiers 2 and 3). The non-significant effect sizes for Tier 2 and Tier 3 were .08, for both.
After two years of intervention, there were no significant impacts on the reading achievement of grade 6 struggling readers (those assigned to Tiers 2 and 3). The non-significant effect sizes for Tier 2 and Tier 3 was .04 and .08, respectively."
Danville (p. 175)
"There was a significant impact of the targeted intervention on reading achievement of students in grade 9 who received one year of treatment, with effect size of .15. There were no significant impacts of the targeted intervention on the reading achievement of students in grade 6 who received one year of the treatment, with effect sizes of.08."
Memphis (used READ 180) (p. 182)
"There were no statistically significant impacts on the reading achievement of struggling readers in grades 6-8 after one year of exposure to READ 180, with effect sizes of .05 on ITBS, and .04 on TCAP.
There were no statistically significant impacts on the reading achievement of struggling readers in grade 6-8 after two years of exposure to READ 180, with effect sizes of .01 on ITBS, and .05 on TCAP."
Newark (READ 180) (p. 189)
"For treatment students who had one year of READ 180, there were no significant effects on any of the three subtests of the Stanford Achievement Test. The effect sizes for the three subtests, vocabulary, comprehension, and language arts, were .09,.10, and .07, respectively.
For treatment students who had two years of READ 180 there were significant effects on two of the three subtests of the Stanford Achievement Test. The effect sizes for vocabulary and comprehension were .09 and .17, respectively. No significant effects were found on the language arts subtest; the effect size was .10."
Ohio (READ 180) (pp. 197-198)
"There was a significant impact of one year of READ 180 on grade 9-12 student reading scores on the SRI assessment. The effect size was .17. There was no significant impact of one year of READ 180 on grade 9-12 student reading scores on the California Achievement Test. The effect size was .08."
Portland (Xtreme Reading) (p. 204):
"There was a significant impact of one year of Xtreme Reading on the reading achievement of grade 7 and 8 students on the GRADE and on the Oregon State Assessment Test. The effect sizes of the impacts were .27 and .11, respectively. There were no significant impacts of one year of treatment on the reading scores of grade 9 and 10 students; on the GRADE, the effect size was .09, and on the Oregon State Assessment Test, the effect size was -.01."
San Diego (p. 213):
"After one year of intervention, there were no significant impacts on the reading achievement of grade 7 and 8 or grade 9 and 10 struggling readers. The effect sizes were.04 and .05, respectively, on the California Standards Test. The effect sizes were .12 and .05, respectively, on the DRP.
After two years of intervention, there were no significant impacts on the reading achievement of grade 7 and 8 or grade 9 and 10 struggling readers. The effect sizes were .08 and -.01, respectively, on the CST. The effect sizes were .09 and .00, respectively, on the DRP."
Springfield and Chicopee: (Xtreme Reading or READ 180) (p. 222)
"After one year of implementation, READ 180 had statistically significant impacts on students reading scores at the end of grade 9. The effect size was .20. Xtreme Reading had no statistically significant impacts on student reading scores at the end of grade 9 after one year of implementation. The effect size was .04."