
What made reading and writing SOLs so hard? The questions, or how they were presented?
Thursday, August 22, 2013
The latest catchphrase driving education policies across the nation and here in Virginia is “increased rigor.” It is the force behind harder end-of-term Standards of Learning exams that Virginia has introduced in science, math and now reading and writing.
Children, even the youngest among them, are expected to be on the path to college or career readiness. To get them there, they are required to know more, to express their knowledge beyond the standard multiple-choice test, and to do so on computers. The result, as could be predicted, is that test scores plummet the first year that new, more rigorous exams are administered. Case in point: math.
Two years ago, SOL math scores across the state and across grade levels plummeted. Some school districts complained they weren’t prepared for the more difficult exams; had they known what to expect, they would have taught their students better test-taking skills. Others didn’t blame the test, but rather their own approach to teaching math, finding it inadequate.
Regardless, schools made adjustments, and the results are evident in the SOL scores released this week by the Virginia Department of Education. The pass rates are rising.
Pass rates, though, for reading dropped dramatically — by double digits for elementary and middle school students — and could place many schools at risk of losing accreditation. Because the students, for the first time, were required to take the reading and writing SOL exams on a computer — with so-called technology-enhanced questions, some with multiple “right” answers — it is difficult to determine how much of the failure is due to the difficulty of the test and how much to not being up to the “increased rigor” of what was being tested.
In Roanoke, city educators were stunned by the test — which they claim kept changing, preventing them from effectively teaching test-taking skills — and the difficulty students had taking it. They determined that students with home computers did fine, while those without struggled miserably.
It’s hard to take a writing exam on a computer if you can’t type. An effort is under way to get used, donated laptops into the hands of children who lack a computer. Just one more belated adjustment in trying to catch up with changing expectations.
State officials, too, should consider whether the format of the new technology-enhanced tests deserves a passing grade before penalizing schools. Because subject rigor and testing rigor were increased at the same time, it is hard to sift through this year’s results.