Today, tens of thousands of temporary scorers are employed to correct essay questions. This year, Maple Grove-based Data Recognition Corporation will take on 4,000 temporary scorers, Questar Assessment will hire 1,000, and Pearson will take on thousands more. From March through May, hundreds of thousands of standardized test essays will pour into the Twin Cities to be scored by summer.

There are several examples (all negative) that show how the grading 'rubric' is applied when scoring essays:
The boom in testing has come with several notable catastrophes. The most famous happened in 2000, when NCS Pearson incorrectly failed 8,000 Minnesota students on a math test. Pearson shelled out a $7 million settlement to the students, and Gov. Jesse Ventura participated in a makeup graduation for students who were wrongly denied their diplomas. In 2010, Pearson again mis-scored two questions on Minnesota's fifth- and eighth-grade tests. Delays in its Florida scoring resulted in a $3 million fine, and glitches in Wyoming led the company to offer a $5.8 million settlement.
But while a mistake on a bubble form is a black-and-white problem, few scandals have broken on the essay side of the test-scoring business.
"It requires human judgment," says Michael Rodriguez, of the University of Minnesota's educational psychology department. "There is no way to standardize that."
Now scorers from local companies are drawing back the curtain on the clandestine business of grading student essays, a process they say goes too fast; relies on cheap, inexperienced labor; and does not accurately assess student learning.
Her first project was from Arkansas, an essay written by eighth-graders on the topic, "A fun thing to do in my town."

The article peters out at the end, but the comments section included the following, more balanced response from JWHyperion (maybe he works in accounting):
And that's where the troubles began.
Suddenly, she was being asked to crank through 200 real essays in a day. The scanned papers popped up on the screen and her eyes flitted as fast as they could down the lines. The difference between "excellent" and "good" and "adequate" was decided in a matter of seconds, to say nothing of the responses that were simply off the reservation. How do you score a kid who rails that his town sucks? What about an exceptionally well-written essay on why the student was refusing to answer the question?
All over the room, the teachers were raising their hands and disputing the rubric. Indovino preferred to keep her head down and just score the way she was told to.
When it comes to essay testing, there will always be controversy, just as there will always be differing opinions in the classroom when a teacher gives you a "C" on an essay and you KNOW you should have gotten an "A". There are things called human error and human subjectivity. I agree that essay testing for a standardized test has many inherent problems. Again, however, know that in most states the teachers write these questions and create the rubrics for how they should be scored. Also, most importantly, people should know that the essay portion of the state assessment does not dictate whether a student passes or fails. A student's overall score is determined by both MC questions and the essay question, with the emphasis in scoring on the MC questions. Not all states include essay questions as part of their assessment. Many use a short answer response that asks students to elaborate upon information gained from the text while bringing in their own experiences and opinions. In this case, if a student writes on a completely different topic they receive a zero, just as they would on a math test if they added two numbers incorrectly.
There are of course inherent flaws in the system just as there are inherent flaws in most every system. Instead of just complaining, though, it would be nice if the detractors could use some of their effort to help come up with a solution. As my principal always said, "Don't come to me with a problem unless you have a proposed solution."