Quick Takes: Automated Essay Grading Software

Automated Scoring More Efficient Than Human Grading

Researchers Mark D. Shermis and Ben Hamner of the University of Akron announced earlier this month that there was “no significant difference” between grades given to essays by human graders and machine grading software. Critics argue that such grading methods would encourage students to “game the system” with formulaic writing, but the design of updated software such as Vantage Learning’s IntelliMetric makes this unlikely. Most importantly, machine grading of essays allows students to receive quicker feedback, which can be crucial to learning.

Automated essay scoring technology has progressed tremendously in recent years. Older systems like the Bayesian Essay Test Scoring System relied on simple metrics such as grammatical correctness and sentence-level diversity. Such simple systems are easy to “game,” but newer software like Carnegie Mellon’s LightSIDE and McGraw-Hill’s Bookette takes a far more organic approach to grading. Instead of checking essays against a rigid list of markers that “good” writing must contain, these programs are “trained” on a large corpus of human-scored sample essays and learn to mimic the grading patterns of the human scorers.
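To make the “training” idea concrete, the following minimal Python sketch fits a simple statistical model to human-scored sample essays and then scores a new essay by analogy. The essays, scores and model choice (TF-IDF features with ridge regression) are illustrative assumptions, not the actual internals of LightSIDE or Bookette.

# A minimal sketch of the "train on human-scored essays" approach.
# This is NOT any vendor's actual pipeline; the data and features here
# are hypothetical placeholders for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Hypothetical training data: essays already scored by human graders.
essays = [
    "First sample essay ...",
    "Second sample essay ...",
    "Third sample essay ...",
]
human_scores = [4.0, 2.5, 3.0]

# Learn whatever textual patterns correlate with the human scores,
# rather than checking a fixed list of "good writing" rules.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
model.fit(essays, human_scores)

# A new essay is scored by its resemblance to the graded samples.
print(model.predict(["An unseen student essay ..."]))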

Human graders in public high schools take three weeks on average to grade student papers, according to an informal study conducted by The Paper Graders, a website operated by a group of high school English teachers. Machine graders take minutes. A 1995 study conducted by the publishing company Cengage Learning found that students who receive immediate feedback on their writing improve significantly faster than those who do not. 

These systems aren’t merely as good as their human counterparts — they’re better. Because these programs are significantly faster than humans, teachers and students risk very little in adopting them, yet have a great deal to gain.

Ayan Kusari
Staff Writer

Computerized Grading Programs Allow for Cheating 

Writers, be alarmed: your papers may soon be judged by computers. A recent University of Akron study found that automated essay scoring software awarded essentially the same scores as trained human graders, a finding that may encourage schools to adopt such software. However, computerized scoring poses problems of validity and should not be treated as an adequate replacement for human graders.

Les Perelman, director of the Writing Across the Curriculum program at MIT, has successfully fooled Educational Testing Service’s e-rater, which has been used to grade the GRE and the Collegiate Learning Assessment, into giving high scores to unintelligible essays. Programs like e-rater score essays by weighing linguistic features such as diction complexity against the proportion of grammar and usage errors. Such programs attempt to circumvent cheating by “training” the computer on sample essays so that it knows what to look for, but the fact of the matter is that a computer is a computer. Just as Stanford statistics Ph.D. Joan R. Ginther hit the jackpot four times by figuring out the algorithm behind the lottery, determined students can figure out how to “beat” the system by incorporating words or structures that they know the computer will reward with points.
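To see why such feature weighting is gameable, consider the toy Python sketch below. The weights, the rewarded word list and the scoring formula are invented for illustration and bear no relation to e-rater’s actual model.

# A toy illustration of feature-weighted scoring: reward complex diction,
# penalize detected errors. Everything here is hypothetical.
AWARD_WORDS = {"plethora", "paradigm", "egregious", "myriad"}  # invented list

def toy_score(essay: str, detected_errors: int) -> float:
    words = essay.lower().split()
    if not words:
        return 0.0
    # Crude stand-ins for "diction complexity" features.
    avg_word_length = sum(len(w) for w in words) / len(words)
    fancy_ratio = sum(w.strip(".,") in AWARD_WORDS for w in words) / len(words)
    error_rate = detected_errors / len(words)
    # Weighted combination: surface features raise the score, errors lower it.
    return 2.0 * avg_word_length + 50.0 * fancy_ratio - 40.0 * error_rate

gibberish = "A myriad plethora of egregious paradigm shifts paradigm myriad."
plain = "The essay makes one clear point and supports it with evidence."
print(toy_score(gibberish, detected_errors=0))  # higher score
print(toy_score(plain, detected_errors=0))      # lower score

Because the score depends only on surface features, the vacuous, word-stuffed essay outscores the plain, coherent one.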

While objective and consistent, computers cannot evaluate abstract qualities such as clarity, creativity, implied meaning and the ability to communicate with a designated audience. Furthermore, automated methods may be swayed by easily manipulable features that inflate scores, or may fail to recognize features that exemplify good writing mechanics.

Automated substitutes may be appropriate for scoring mathematically based subjects, but they are best left out of the multifaceted realm of writing.

Hilary Lee
Associate Opinion Editor
 
