Recent data suggest a significant drop in students' ACT essay scores after ACT modified the essay section and its rubric last fall. After several weeks of speculation but no concrete conclusions, students and counselors alike are complaining that ACT has given no explanation for the disappointingly low essay scores.
In an effort to respond to the multitude of concerns from parents, students, and professionals, ACT published an official explanation of their essay scoring. You can view it here. We highlighted the most important points below for your convenience.
On average, a student's writing score is 3 points lower than his or her score in English or reading. ACT insists this is neither unexpected nor an indication of a problem with the new test. They urge students to avoid comparing raw section scores and instead look at percentiles, which compare a student to all of the students who took the ACT essay.
Take a look at the table below which illustrates the differences in ACT section scores at the same percentile.
You'll see that the writing scores at the 95th percentile are lower than the scores in the other four sections. Why does this occur? According to ACT, each section is "designed to maintain score comparability" across different test dates but not across different sections. Essentially, the tests are designed so that a score of 25 in math means the same thing whether you take the test in February or June. But they are not designed so that a 25 in ACT math means the same thing as a 25 in ACT science or reading.

Because the ACT measures five distinct academic skills (English, math, reading, science, and writing), it may feel natural to compare strengths and weaknesses by looking at a student's highest and lowest section scores. And students may be more inclined to compare their writing scores with other section scores now that the new writing section is scored on the same 1-36 scale as the rest (instead of the old 2-12 scale). But ACT again says this isn't a valid comparison. What is valid? Percentiles, because they stay consistent from section to section.
Every administered ACT includes some level of imprecision: a student's performance can vary with the day, how the student feels, and so on. But on a standardized test this variation is predictable. After years of testing students, ACT knows that a student's English, math, reading, or science score can each vary 2 points up or down. So a student who scores a 25 on English might earn a 27 on a great day but only a 23 on a bad one. On the new writing section, though, the swing is twice as wide: the score can vary 4 points up or down. The same student who scores a 25 on writing may go as high as 29 on a good day or as low as 21 on a bad one.
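The arithmetic above can be sketched in a few lines. This is purely an illustration of the score bands ACT describes, not an official ACT tool; the function name and the 1-36 clamping are our own assumptions.

```python
def score_band(score, variation, lo=1, hi=36):
    """Range a reported ACT score could plausibly span, clamped to the
    1-36 scale. `variation` is the up-or-down swing ACT describes:
    roughly 2 points for English/math/reading/science, 4 for writing."""
    return (max(lo, score - variation), min(hi, score + variation))

# The same reported 25 implies very different bands by section:
print(score_band(25, 2))  # math/English/etc.: (23, 27)
print(score_band(25, 4))  # writing: (21, 29)
```

The wider band is why a writing score is a much fuzzier signal than a multiple-choice section score at the same number.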
ACT attributes this greater variance on the writing test to its format: "it is a single task, evaluated by raters using a 6-point interval scale, while other ACT tests are comprised of 40-75 questions." And they are right. A student gets one prompt, and thus one shot, on the essay, whereas the reading section offers four different passages that let a student "reset" after being thrown off. ArborBridge also speculates that because a student can earn only 6 points from any individual reader on any given metric of the new essay, a difference of 1 point on a single metric can compound. Moreover, remember that the writing section is the only section scored by humans based on impressions; there's no clear right or wrong answer, which can further increase variability.
ACT's statement claims this wider variation in writing is nothing new; it has been the case ever since ACT added the essay in 2005. On the old essay, scored on a 2-12 scale, the variation simply wasn't so obvious. Now it is clear to students because the writing score sits on the same 1-36 scale as the other sections.
ACT leaves us with a final reassurance: as students "become increasingly familiar with the new prompt, scores may increase."
That's true in the long term, but the real question is how colleges will respond this year. Will they use percentiles to gauge students' performance, as ACT recommends? Or will they look only at the scores? Colleges don't all have answers yet, and each will handle it individually. Some have decided to no longer require the ACT or SAT essay. All schools do receive the percentile information, so it is likely they will take ACT's recommendation into consideration. There might also be a third option on the table: the makers of these college exams are known to work with colleges to meet admissions needs, so colleges could pressure ACT to adjust its scoring scale and bring the writing section into alignment with the other sections, making comparisons easier. But that is a longer-term solution and only a possibility at this point.
Students should research their target colleges to determine whether they even need the writing section. If it isn't required, you can ignore this news entirely.
For students who have to take the writing section and receive a lower-than-expected score, ACT will re-score your essay for a $50 fee. According to an article in The Washington Post last week, some students have seen their scores jump after a re-score. So far, not many students have taken this step, but it is an option. (Note: under ACT policy, if the re-scoring determines a student should have earned a higher score, the score on the student's record is changed to the higher score and the $50 fee is refunded. But if the re-scoring finds the score should actually be lower, the score is NOT changed; the student keeps the higher initial score but still pays the $50 fee.)
Hopefully students, parents, and counselors feel more informed after reading ACT's response to the essay scoring fiasco, but there is still more to discuss surrounding ACT essay scoring and its implications. Stay tuned to ArborBridge's blog for more updates!