A counselor's guide to the digital SAT: What we learned from the first full test date

The biggest news in the testing world is that the SAT has gone digital. While some things are staying the same, much has changed.

In this series, we’re answering counselors’ biggest questions about the new SAT. We’ve already covered the ins and outs of adaptive testing, testing timelines and what your students should consider when deciding when to test, and how digital SAT scores work.

This time, we’ll talk about what we learned from the students who took the first full administration of the digital SAT in March 2023.

Did students encounter any technical issues?

For the most part, no. Test day went remarkably smoothly.

Over 70% of ArborBridge students reported no technical issues at all. Some students did report minor issues, such as the testing app briefly freezing, but in these cases, we've heard that proctors were able to troubleshoot the issues quickly so that students' scores weren't negatively affected.

And, from what we can tell, it's not just our students who got lucky. Globally, the test seems to have gone fairly well, with no widespread reports of technical or logistical issues. There were also very few test-center closures—just 61 outside the US. For context, on that same day, four US states alone (California, Florida, New Jersey, and New York) had more closures combined than the entire rest of the world.

Did the Bluebook practice tests prepare students for the real test?

In short, yes. Overall, the official practice tests available on the Bluebook app seem to be pretty similar to the actual test. We can see this both objectively and subjectively.

Objectively... Our students' most recent practice test scores were strongly predictive of their official scores—on average, there was just a 1% difference between the two.

Subjectively... We asked our students to rate how similar the official test felt to the practice tests. On a scale from 1 (not at all like the practice tests) to 5 (identical to the practice tests), they gave it an average rating of 4.3.

We've seen isolated reports of students who thought that passages seemed either shorter or longer than they did on practice tests, or that vocabulary questions felt either much easier or much harder. However, there was nothing approaching a consensus among these reports, so for now, we're discounting them.

Were students' digital scores similar to their past paper scores?

Yes, they were very similar. Compared to the scores on their most recent paper-based tests, our students' scores increased an average of about 2%—or 28 points. This gives further support to the College Board's claim that, while the format and structure of the SAT have changed, the digital test is still comparable to the paper-based test—objectively neither easier nor harder.

How did students feel about the test?

Regardless of how hard the test was objectively, a large number of students reported that the test felt harder than past official SATs. We think they felt this way because of the test's adaptive difficulty.

These students were surprised by the higher-difficulty second modules. Students who got these modules tended to have plenty of leftover time at the end of the first modules. And, in some cases, having this extra time lulled them into a false sense of security, so they started the second modules with a slower-than-ideal pace. Then, by the time these students realized they were working too slowly, it was too late for them to adjust.

It's important to note that we don't have any evidence that these students scored any lower than they would have, had they taken the paper SAT. Instead, these reports reinforce something we already knew—namely, compared to taking a regular, nonadaptive test, taking an adaptive test requires a shift in mindset and strategy.

What did we learn overall?

Our main takeaway is that, impressively, the digital SAT went just as well as the College Board said it would. The success of the digital SAT's rollout stands in stark contrast to other major test revisions in recent memory—most notably the tumultuous launches of the digital APs in 2020 and the digital version of the ACT that international students take.

The fact that this test went so smoothly, and that results were generally as expected, increases our confidence in the College Board and in future digital SAT test administrations.

Need more individualized advice?

If you have specific questions, reach out to our experts here. We’re happy to help in any way we can.


About ArborBridge

ArborBridge is the global leader in innovative, digital, one-on-one tutoring. With nearly a decade of experience teaching students online, ArborBridge supports students of all kinds: home schoolers, AP students, test preppers, and more. Our tutors specialize in creating personalized plans and in providing compassionate support for students and families.

About Jordan Browne

In addition to graduating summa cum laude from Emerson College and holding an M.F.A. from Columbia University, Jordan was a Fulbright scholar to Montenegro, where he taught seven courses for the University of Montenegro. Along with teaching writing, rhetoric, and literature at the college level, Jordan has taught test prep for several years in New York public schools and across three continents. Ever since he was young, he’s been the weird one who actually enjoys standardized tests, and, for several years now, he’s taught students of every skill level and background how to like them too—or, at least, how to get the scores they need.
