Those of you who have heard me speak about college admissions know my take on the validity of standardized testing as a measure of college readiness. The College Board (administrator of the SAT) and ACT have for years assumed that their tests are a good measure of a student's ability to thrive in a selective and competitive academic environment, and as a result the content of the tests has been altered only modestly in the last 70 years to reduce demographic score gaps. There is sufficient evidence, however (the Bates Study and the UC Study are good examples), to at least call into question whether the SAT or ACT adds any value to admissions considerations, and neither organization has addressed the very real concerns that academia has raised about relying on the tests to predict student success (FairTest and its 1,000+ partner institutions are leading the charge). With that issue left outstanding, we now face a different problem, one rooted in the desire of ACT and SAT to remain relevant in a post-secondary educational culture that continues to question the need for standardization.
ACT and SAT have begun to sense their tenuous foothold and have instead concentrated on selling themselves as measures of high school success. In the wake of President Obama's Every Student Succeeds Act (ESSA) of 2015, states have begun to seek out ready-made assessment alternatives to the very expensive process of developing and maintaining their own standards and tests. The pre-existing college entrance exams are low-hanging fruit: twelve states now require the SAT for all high school juniors, and another six, including New York, allow districts to decide whether to require it of their students. Twelve other states require the ACT, with eight more having partial arrangements.
It would seem that both tests are now claiming to be good measures of high school success as well as indicators of college readiness. However, new evidence suggests that this, too, may not be the case. I wrote about this issue back in September with regard to Nevada and its reaction to low ACT scores, and the news seems to be getting worse. According to a study just released by the Assessment Solutions Group, neither the ACT nor the SAT sufficiently assesses the Florida State Standards for Math or English Language Arts, and both tests would have to be significantly augmented to fall within an acceptable margin. The tests also fail to measure students interchangeably, with some students performing wildly differently when taking two or more of the tests. Results used to hold teachers, schools, and districts accountable were equally varied, which means that unless all schools or districts use the same test, the results cannot be accurately compared. Unfortunately, as in Nevada, the decision to require these tests fell squarely on the shoulders of the elected officials of the State of Florida; parents, students, and teachers had no say in the matter.
The ACT and SAT need to figure out what they want to be. Swiss Army knives are the toys of Boy Scouts, not the tools of tradesmen. One-size-fits-all is not a good way to determine student futures, school and teacher success, budget allocations, and broad academic trends. Tools need to be well designed for their purpose, and no matter how good a tape measure you have, you can't use it to hammer in a nail.