Despite declining academic preparedness, grades continue to rise. Contrary to public perception, rampant grade inflation has so eroded the meaning of grades that SAT and ACT scores are now better predictors of academic success in college than a student's high school GPA.
Declining Academic Preparedness
According to the National Assessment of Educational Progress (NAEP), widely considered the "Nation's Report Card," the Reading and Mathematics proficiency of U.S. students is declining.
For 13-year-olds (the oldest students evaluated on the long-term trend assessments), Reading scores are down 7 points from their peak.
Math scores are down 14 points from their peak.
Rampant Grade Inflation
And yet, even as academic preparedness has declined, grades have risen.
In 1966, 78.2% of surveyed students at BA-granting universities reported that they did not have A-averages in high school; only 21.8% reported that they did.
Last year, those numbers were more than reversed: only 14.2% said that they did not have A-averages in high school. An astounding 85.8% said that they did.
In October of 2023, John Latting, Dean of Admissions at Emory, said: “We’re not as trusting, frankly, of GPA these days…. Grades are definitely inflated and not as connected to true class performance as they used to be.” Instead, Emory will be “weighing ‘external assessment’ more heavily than GPA, with a particular focus on AP scores.”
That's an astounding statement: it may be the first time in U.S. history that a dean of admissions at a top college has said that high school grades matter less for admissions decisions than standardized test scores.
It also explains why more and more schools are considering AP exam scores in admissions. UCLA, for instance, does not consider SAT or ACT scores at all.
Here is the GPA of students admitted to UCLA in 2023:
Yes, you're reading that correctly. The median (50th percentile) unweighted GPA of UCLA's admitted students is a 4.0 out of a possible 4.0. And seventy-five percent of admitted students have a 3.95 or above (again, out of 4.0!).
Is it any wonder, then, that UCLA still uses AP exam scores in admissions (even though it calls its admissions "test-free") when essentially all of its admitted students have virtually identical, perfect grades? When all applicants have the same grades, grades quite literally can't predict any differences in college GPA: a predictor with no variation carries no information about variation in the outcome.
I'm not cherry-picking schools, either. I picked UCLA at random and then looked at the University of Wisconsin-Madison. UW-Madison is a good school, but not UCLA caliber. And yet the average high school GPA of its enrolled students is a 3.9 (and 48% of enrolled students had a perfect 4.0 high school GPA).
In short, grades are inflated.
The Predictive Capacity of SAT and ACT Scores
It's not so much that SAT and ACT scores have gotten better at predicting success in college (though both tests have changed substantially over the decades, so this is probably true as well). Rather, grades have become worse and worse at predicting academic success.
How much worse?
In 2024, Dartmouth said that SAT/ACT scores are roughly 2.4 times more predictive than a student's high school GPA. (High school GPA by itself explained about 9% of the variance in first-year college GPA; an SAT score by itself explained about 22%. An SAT score plus high school GPA? 25%. So adding high school GPA contributed only 3 percentage points.)
Also in 2024, Opportunity Insights, which studied the Ivy and Ivy+ colleges, found that SAT scores were 3.9 times more predictive than high school grades at these colleges. (SAT alone explained 19.3% of the variance; high school GPA alone explained 5%; SAT plus high school GPA explained 19.6%. So adding a student's high school GPA contributed only 0.3 percentage points to predicting that student's GPA in college.)
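The pattern in those two studies, where a second predictor adds almost nothing, is called incremental R², and it is easy to reproduce in a simulation. The sketch below uses entirely synthetic data (none of Dartmouth's or Opportunity Insights' actual numbers or methodology): grades are generated as a weak, ceiling-compressed signal of the same underlying preparedness the test measures, so adding them to the test score barely moves R².

```python
# Synthetic sketch of incremental R^2 (illustrative only; not the
# Dartmouth or Opportunity Insights data or methodology).
import numpy as np

rng = np.random.default_rng(0)
n = 5000
ability = rng.normal(size=n)                   # latent academic preparedness

# Test score: measures ability with moderate noise.
sat = ability + rng.normal(scale=0.8, size=n)
# Inflated grades: a weak signal of ability, then clipped at the 4.0 ceiling.
hs_gpa = np.clip(3.7 + 0.1 * ability + rng.normal(scale=0.15, size=n), None, 4.0)
# College outcome: driven by the same ability, plus its own noise.
college_gpa = ability + rng.normal(scale=1.0, size=n)

def r_squared(predictors, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_sat  = r_squared([sat], college_gpa)
r2_gpa  = r_squared([hs_gpa], college_gpa)
r2_both = r_squared([sat, hs_gpa], college_gpa)
print(f"SAT alone: {r2_sat:.3f}  GPA alone: {r2_gpa:.3f}  both: {r2_both:.3f}")
# With grades compressed against the ceiling, r2_both barely exceeds r2_sat.
```

The clip at 4.0 is the key design choice: once most students sit at the grade ceiling, the GPA column loses most of its variance, and nearly all of the information it once carried is already captured by the test score.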
Even though SAT and ACT scores are (much) more predictive than high school grades, it still surprises me that colleges have publicly acknowledged as much. But at least some have admitted it.
“SAT and ACT tests are better predictors of Harvard grades than high school grades.” – Harvard Admissions Website (2024)
“Test scores are the single largest predictor of a student’s academic performance at Yale, and this is true over all four years, and it’s true even when we control for every other available variable that we can.” – Mark Dunn, Assistant Director of Admissions at Yale, on the Yale Admissions Podcast (2024)
“Standardized test scores are a much better predictor of academic success than high school grades.” – Christina Paxson, President of Brown University, quoted by the New York Times (2024)
Most of the public has not yet caught up to just how little most students’ high school grades really mean.
A 3-Hour Test Versus 3 Years of Grades
A natural question arises: how? How is it possible that a roughly 3-hour test gives better insight into a student’s academic preparedness than a student’s cumulative GPA from 9th, 10th, and 11th grades?
Part of the explanation is, as discussed, grade inflation. But there’s more to it. Grades are so inconsistently determined: students from over 20,000 high schools in the U.S. receive grades on a variety of different grading scales, take different classes, and receive those grades from different teachers. There is no standardization such that an “A” at one school can be compared with any accuracy to an “A” at another school.
But, that’s still only part of it.
The rest of the explanation hinges on the tests themselves. Despite the popular belief that students can game the SAT and ACT with tips and tricks, this is simply not the case. Sure, there are some tips and tricks, but no student can significantly increase their score with tips and tricks alone. To improve significantly, students have to actually learn the content and develop the skills evaluated on these exams. Thus, scores reflect an actual acquisition of knowledge and skill.
Even more important, however, is what knowledge and skills are tested. The SAT and ACT did not pick their content at random. It's not speculative fortune-telling. College Board and ACT studied what knowledge and skills best predict success in college, and then they designed tests to measure those specific concepts and skills.
So, when students do test prep, they’re not learning at random. What they are learning is what is most important for them to know: rhetorical skills, the fundamentals of mathematics, reading analysis, and data analysis. And, to be honest, it’s amazing to tutor students for the SAT and ACT. You get to teach them what they should know: how to break down a paragraph to really understand its structure and meaning, how to correctly approach reading a graph (look at the x-axis, y-axis, legend, and then the data points and trend of those data points), how to construct a cohesive sentence that better accomplishes your purpose, how to turn word problems with real-world application into solvable mathematical equations, etc.
When students and parents better understand the SAT and ACT, they better understand test prep: an opportunity to learn the most important knowledge and skills needed to succeed in college.
SAT® and ACT® are registered trademarks belonging, respectively, to College Board and ACT, Inc. Neither College Board nor ACT, Inc. is involved with or affiliated with Summit Prep, nor does College Board or ACT, Inc. endorse or sponsor any of the products or services offered by Summit Prep.