USNWR Rankings: Explaining Some of the Change, Inflation, & the Mechanics of Frustration

By now all law schools have their new 2021 U.S. News & World Report rankings, and as always, reactions are mixed. Some will be thrilled to see improvement, while others will be confused by a decline. Here, we'd like to look at some reasons schools might be disappointed by their 2021 rankings.

We'd first like to point out that the U.S. News rankings are not a perfect, or even a good, measure of school quality. They rely heavily on nebulous ideas of prestige, reward raw expenditures with no regard for efficiency, and fail to account for many state-specific nuances. No applicant should ever decide where to attend law school based on a short-term fluctuation in a school's ranking. U.S. News itself makes this point:

U.S. News stresses that the rankings themselves should not be used as the sole basis to decide to attend one graduate program or school over another. Prospective students should consider other factors as well.

Still, we'd be lying if we said the rankings aren't important: fairly or unfairly, they clearly influence behavior. The tremendous attention they receive every year shows as much. They're front-page stories on Above the Law and a topic of discussion among students and many faculty at the law schools themselves. For better or worse, the U.S. News rankings aren't going anywhere any time soon.


The past couple of years have, by most accounts, been pretty fantastic for the legal education community, at least when compared to the early and mid-2010s. Applications to law schools have been up, legal hiring is improving, bar passage rates are increasing, and donors are shelling out big bucks to some law school programs.

But this success is creating a problem for schools. Their entering classes are getting more qualified, and their students are achieving greater professional success, so why aren't their rankings always reflecting this improvement? The answer is a combination of two problems: U.S. News' "Assessment Score" metrics, and overall rankings inflation.

Assessment Scores

40% of the USNWR law school ranking is calculated from the Peer Assessment and Lawyer/Judge Assessment scores. Those scores are what we like to call "sticky": in the short and medium term, they simply don't move more than can be explained by random fluctuation in the voter population and response rates. In the long term they can go up, but that movement typically happens after a school's ranking has already improved. It doesn't precede substantial improvement in the rankings; it's a lagging indicator that can reinforce ranking improvement but rarely drives it. We unfortunately can't share the many graphs, charts, and underlying data we have presented to the universities we work with that demonstrate this, so you'll have to take our word for it. But if you think about it, this makes intuitive sense: Assessment Score voters have baked-in ideas about schools, and it takes a lot to change them. Think about your own opinion of different "brands." It would be unusual for you to buy a new brand of ice cream over, say, your favorite flavor of Ben & Jerry's. People tend to be rigid in their beliefs.

Overall Rankings Inflation

The other problem comes from the overall improvement in the applicant pool and the legal hiring market we mentioned earlier. When it comes to the rankings, a rising tide does not lift all boats; quite the opposite, in fact. You can read the (incomplete) methodology here, but the key takeaway is this: schools are scored on each factor relative to the average of all ranked schools.

Now think about how improving applicant pools and hiring markets move that average. If applicant numbers are up, law schools can enroll a more competitive student body as measured by LSAT and GPA, which is exactly what happened in 2019: 92 ranked law schools improved their LSAT medians, and 138 improved their GPA medians. It was entirely possible for a school to improve its GPA or LSAT this year and still see a smaller "points" contribution toward its ranking from those metrics than it had last year. The same happened with acceptance rate: 112 schools became more selective, by an average of almost a full percentage point.

Median GPA is a great example as well. Among all ranked law schools, any school whose GPA median went up by less than 0.03 saw its score contribution from GPA decline. That's not to say GPA alone will cause ranking changes (it's just one factor among many), but it's a visible one that many people point to and ask, "We improved our GPA, so why didn't our rank go up?" This inflation is why.
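USNWR doesn't publish its exact scaling, but the score-relative-to-the-average mechanic can be sketched with a simple standardized score. Everything below is hypothetical: the two pools, the 3.60 and 3.62 medians, and the use of a z-score are our assumptions for illustration, not the actual USNWR formula.

```python
# Hypothetical sketch of rankings "inflation". USNWR's actual scaling is
# not public; here we assume a metric's "points" come from a z-score
# against the pool of all ranked schools. All numbers are invented.
from statistics import mean, pstdev

def score_contribution(value: float, pool: list[float]) -> float:
    """Standardized points from one metric, relative to the whole pool."""
    return (value - mean(pool)) / pstdev(pool)

# Last year: our hypothetical school's 3.60 GPA median vs. the field.
pool_last_year = [3.30, 3.45, 3.55, 3.60, 3.70]
# This year: the school nudges up to 3.62, but the field moves up more.
pool_this_year = [3.38, 3.52, 3.62, 3.62, 3.78]

print(f"last year: {score_contribution(3.60, pool_last_year):+.2f}")
print(f"this year: {score_contribution(3.62, pool_this_year):+.2f}")
# The school's GPA improved, yet its relative contribution fell, because
# the pool average rose by more than the school's own 0.02 gain.
```

The direction of the result is the whole point: with relative scoring, improving by less than the field improves is scored as falling behind.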

If you want to take a look at some illustrative numbers, check here. We're only using public data, so it may vary ever so slightly from USNWR data and may miss a couple of schools, since USNWR doesn't always rank everyone, but it should give a pretty close picture of how each school's score contribution from GPA changed from last year to now. The scale we used is entirely made up, but that's irrelevant; any scale works for our purposes as long as we hold the factors' relative weights constant.

What about jobs? The overall improvement in hiring meant that a significant majority of schools saw improvement in their "jobs" factor as USNWR measures it. Average 10-month employment outcomes among ranked schools went up a few percentage points, especially among "full credit" long-term, JD-required and JD-advantage jobs. Schools that didn't keep pace saw their score contribution from 10-month employment go down.

Bottom line? Schools that improve don't always see it reflected in the rankings, usually because other schools improved just as much, if not more. And some schools that slip do so because they aren't focusing on every weighted metric, often intentionally. They are no "worse" a school than they were the day before the embargoed rankings were released.


Please keep in mind that this is just a very quick overview. We can replicate the USNWR ranking model very accurately, but in the interest of not violating our data-use agreements we're restricting ourselves to public data in this blog, which is fairly limited. There is a lot that goes into the rankings. If you are a university or law school, feel free to shoot us an email. We've helped many universities understand, and improve, their rankings at a very nuanced level.