Ranking The Law School Rankings


Pepperdine Professor Ranks Law School Rankings

“Rankings! Rankings! Rankings!”
You can almost hear Jan Brady ranting about the emphasis placed on law school rankings from the media and students alike. Let’s face it: No institution is truly happy unless it’s ranked #1.
When a school drops a few spots, you can be certain a dean is updating his or her resume (or preparing for an all-out media blitz). If a school is ranked #12, the dean will complain that it should be #10, arguing that the metric holding it back is skewed or subjective. And if a school isn’t ranked at all? Well, those schools didn’t submit data and probably don’t care.
U.S. News and World Report is considered by many to be the gold standard of law school rankings, combining subjective variables like academic and professional assessments with quantitative data like LSAT scores, GPAs, acceptance and placement rates, student expenditures, bar passage rates, and library resources. Whether you agree with how much weight they give each variable (or how they conduct their assessments), U.S. News produces the most comprehensive set of data around. You just have to decide what’s valuable and what’s negligible.
We even produced a law school ranking at Tipping the Scales, which relies on quantitative data reflecting the “quality of the students getting into a law school and the success of the graduates going out.” This includes LSAT scores, acceptance rates, job placement rates, and median salaries for private and public sector jobs.
Alas, there are plenty of other law school rankings, which emphasize everything from faculty academic prowess to placement at top law firms and companies. That’s why we considered it a great service when Derek M. Muller, an Associate Professor of Law at Pepperdine University, compiled a list of law school rankings. In fact, Professor Muller took the added step of ranking the rankings themselves. While Tipping the Scales was disheartened that our rankings were listed at #13 (one spot below U.S. News and World Report’s ranking), we bear Professor Muller no ill will.
That said, we were fascinated by what Professor Muller valued… and what it could potentially reflect about the law school academic community.
Let’s start with Professor Muller’s top pick: The Sisk-Leiter Scholarly Impact Study, which was published by the University of Minnesota. Here is Muller’s analysis of this tool:
“Drawing upon the methodology from Professor Brian Leiter, it evaluates the scholarly impact of tenured faculty in the last five years. It’s a measure of the law school’s inherent quality based on faculty output. In part because peer assessment is one of the most significant categories for U.S. News & World Report rankings, it provides an objective quantification of academic quality. Admittedly, it is not perfect, particularly as it is not related to law student outcomes (of high importance to prospective law students), but, nevertheless, I think it’s the best ranking we have.”
This analysis makes a strange argument: Quality is judged by academic output, not teaching excellence (which is reflected in measurements like bar passage rates, starting salaries, and placement rates). Worse, it conflates thinking and writing ability with teaching ability, assuming that great scholars also make great teachers. Here’s the worst part: Sisk-Leiter fails to include any input from students and employers, the ultimate consumers of the product delivered by law schools. While this tool is indispensable for identifying schools whose academic performance may exceed their reputations among deans and tenured faculty, it falls far short of measuring the real value law schools deliver to the larger society.
Muller’s second pick, The National Law Journal’s 250 Go-To Rankings, offers a more reasonable metric: schools whose graduates landed positions with the nation’s top 250 law firms. While Muller admits that these rankings don’t “include judicial clerkships, or elite public interest or government positions,” we agree with him that associate hiring at these firms is a key metric for measuring elite schools. That said, employment in big law is declining as more attorneys move to small firms and entrepreneurship, so this ranking will likely grow less relevant over time. Sadly, the 2013 rankings have been removed from The National Law Journal website, so we were unable to compare them against the others.
Rounding out Muller’s top 3 are rankings from The Princeton Review (via Taxprof). Student feedback drives much of this data, which includes separate rankings for areas like best professors, classroom experience, and quality of life. It’s worth noting that The Princeton Review rankings are somewhat problematic, as the methodology is vague and there’s no overall ranking of the schools. What’s more, their sampling technique often involves handing out surveys randomly or having schools distribute them to students they choose. As a result, the data could be skewed in numerous ways. Still, it yields some insights, particularly in the area of “Best Career Prospects,” which is partially derived from salary and placement data (and lists stalwarts like Columbia, Chicago, Harvard, and NYU near the top).
For the remaining rankings, check out Muller’s Excess of Democracy blog. In particular, you may enjoy Muller’s #7 ranked site, The Law School Transparency Reports, where students can compare data like cost projections, employment rates, clerkships, and incoming GPAs and LSATs.
Source: Excess of Democracy