The West Coast wins this round: Though Yale University Law School comes first in nearly every ranking of the nation’s best law schools—including the ubiquitous U.S. News & World Report ranking—Tipping the Scales’ 2013 analysis puts Stanford Law School at the front of the pack.
Of course, the top of the list has no shortage of East Coast schools. No. 2 Yale is right on Stanford’s heels, followed by No. 3 Harvard, No. 4 Penn—which is No. 7 on U.S. News’ list—and No. 5 Columbia. Aside from Stanford, every top five school in our ranking is an Ivy League institution in a large, Northeastern metropolitan area.
U.S. News’ top five includes at least one Midwestern school, placing Chicago at No. 4 (tied with Columbia). In our ranking, however, Chicago is only No. 12, unexpectedly outdone by No. 7 Duke and No. 8 Northwestern.
TIPPING THE SCALES’ APPROACH
What accounts for the differences? Tipping the Scales’ ranking zeroes in on two key dimensions of the J.D. experience: the quality of the students getting into a law school and the success of the graduates going out. The bottom line: these metrics are simple to understand, and they get at what really counts in a law school education. Applicants want to know that their classmates will be as good as they are, that a school is highly selective in crafting its classes, and that at the end of the experience they will have a job and sizable compensation.
In our ranking, the scores for schools’ acceptance rates and median LSAT results are weighted 25% each; we reward schools that can be choosy about the students they accept. Another 25% depends on the percentage of graduates who don their caps knowing they have jobs lined up. Beyond saying a lot about schools’ career services, this figure avoids a timing problem: states release bar exam results at different times, and we didn’t want to give certain schools geography-based advantages. Finally, median private sector salaries and median public interest salaries count for 12.5% each. Money isn’t everything, but it’s undeniably important for the many lawyers saddled with student loans.
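To make the weighting concrete, here is a minimal sketch of how a composite index along these lines could be computed. The metric names and the normalized 0–100 scores are hypothetical; only the weights (25/25/25/12.5/12.5) come from the methodology described above.

```python
# Weights mirror the article's methodology: 25% acceptance-rate score,
# 25% median LSAT score, 25% employment-at-graduation rate, and 12.5%
# each for private-sector and public-interest median salary scores.
WEIGHTS = {
    "acceptance_rate": 0.25,
    "median_lsat": 0.25,
    "employed_at_graduation": 0.25,
    "private_sector_salary": 0.125,
    "public_interest_salary": 0.125,
}

def index_score(scores: dict) -> float:
    """Combine a school's normalized (0-100) metric scores into one index."""
    return sum(WEIGHTS[metric] * scores[metric] for metric in WEIGHTS)

# Example with made-up normalized scores for an unnamed school:
example = {
    "acceptance_rate": 95.0,
    "median_lsat": 98.0,
    "employed_at_graduation": 90.0,
    "private_sector_salary": 92.0,
    "public_interest_salary": 85.0,
}
print(index_score(example))
```

Because the five weights sum to 1.0, the resulting index stays on the same 0–100 scale as the inputs, which makes schools directly comparable.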
We left out information that’s harder to quantify and far more likely to be suspect, if not downright flawed. For example, in U.S. News’ ranking, input from deans and other faculty members accounts for 25% of schools’ index scores. Those opinion surveys are little more than popularity contests: deans and faculty have only limited knowledge of what is going on at schools other than their own, and possible sampling errors can distort the results further.
For similar reasons, we also don’t believe that the opinions of legal professionals count for much; most would simply vote for their alma maters. Yet U.S. News annually polls law firm partners, state attorneys general, and federal and state judges, and their opinions count for 15% of that magazine’s methodology. Nor do we include a fuzzy category U.S. News calls “faculty resources”—expenditures per student, student-faculty ratio, and library resources. Those measures are all well and good, but they only get in the way of the criteria that matter more in determining the true quality of a law school.
The simpler a ranking is, the better. Rankings that measure school quality on too many metrics are harder to interpret: when too many factors influence the result, you often can’t tell exactly why a school is falling or climbing. More often than not, the extra metrics only dilute the measurements that really matter. Tipping the Scales’ methodology does away with this problem, bringing greater clarity to a school’s specific ranking.