Brian Leiter's Law School Rankings

Faculty Quality Rankings: Scholarly Reputation, 2003-04

March 25, 2003

Rankings based on detailed surveys completed by more than 150 leading legal academics

Introduction
This latest addition to Brian Leiter's Educational Quality Ranking (EQR) site reports the results of a survey of more than 150 leading legal scholars; it thus replaces previous reliance on U.S. News academic reputation surveys (about which more below).  "Objective" measures of faculty quality--per capita productivity and citations--are still available elsewhere on this site (http://www.utexas.edu/law/faculty/bleiter/rankings02/rankings.html), and these objective measures correlate rather well with the latest survey results reported here (except for schools like Minnesota and Colorado, which have had significant faculty changes in the interim).  Because high-quality survey data may ultimately be more informative than "objective" measures, my intent, for now, is to rely on this survey data.  And because any aggregation rule for combining the results of this survey with objective data about faculty or student quality is bound to be controversial, the EQR site will cease these aggregations in favor of presenting the individual measures for students to weigh as they deem appropriate.

Please note that other data on this same site--citations by scholar (not school), Supreme Court clerkship placement, placement in law teaching jobs--has been updated in the last two years, and may also prove useful for prospective students.

The Survey and the Method
Between March 3 and March 21, 2003, more than 150 leading legal scholars from around the nation completed the most thorough evaluation of American law faculty quality ever undertaken.  Scholars were invited to participate in the EQR survey based on the following criteria:

  1. Only active and distinguished scholars were invited.  These are the scholars most likely to have informed opinions about faculty quality.
  2. Multiple faculty from every school evaluated were invited.
  3. Diversity in terms of seniority was sought in the evaluator pool.
  4. Diversity in terms of fields and approaches was sought in the evaluator pool.

Evaluators were not permitted to evaluate either their own institution or the institution from which they had received the highest law degree.

The quality of evaluators in this survey is unparalleled: it includes the President and President-elect of the Association of American Law Schools; a dozen members of the American Academy of Arts & Sciences, the nation's most prestigious learned society; dozens of the most frequently cited legal scholars in numerous fields; and leading figures, junior and senior, in corporate law, criminal law, health law, constitutional law, jurisprudence, international law, comparative law, legal history, feminist legal theory, and many other fields.  The complete list of evaluators appears in Appendix A.

Evaluators were presented with faculty lists for 2003-04, based on the best present knowledge about who is teaching where next academic year.  Many important faculty moves for 2003-04 were accounted for in the faculty lists evaluators received, including, for example:

Stuart Benjamin’s move from Texas to Duke
Richard Delgado’s move from Colorado to Pittsburgh
John Duffy’s move from William & Mary to George Washington
Merritt Fox’s move from Michigan to Columbia
Elizabeth Garrett’s move from Chicago to Southern California
Deborah Malamud’s move from Michigan to NYU
Ronald Mann’s move from Michigan to Texas
Andrei Marmor’s (part-time) move from Israel to Southern California
Hiroshi Motomura’s move from Colorado to North Carolina
Robert Peroni’s move from George Washington to Texas
Michael Perry’s move from Wake Forest to Emory
Stephen Perry’s move from Penn to NYU
Arti Rai’s move from Penn to Duke
Jim Rossi’s move from North Carolina to Florida State
Chris Sanchirico’s move from Virginia to Penn

But some faculty moves occurred after the survey was conducted.  For example:

Ellen Deason is moving from Illinois to Ohio State.
Martha Fineman is moving from Cornell to Emory.
Thomas Gallanis is moving from Ohio State to Washington & Lee.
Larry Garvin is moving from Florida State to Ohio State.
Jack Goldsmith is moving from Chicago to Virginia.
Michael Heise is moving from Case Western to Cornell.
Thomas Merrill is moving from Northwestern to Columbia.
Dale Oesterle is moving from Colorado to Ohio State.
Robert Post is moving from Berkeley to Yale.
Peter Shane is moving from Carnegie-Mellon to Ohio State.
William Simon is moving from Stanford to Columbia.
Aviam Soifer is moving from Boston College to the University of Hawaii (to become Dean).

Where subsequent moves were likely to have affected the evaluations in specialized areas of legal scholarship, they are noted below.  Ordinarily, the addition or loss of one faculty member is not likely to have a significant effect on a school's overall score, but there can be exceptions, especially for prominent senior figures moving to or from smaller faculties (e.g., Fineman).  Ohio State, which had multiple hires, likely would have picked up .1 or .2 in its overall score.

Sixty-nine schools that had some plausible claim to being in "the top 40" in terms of faculty quality were included in the survey.  The complete instructions to evaluators appear in Appendix B, and the full list of schools evaluated appears in Appendix C.  Note that this is not supposed to be a list of the "top 69" schools; plainly other schools could have been included in the mix, but as it turns out, the borderline contenders for inclusion did not come close to the top 40 in the survey.

The EQR survey is confined to "the top 40" faculties overall precisely because there is reason to think that active legal scholars, who are engaged in reading and producing scholarship, are in a good position to give a meaningful evaluation of at least the top 40 law faculties.  In addition, the "top 20" in eleven different specialty areas are included.  There is reason to think that the 69 schools surveyed would dominate the "top 20" in the specialty areas.  (Note:  Deans may request complete results for their school, where those results are not available here; e-mail bleiter@mail.law.utexas.edu.)

Evaluators were presented with faculty lists identified only by number; no school names were provided.  This was an attempt to avoid the "halo" effect.  Of course, most evaluators could identify the schools, but it clearly makes a significant difference to begin an evaluation by reading a list of faculty names and then saying, "ah-ha, this is Harvard," as opposed to starting an evaluation by seeing "Harvard."  As one evaluator remarked:

"Giving the schools numbers instead of names is useful; it reinforces a strong message [even if] it does not cloak the identity of the schools."  And as another wrote:  "There's a surprising difference between my perception of faculty quality [based on school name] and how I rank them when I am faced with a list of faculty members."

The Rankings

Evaluators were asked to evaluate schools on a scale of 1 (weak) to 5 (excellent). Evaluators plainly had different centers of gravity in their rankings, with some giving few 5s, while others gave hardly any 1s. In order to guard against strategic and utterly idiosyncratic voting, "outlier" scores were dropped. Outlier scores were defined as follows: the distribution of 5s, 4s, 3s, 2s, and 1s for each category was charted; where a school received no more than two 5s in a category in which it had no 4s, or no more than two 1s in a category in which it had no 2s, those isolated scores were deemed "outliers" and dropped. (So, e.g., if a school in some category had received ten 5s, seventy 4s, and sixty 3s, no 2s, but then one or two 1s, the 1s would be dropped.) This resulted in approximately a dozen scores of 1 being dropped out of several thousand scores entered, which indicates, of course, that evaluators performed their tasks honorably. The mean and median scores printed below were computed after the outlier scores were dropped.
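To make the dropping rule concrete, here is a minimal sketch in Python. It is my own illustration of the rule as stated above, not the tabulation actually used for the survey; the function name and the list-of-integers input format are assumptions.

```python
from collections import Counter
from statistics import mean, median

def drop_outliers(scores):
    """Drop isolated extreme scores per the rule above: one or two 5s
    with no 4s present, or one or two 1s with no 2s present, are
    treated as outliers and removed."""
    counts = Counter(scores)
    kept = list(scores)
    if 0 < counts[5] <= 2 and counts[4] == 0:
        kept = [s for s in kept if s != 5]   # isolated high outliers
    if 0 < counts[1] <= 2 and counts[2] == 0:
        kept = [s for s in kept if s != 1]   # isolated low outliers
    return kept

# The example from the text: ten 5s, seventy 4s, sixty 3s, no 2s, two 1s.
scores = [5] * 10 + [4] * 70 + [3] * 60 + [1] * 2
kept = drop_outliers(scores)                 # the two 1s are dropped
print(round(mean(kept), 1), median(kept))    # prints: 3.6 4
```

The mean and median reported in the tables below are then computed over the retained scores.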

The size of each faculty is also listed in parentheses following the school name. There is some controversy over the impact of school size in a survey like this. Some argue--plausibly in my view--that larger faculties have an advantage, because it is more likely that an evaluator will know the work of someone on the faculty. Others contend that large schools deserve this advantage: if a school has more faculty doing work known to more scholars, doesn't that mean it's a stronger faculty? Still others deny that large schools get any advantage and contend that they are, in fact, at a disadvantage. As one evaluator wrote: "If I had only ever heard of one person on a faculty, one interpretation is that that was a weak faculty." Thus, a large faculty with only one or two names known to an evaluator may be at risk of faring worse. In any case, students can see for themselves the differences in faculty size, below.

Other biases work their way into surveys of opinion about faculty quality. Schools on the two coasts are usually at an advantage in opinion surveys, because schools are more tightly clustered, geographically, and faculty tend to know each other better, both professionally and socially. Unsurprisingly, evaluators score faculties strong in their areas of expertise more highly; although a mix of evaluators (in terms of expertise) participated, it is clear that a slightly different mix would have some impact on the results. Since it is reasonable to suppose that changes in the mix of evaluators (even equally qualified evaluators) could lead to changes of .1 in either direction, I list runners-up both for the overall rankings and in the specialty rankings.
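One presentational note on the tables that follow: schools are ordered by mean score, and schools with the same mean share a rank, with the next rank skipped (standard competition ranking); that is why, below, Harvard and Chicago both hold rank 2 and Stanford holds rank 4. Here is a short sketch of that tie-handling in Python, assuming (name, mean) pairs as input; this is my own reconstruction of the table's rank numbering, not part of the survey methodology.

```python
def competition_rank(schools):
    """Assign standard competition ranks (1, 2, 2, 4, ...) by mean
    score, highest first; tied means share a rank."""
    ordered = sorted(schools, key=lambda s: s[1], reverse=True)
    ranked, prev_mean, prev_rank = [], None, 0
    for position, (name, m) in enumerate(ordered, start=1):
        rank = prev_rank if m == prev_mean else position
        ranked.append((rank, name, m))
        prev_mean, prev_rank = m, rank
    return ranked

# Reproduces the top of the table below: Harvard and Chicago tie at 2.
sample = [("Yale", 4.8), ("Harvard", 4.7),
          ("Chicago", 4.7), ("Stanford", 4.5)]
for rank, name, m in competition_rank(sample):
    print(rank, name, m)
```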

Brian Leiter's Top 40 Law Schools Based on Faculty Quality, 2003-04

Rank School (number of faculty) Mean Median
1. Yale University (53) 4.8 5.0
2. Harvard University (80) 4.7 5.0
  University of Chicago (35) 4.7 5.0
4. Stanford University (40) 4.5 4.5
5. Columbia University (61) 4.3 4.5
  New York University (82) 4.3 4.5
7. University of California, Berkeley (54) 4.2 4.5
8. University of Michigan, Ann Arbor (52) 4.1 4.0
  University of Texas, Austin (73) 4.1 4.0
10. University of Virginia (60) 4.0 4.0
11. University of Pennsylvania (45) 3.9 4.0
12. Georgetown University (87) 3.8 4.0
  University of Southern California (43) 3.8 4.0
14. Cornell University (32) 3.7 3.75
  Northwestern University (50) 3.7 4.0
  University of California, Los Angeles (62) 3.7 4.0
17. Duke University (47) 3.5 3.5
18. Vanderbilt University (39) 3.4 3.5
19. Boston University (54) 3.3 3.5
  University of Iowa (40) 3.3 3.5
21. University of Minnesota, Twin Cities (49) 3.2 3.0
22. George Washington University (66) 3.1 3.25
  University of Illinois, Urbana-Champaign (35) 3.1 3.0
  University of San Diego (47) 3.1 3.0
  University of Wisconsin, Madison (49) 3.1 3.0
26. Fordham University (58) 3.0 3.0
  George Mason University (40) 3.0 3.0
28. Cardozo Law School/Yeshiva University (40) 2.9 3.0
29. Emory University (40) 2.8 2.75
  University of California, Hastings (51) 2.8 3.0
  Washington University, St. Louis (38) 2.8 3.0
32. Boston College (36) 2.7 2.5
  Ohio State University (44) 2.7 2.5
  University of California, Davis (27) 2.7 3.0
  University of North Carolina, Chapel Hill (46) 2.7 2.5
  Washington & Lee University (33) 2.7 3.0
37. Chicago-Kent College of Law (37) 2.6 2.5
  College of William & Mary (30) 2.6 2.5
  Rutgers University, Camden (39) 2.6 2.5
40. Arizona State University (35) 2.5 2.5
  Indiana University, Bloomington (40) 2.5 2.5
  Tulane University (45) 2.5 2.5
  University of Arizona (33) 2.5 2.5
  University of Colorado, Boulder (29) 2.5 2.5
  University of Miami (48) 2.5 2.5

Runners-Up for the Top 40:

School (number of faculty) Mean Median
American University (51) 2.4 2.5
Brooklyn Law School (43) 2.4 2.5
Rutgers University, Newark (39) 2.4 2.5
University of Florida, Gainesville (59) 2.4 2.5
University of Notre Dame (37) 2.4 2.5