Brian Leiter's Law School Rankings

Faculty Quality Based on Scholarly Impact, 2005

July 2005
Corrected Version: April 2006

This is a ranking of the top 30 law faculties based on a standard “objective” measure of scholarly impact: per capita citations to faculty scholarship. We looked only at the top quarter of each faculty, largely for logistical reasons--it made the study more manageable--but partly because the scholarly standing of a school depends more on its best faculty than on its average.

Impact was measured using Westlaw's JLR database rather than TP-ALL, since the latter includes on-line versions of treatises (for example, Wright & Miller on Federal Practice & Procedure) and thus would artificially inflate the counts for schools at which these treatise authors teach. Names were searched in the form

Brian /2 Leiter

except where multiple middle initials or similar factors made a wider scope necessary. To guard against false positives with common names, ten to twenty of the "hits" were reviewed; the percentage that were false positives was then multiplied against the total number of hits returned, and that amount was subtracted from the citation total. To fix the top quarter it was, of course, necessary to search one-third to one-half of the faculty. In response to feedback on the 2003 study, citation counts were halved for part-time faculty, so that this year’s study is a more accurate measure of the full-time faculty’s scholarly impact. The citation counts were completed over the course of several days in early July 2005--roughly two years since the last study--thus obviating the need for adjustments in the counts to reflect changes in the size of the database.
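
To make the arithmetic of these adjustments concrete, here is a minimal sketch of the correction applied to a single faculty member's count. The function name and example figures are illustrative only, not the study's actual tooling; it assumes the false-positive rate observed in the reviewed sample is applied to all hits, and that part-time counts are halved as described above.

def adjusted_citation_count(total_hits, sampled_hits, sampled_false_positives, part_time=False):
    """Estimate a corrected citation count, per the method described above.

    total_hits: articles returned by the Westlaw name search
    sampled_hits: number of hits manually reviewed (ten to twenty in the study)
    sampled_false_positives: reviewed hits that were misattributed
    part_time: if True, halve the count (the 2005 adjustment for part-time faculty)
    """
    false_positive_rate = sampled_false_positives / sampled_hits
    corrected = total_hits - false_positive_rate * total_hits
    return corrected / 2 if part_time else corrected

# Hypothetical example: 500 hits, 2 of 20 reviewed hits were false positives,
# full-time faculty member -> 500 - (0.10 * 500) = 450.
print(adjusted_citation_count(500, 20, 2))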

Impact as measured by citations has important limitations as a proxy for scholarly reputation, and it is worth noting those in detail before going further.  Why might the correlation between impact and actual academic quality break down?  My colleague Richard Markovits aptly summarizes some of the problems:

Since many frequently cited articles are cited because they contain succinct statements of boilerplate propositions of law or of a particular academic approach to some set of issues, or because they fall squarely within a particular academic paradigm whose proponents make a practice of citing each other, the frequency of an author's citations has little to do with his influence, much less with the quality of his work. (“The Professional Assessment of Legal Academics,” 48 Journal of Legal Education, 417, 423-4 [1998].)

Although Professor Markovits leaps too quickly to his conclusion, he has certainly identified a genuine worry about the use of citations. Indeed, we might identify six kinds of phenomena at work here which skew the correlation between citation and quality.

First, there is the industrious drudge:  the competent but uninspired scholar who simply churns out huge amounts of writing in his or her field.  Citation practices of law reviews being what they are, the drudge quickly reaches the threshold level of visibility at which one is obliged to cite his or her work in the obligatory early footnotes of any article in that field.  The work is neither particularly good, nor especially creative or groundbreaking, but it is there and everyone knows it is there and it must be duly acknowledged. 

Second, there is the treatise writer, whose treatise is standardly cited because, like the output of the drudge, it is a recognized reference point in the literature. Unlike the drudge, the authors of leading treatises are generally very accomplished scholars, but with the devaluation of doctrinal work over the past twenty years, an outstanding treatise writer--with a few exceptions--is not necessarily highly regarded as a legal scholar.

Third, there is the “academic surfer,” who surfs the wave of the latest fad to sweep the legal academy, and thus piles up citations because law reviews, being creatures of fashion, give the fad extensive exposure.  Any study counting citations, depending on when it is conducted, runs the risk of registering the "impact" of the fad in disproportion to its scholarly merit or long-term value or interest.

Fourth, there is work that is cited because it constitutes “the classic mistake”:  some work is so wrong, or so bad, that everyone acknowledges it for that reason.  The citation and organizational preferences of student-edited law reviews exacerbate this problem.  Since the typical law-review article must first reinvent the wheel, by surveying what has come before, the classic mistake will earn an obligatory citation in article after article in a particular field, even though the point of the article may be to show how wrong the classic mistake is.  True, some authors of classic mistakes may have excellent reputations; but who among us aspires to be best remembered for a "grand" mistake?

Fifth, citation tallies are skewed towards more senior faculty, so that faculties with lots of “bright young things” (as the Dean of one famous law school likes to call top young scholars) won’t fare as well, while faculties with once-productive dinosaurs will.

Sixth, citation studies are highly field-sensitive.  Law reviews publish lots on constitutional law, and very little on tax.  Scholars in the public law fields or who work in critical theory get lots of cites; scholars who work on trusts, comparative law, and legal philosophy do not.

So for all these reasons, one would expect scholarly impact to be an imperfect measure of scholarly quality. But an imperfect measure may still be an adequate one, and that is almost certainly true of citation rates as a proxy for scholarly impact, and of impact as a proxy for reputation or quality.

The overall ranking was based not only on the mean per capita impact for the top quarter of each faculty, but also on the median per capita impact. This was to guard against the distorting effect of one or two faculty with enormously high citation counts on an otherwise low-cited faculty. (The data on mean and median per capita impact follow the overall ranking.) The final score--the basis for the “overall” ranking that follows--is the sum of the normalized scores for mean and median per capita impact, divided by two.
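
As a concrete check on this formula, the following sketch computes an overall score from the raw figures reported in the tables below. The normalization is inferred from those tables (the top school in each category appears to be scaled to 100), so treat this as an illustration rather than the study's own code.

def normalize(raw, top_raw):
    """Scale a raw figure so the highest raw figure in the category maps to 100."""
    return 100.0 * raw / top_raw

def overall_score(raw_mean, raw_median, top_mean, top_median):
    """Average of the normalized mean and normalized median per capita impact."""
    return (normalize(raw_mean, top_mean) + normalize(raw_median, top_median)) / 2

# Yale: raw mean 2595 (vs. Chicago's 3924), raw median 2220 (vs. Chicago's 2540).
# Rounds to 77, matching Yale's overall score in the table below.
print(round(overall_score(2595, 2220, 3924, 2540)))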

Interpretive Comments on the Results: The University of Chicago Law School continues to lead everyone else in per capita citations, despite the fact that only half the citation totals for Frank Easterbrook and Richard Posner were counted (since they are both part-time at the Law School). Indeed, Chicago would lead everyone in per capita citations even without counting Judges Easterbrook and Posner at all. Chicago not only has two of the three most frequently cited full-time law professors in America (Richard Epstein and Cass Sunstein; the third is Laurence Tribe at Harvard), it also has a host of faculty who weren’t in the top quarter, but are still in the elite group of faculty cited in more than 1,000 articles (for example, Albert Alschuler, Douglas Baird, Saul Levmore, Martha Nussbaum, Eric Posner, and David Strauss).[1]

The other noticeable changes since 2003 all have tangible explanations: Duke added Erwin Chemerinsky from Southern California and Curtis Bradley from Virginia; UC Hastings added Geoffrey Hazard from Penn and Joan Williams from American; Penn, in addition to losing Hazard (who retired in December 2005) to Hastings, lost one of its other most-cited faculty members, Edward Rubin, to the Deanship at Vanderbilt (Penn is also hurt by the fact that many of its “top” faculty are mid-career folks, whose citation counts are necessarily lower; only two faculty--C. Edwin Baker and Paul Robinson--have garnered citations in more than 1,000 articles, for example); Cornell lost its most-cited faculty member, Jonathan Macey, to Yale; Texas added Bernard Black from Stanford; Michigan saw Yale Kamisar retire; Miami lost John Hart Ely; Illinois added two new faculty who are now in the top quarter for citations, David Hyman and Larry Solum; and so on.

Again, do bear in mind that citations are only one measure of faculty quality, with the well-known limitations noted above. I very much doubt that in a new reputational survey of leading legal scholars, like the one we conducted in 2003, Duke would turn up in the top ten (though it would surely, and deservedly, place better than its 17th-place showing in 2003), or Berkeley in the top five, or Penn outside the top 20 (or top 15, for that matter). However, if one compares the 2003 reputation data (the last column, below) to the 2003 per capita citation results, one can see a reasonably good correlation between the two, with some notable exceptions. So a significant improvement, or decline, in per capita citation rank between 2003 and 2005 is likely to be reflected in some change in reputation rank as well.

The Corrected Version (April 2006) adjusted primarily for a handful of errors of omission (at Pittsburgh, George Washington, William & Mary, BU), that is, faculty who were in the top quarter of a particular school’s citation counts, but had not been properly credited in the July 2005 version. In one additional case, Vanderbilt, the top quarter was wrongly calculated (13 instead of 11). Because it has become clear that too many readers take the “top quarter” in citations to be equivalent to the “top quarter of the faculty,” I am no longer printing the list of faculty in the top quarter in this version. I may print it in future versions, so that there are opportunities for corrections, as there were on this occasion.

OVERALL TOP 30 BASED ON MEAN AND MEDIAN PER CAPITA IMPACT, JULY 2005

Rank 2005 | School | Score | Per Capita Citation Rank 2003 | Reputation Rank 2003
(A blank rank indicates a tie with the school listed immediately above.)

1 | University of Chicago | 100 | 1 | 2
2 | Yale University | 77 | 2 | 1
3 | Harvard University | 68 | 3 | 2
4 | Stanford University | 61 | 4 | 4
5 | University of California, Berkeley | 50 | 6 | 7
6 | New York University | 45 | 7 | 5
7 | Columbia University | 43 | 5 | 5
8 | Georgetown University | 37 | 8 | 12
9 | Duke University | 31 | 15 | 17
  | University of Texas, Austin | 31 | 11 | 8
11 | Cornell University | 29 | 9 | 14
12 | Northwestern University | 28 | 13 | 14
13 | University of Michigan, Ann Arbor | 27 | 10 | 8
  | University of Virginia | 27 | 11 | 10
15 | University of California, Los Angeles | 26 | 18 | 14
16 | George Washington University | 24 | 20 | 22
17 | Boston University | 23 | 25 | 19
  | University of Colorado, Boulder | 23 | 15 | 40
19 | Emory University | 22 | 18 | 29
  | University of Illinois, Urbana-Champaign | 22 | 21 | 22
  | Vanderbilt University | 22 | 23 | 18
22 | University of Pennsylvania | 20 | 13 | 11
23 | George Mason University | 19 | 23 | 26
  | University of Arizona | 19 | 25 | 40
  | University of California, Hastings | 19 | 38 | 29
  | University of Minnesota, Twin Cities | 19 | 25 | 21
  | University of San Diego | 19 | 30 | 22
28 | Arizona State University | 18 | 25 | 40
  | Ohio State University | 18 | 33 | 32
  | University of Iowa | 18 | 33 | 19
  | University of Pittsburgh | 18 | 30 | Not in top 40

Other Schools Studied

  | Brooklyn Law School | 17 | 25 | Runner-up for top 40
  | University of Southern California | 17 | 21 | 12
  | Chicago-Kent College of Law | 16 | 33 | 37
  | Washington University, St. Louis | 16 | 30 | 29
  | Cardozo Law School/Yeshiva University | 15 | 33 | 28
  | Fordham University | 15 | 33 | 26
  | University of North Carolina, Chapel Hill | 14 | Not in the top 40 | 32
  | College of William & Mary | 12 | Not in the top 40 | 37

Other Schools on Which Preliminary Data Was Collected
(but which were clearly not going to make the top 30 after that data was in)

  | University of California, Davis | -- | 38 | 32
  | University of Miami | -- | 15 | 40
  | University of Wisconsin, Madison | -- | 38 | 22

TOP 30 BASED ON MEAN PER CAPITA IMPACT

Rank | School | Normalized Score | Raw Mean Citations per Top Quarter of Faculty (number of faculty)
1 | University of Chicago | 100 | 3924 (9)
2 | Yale University | 66 | 2595 (13)
3 | Harvard University | 58 | 2286 (20)
4 | Stanford University | 52 | 2044 (11)
5 | University of California, Berkeley | 43 | 1668 (14)
6 | New York University | 38 | 1508 (19)
7 | Columbia University | 37 | 1473 (15)
8 | Georgetown University | 35 | 1358 (23)
9 | Duke University | 28 | 1093 (12)
10 | University of Texas, Austin | 27 | 1044 (16)
11 | University of Michigan, Ann Arbor | 25 | 970 (14)
12 | Northwestern University | 24 | 940 (12)
13 | Cornell University | 23 | 897 (9)
  | University of Virginia | 23 | 922 (15)
15 | Vanderbilt University | 22 | 855 (11)
16 | University of California, Los Angeles | 21 | 819 (16)
17 | Emory University | 20 | 772 (12)
  | George Washington University | 20 | 756 (18)
19 | Boston University | 19 | 727 (12)
  | University of Illinois, Urbana-Champaign | 19 | 738 (10)
21 | George Mason University | 18 | 700 (9)
  | University of California, Hastings | 18 | 719 (13)
  | University of Pittsburgh | 18 | 700 (10)
24 | University of Arizona | 17 | 680 (8)
  | University of Iowa | 17 | 649 (10)
  | University of Pennsylvania | 17 | 669 (13)
27 | Chicago-Kent College of Law | 16 | 620 (10)
  | University of Colorado, Boulder | 16 | 639 (8)
  | University of Minnesota | 16 | 611 (13)
  | University of San Diego | 16 | 614 (13)

Other Schools Studied

  | University of Southern California | 15 | 590 (10)
  | Arizona State University | 14 | 541 (11)
  | Ohio State University | 14 | 563 (12)
  | Brooklyn Law School | 13 | 528 (12)
  | Cardozo Law School/Yeshiva University | 13 | 526 (11)
  | University of North Carolina, Chapel Hill | 13 | 519 (11)
  | Washington University, St. Louis | 13 | 493 (10)
  | Fordham University | 12 | 473 (16)
  | College of William & Mary | 12 | 467 (11)

TOP 30 BY MEDIAN PER CAPITA IMPACT

Rank | School | Normalized Median Score | Raw Median Score (number of faculty) | High/Low Citation Count
1 | University of Chicago | 100 | 2540 (9) | 9100/1630
2 | Yale University | 87 | 2220 (13) | 4480/1800
3 | Harvard University | 78 | 1985 (20) | 8110/1300
4 | Stanford University | 69 | 1750 (11) | 3100/1220
5 | University of California, Berkeley | 56 | 1425 (14) | 4390/760
6 | New York University | 52 | 1330 (19) | 3010/1000
7 | Columbia University | 48 | 1230 (15) | 3300/900
8 | Georgetown University | 39 | 980 (23) | 4520/530
9 | University of Texas, Austin | 35 | 880 (16) | 2650/560
10 | Cornell University | 34 | 855 (9) | 1770/520
  | Duke University | 34 | 855 (12) | 3760/560
12 | Northwestern University | 32 | 800 (12) | 2450/510
13 | University of California, Los Angeles | 30 | 755 (16) | 1370/570
  | University of Virginia | 30 | 770 (15) | 1900/570
15 | University of Colorado, Boulder | 29 | 740 (8) | 1040/220
  | University of Michigan, Ann Arbor | 29 | 740 (14) | 1910/610
17 | George Washington University | 28 | 705 (18) | 1830/410
18 | Boston University | 26 | 650 (12) | 1300/390
19 | University of Illinois, Urbana-Champaign | 25 | 645 (10) | 1250/370
20 | Emory University | 23 | 585 (12) | 1590/370
  | University of Pennsylvania | 23 | 590 (13) | 1100/340
22 | Arizona State University | 22 | 570 (11) | 1170/180
  | University of San Diego | 22 | 555 (13) | 1330/260
  | Vanderbilt University | 22 | 560 (11) | 2010/410
25 | Ohio State University | 21 | 520 (12) | 1030/220
  | University of Arizona | 21 | 530 (8) | 1520/490
  | University of Minnesota | 21 | 520 (13) | 1220/360
28 | Brooklyn Law School | 20 | 500 (12) | 1130/230
29 | George Mason University | 19 | 470 (9) | 3390/320
  | University of California, Hastings | 19 | 475 (13) | 2950/290
  | University of Iowa | 19 | 470 (10) | 2240/280
  | University of Southern California | 19 | 475 (10) | 1070/400

Other Schools Studied

  | Fordham University | 18 | 465 (16) | 990/290
  | Washington University, St. Louis | 18 | 450 (10) | 880/300
  | Cardozo Law School | 17 | 440 (11) | 840/380
  | University of Pittsburgh | 16 | 450 (10) | 3280/260
  | Chicago-Kent College of Law | 15 | 390 (10) | 1080/300
  | University of North Carolina, Chapel Hill | 15 | 390 (11) | 1150/200
  | College of William & Mary | 11 | 300 (11) | 1650/200

Remember, as noted, that citations are field-sensitive: constitutional law, Critical Race Theory, feminist legal theory, international law, intellectual property, and law and economics, among others, are all high-citation fields; schools strong in those areas will fare better than those whose strengths lie elsewhere. By contrast, tax, wills & estates, property, admiralty, legal philosophy, labor law, and comparative law are much lower-citation fields; schools with substantial strengths in those areas will not, in virtue of those strengths, fare well in a study like this. Citation counts are also seniority-sensitive: it is harder for younger scholars to break into the top quarter of their faculty in citations than for someone who has been publishing for twenty years.

In the earlier version I had listed the top quarter of most-cited faculty; this was helpful in eliciting corrections, but pernicious insofar as those lists were interpreted as identifying the “top faculty” at the respective schools, which they manifestly did not, because of the problems with citation counts noted above. In lieu of listing individual faculty, let me note that every faculty member in the top quarter at Chicago, Yale, Harvard, Stanford, and NYU was cited in more than 1,000 articles (as of July 2005). Here are the numbers of faculty at other schools cited in at least 1,000 articles (as of July 2005): Columbia (13); Georgetown (12); Berkeley (9); Duke (4); Michigan (4); Texas (4); UCLA (4); Virginia (4); Emory (3); George Washington (3); Northwestern (3); Brooklyn (2); BU (2); Chicago-Kent (2); Cornell (2); Hastings (2); Illinois (2); Minnesota (2); Ohio State (2); Penn (2); San Diego (2); Vanderbilt (2); Arizona (1); Arizona State (1); Colorado (1); George Mason (1); Iowa (1); North Carolina (1); Pittsburgh (1); Southern California (1); William & Mary (1).


[1] After the study was completed, we learned that Professor Alschuler will take early retirement from Chicago and move to Northwestern for 2006. This would not affect the results for Chicago, since Professor Alschuler was not in the top quarter, but it would boost Northwestern a notch, tying, or perhaps passing, Cornell (obviously this result might be affected by other changes between now and 2006).
