This Essay presents a list of the fifty most-cited legal scholars of all time, intending to spotlight individuals who have had an especially notable impact on legal thought and institutions. Because citation counting favors scholars who have had long careers, I supplement the main listing with a ranking of the most-cited younger legal scholars. In addition, I include five specialized lists: most-cited international law scholars, most-cited corporate law scholars, most-cited scholars of critical race theory and feminist jurisprudence, most-cited public law scholars, and most-cited scholars of law and social science. (For those readers who cannot wait to see the actual lists, Tables 1–7 appear below.)

The utility of citation totals as indicators of scholarly quality, or even of scholarly influence, is controversial, but such totals have been shown to correlate positively with informed subjective assessments. The danger in relying on citation counts is that, because they are so convenient, they will be relied upon disproportionately to their actual probative value. Citation statistics are subject to a number of significant biases, and a variety of pitfalls must be avoided in compiling meaningful citation data. I will describe these biases and pitfalls when I explain the derivation and methodology of my study. It is my hope that I have produced tabulations that, despite their inevitable imperfections, can serve as examples of careful analysis. Such examples are sorely needed after the flawed “scholarly impact rankings” proposed by U.S. News & World Report threatened to have a harmful effect on legal education.

TABLE OF CONTENTS

I. Background and Methodology

A. Early Works and the Landes & Posner Critique

Citation analysis has been around for a long time in law. Indexes of cases cited by the cases printed in reporter volumes may be found as far back as 1743, when an English reporter, Raymond’s Reports, contained “A Table of the Names of the Cases” in which “The cases printed in Italic are cited cases.”1 In 1857, Samuel Linn published a full book titled An Analytical Index of Parallel Reference to the Cases Adjudged in the Several Courts of Pennsylvania.2 Sixteen years later, Frank Shepard began to print citations to Illinois Supreme Court cases on gummed paper for subscribers to post into their reports volumes.3 Eventually, Shepard’s Citations expanded to a nationwide system of bound books and supplements listing subsequent citations to judicial decisions and other legal sources.4 A former vice president of the Shepard Company, William Adair, suggested in a 1953 letter to Eugene Garfield that the citator principle of Shepard’s Citations could be used as an indexing technique for scientific literature.5 Garfield pursued the suggestion, creating the Science Citation Index.6

The Science Citation Index proved to be extremely successful and established citation indexing as a basic tool of bibliographic research and the sociology of science.7 One of Garfield’s innovations was to focus attention on papers cited so frequently that they attained the status of what he called “citation classics.”8 In 1985, inspired by the citation-classic concept, I published a study in the California Law Review of the most-cited law review articles.9 In the opening paragraph, I wrote:

Such a project falls somewhere between historiography and parlor game, and I will not claim any more significance for it than is warranted. It is my hope, however, that by listing these articles I will draw attention to writings that, by virtue of their objectively measured impact, deserve to be called classics of legal scholarship.10

The California Law Review study seemed to strike a responsive chord in the legal community. In 1997, the Wall Street Journal printed a front-page story about me, headlined “‘Citology,’ the Study of Footnotes, Sweeps the Law Schools.”11 The word “citology,” describing my trivial pursuits, may even have entered the English language, since the Guardian included it in a “Glossary for the 90’s” later in 1997.12 The occasion for this news coverage was my 1996 update of the most-cited-articles enumeration.13 That update provoked a response from Professor William Landes and Judge Richard Posner.14 “The most questionable feature of Shapiro’s method,” they wrote, “may be the ordering of articles rather than of authors. . . . Ranking articles is not well-suited to the central purpose of analyzing citations to scholarly work, which is to construct a meaningful (not definitive) quantitative measure of a scholar’s influence or reputation.”15 Landes and Judge Posner also noted that “[t]he inclusion of citations to books would yield a different picture of influential scholarship from that sketched by Shapiro’s articles. [Professor Ronald] Dworkin, for example, one of the most influential legal academics of the last half century, does not appear at all on Shapiro’s lists.”16

In my reply to Landes and Judge Posner, I agreed with them that ranking authors and including citations to books were more interesting than ranking articles, but I felt that the research challenge of amassing the data for authors and books was formidably difficult—indeed, well-nigh impossible.17 I left the door open, however, stating that “[i]f there are any wealthy foundations out there willing to fund an extensive research project, I would be happy to consider undertaking such studies.”18 This plea for a sugar daddy was answered when the West Group and the Institute for Scientific Information (the latter was Garfield’s company) stepped forward and provided me with a database of citations to or by legal articles between 1981 and 1997.19 Their database included articles, books, and other publications as cited sources, and it ranked the most-cited authors within its coverage. This supplied me with the building blocks that I needed to determine which scholars to search for high citation counts in other databases.20 My resulting study, “The Most-Cited Legal Scholars,” appeared in the Journal of Legal Studies in 2000.21

B. HeinOnline and Further Citation Scholarship

After publishing the Journal of Legal Studies piece, I felt that it was a onetime production that, because it depended on unique access to data, could never be updated or improved upon. I did not foresee the entry into the legal research landscape of William S. Hein & Co.’s superb product, HeinOnline. HeinOnline includes, among other resources, a nearly comprehensive database of English-language law reviews going back hundreds of years.22 The law review database readily provides the number of times that a given author has been cited in the covered law reviews. (This total is arrived at by adding together the number of citations to each individual article by that author.)

As I studied the powerful capabilities of HeinOnline, I realized that it had great potential for helping me create a list of the most-cited legal scholars of all time. I asked Hein’s president, Shane Marmion, whether Hein could provide me with a ranking of the most-cited authors in the coverage of HeinOnline. He was able, very generously, to give me a ranked list of their two thousand most-cited authors.

The list of the two thousand most-cited HeinOnline authors was not, however, the end of my labors. In fact, it was only the beginning because it did not present a complete picture of citations to legal scholarship. It did not include citations to books. Books, whether scholarly monographs, student texts, or practitioner-oriented treatises, are part of legal scholarship—indeed a very important part. Some of the foremost scholars have published primarily books, as Landes and Judge Posner pointed out with regard to Dworkin. (Other examples include Professors Charles Alan Wright, Wayne LaFave, Catharine MacKinnon, and Grant Gilmore.)

There is no magic wand search that will generate a ranking of most-cited scholars based on citations to articles and citations to books. The best that can be done is to develop a thorough list of people who are likely to have high citation counts and run searches for each one to find citations to his or her publications. I took the HeinOnline ranking of the top two thousand article writers and considered each of these scholars as candidates for my “top fifty of all time” list. The citation total for each one was calculated by adding the HeinOnline citations-to-articles number to the numbers of citations to their books (if they had published books). In order to be consistent with the HeinOnline methodology, I added together the number of citations to each individual book by that author. I devised each book search to precisely capture citations to the book in question. For example, I searched “ely democracy and distrust” to capture citations to Professor John Hart Ely’s book Democracy and Distrust: A Theory of Judicial Review. These book searches were conducted in the full text of law reviews on HeinOnline. Thus, I counted only citations by law review articles.
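The arithmetic behind each scholar’s total can be stated compactly. The sketch below is only an illustration of the tallying rule just described (the HeinOnline citations-to-articles figure plus the law-review citations found for each book); the function name and all figures are hypothetical, not data from the study.

```python
def total_citations(article_citations: int, book_citation_counts: list[int]) -> int:
    """Sum a scholar's HeinOnline citations-to-articles figure with the
    law-review citation count found for each of the scholar's books."""
    return article_citations + sum(book_citation_counts)

# A hypothetical scholar with 9,000 article citations and two books
# cited 2,500 and 1,700 times would total 13,200 citations.
example_total = total_citations(9_000, [2_500, 1_700])
```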

I had to deal with the possibility that there could be some scholars who published often-cited books but did not write enough for law reviews to be in the “Hein 2,000.” Therefore, I did extensive research on important legal books and consulted data that I had from previous studies of highly cited legal authors. Through these methods, I am confident that I did not miss any candidates for inclusion who would have amassed the nearly eight thousand citations necessary to make the “top fifty” list, but in honesty, I cannot absolutely guarantee that this is the case. Similar considerations apply to the other “most-cited” lists below.

C. Other Methodology Decisions

Before I move on to the lists themselves, let me mention some other parameters. I limited my rankings to U.S. scholars. This rule kept Sir William Blackstone off the enumeration; I believe he would have placed third. If a book or article had more than one author, each author received full credit for citations to that publication. The sole exception to this rule was that if there were four or more authors of a book or article, I did not count it for anyone’s citation total unless one person was plainly identified as the primary author. If a scholar was listed as an editor of a book, I excluded citations to that edited volume from the scholar’s total because the citations were probably to a chapter or essay written by someone else.
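The coauthor-credit rules just described can be sketched as follows. The handling of the four-or-more-author case reflects my reading of the stated methodology, and the names are invented for illustration; this is not the study’s actual procedure in code.

```python
def credited_authors(authors, citations, primary=None, edited_volume=False):
    """Return an {author: citations} mapping showing who receives credit
    for one publication under the rules described above."""
    if edited_volume:
        # Citations to an edited volume are probably to someone else's chapter.
        return {}
    if len(authors) >= 4:
        # With four or more authors, only a plainly identified primary
        # author receives credit; otherwise no one does.
        return {primary: citations} if primary is not None else {}
    # Otherwise every coauthor receives full credit.
    return {author: citations for author in authors}

# Two coauthors each receive full credit for a jointly written article.
duo = credited_authors(["A", "B"], 100)
```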

As in my previous study of the most-cited scholars, I did not exclude self-citations or negative citations. Self-citations, through which an author might inflate his or her citation count, are not likely to have much effect on the very large totals that I am dealing with in these lists. Negative citations—citations for the purpose of criticism—likely would not cause an undeserving author to make my rankings given that anyone who is criticized in print thousands of times must be a controversial but important contributor to the scholarly conversation. As I have noted previously, although the purposes underlying particular citations may be various—even capricious—and not all citations merit equal weight, large numbers of citations are strong evidence of scholarly impact.23

One departure that I have made from my first study of the most-cited legal scholars is that I have eliminated the distinction that I made then between scholars who were predominantly authors of practitioner-oriented treatises or student-oriented texts and others. In that study, I put the most-cited treatise and text writers in a separate list. My theory behind the special treatment was that treatises and texts tend to be practical rather than theoretical or creative, their compilation is often heavily reliant on uncredited assistants, and the original author may accrue many citations for editions published long after his or her death. However, I have realized that the practical work of professors like William Prosser or Charles Alan Wright may have more importance for the world than the fancier products of more “ivory-tower” scholars. In addition, the treatise and text authors generally also published scholarly monographs or law review articles. Their names do not stand out as inferior to other often-cited scholars, an insight that Professor Akhil Reed Amar pointed out to me.

My methodology differs from that of the other rankings of most-cited legal authors. Because I went to great efforts to encompass citations to books, I present a much more complete picture of notable scholarship than the rankings produced by Hein. I was also able to provide more targeted data. The excellent, intelligently designed data-collection methods of Professors Brian Leiter and Gregory Sisk use full-text Westlaw searches based on authors’ names and thus cover citations to articles, books, and other publications.24 However, these searches pick up extraneous nonbibliographic mentions such as, hypothetically, “Professor Cass Sunstein served several years in the Obama administration.” They also miss coauthors if the citation has “et al.” in it. Leiter’s and Sisk’s rankings are further limited to short time periods and do not provide the historical “of all time” information that I am interested in.25

But enough on methodology. The actual rankings that I have compiled are in the following Part. The numbers after the names are the total citations to the books and legal articles by that individual.

II. The Scholars and the Schools

A. Most-Cited Legal Scholars

Table 1: Most-Cited Legal Scholars of All Time

1. Richard A. Posner  48,852
2. Cass Sunstein  35,584
3. Ronald Dworkin  20,778
4. Laurence H. Tribe  20,745
5. Richard A. Epstein  16,782
6. Oliver Wendell Holmes, Jr.  15,633
7. William N. Eskridge, Jr.  15,570
8. Mark A. Lemley  15,540
9. Frank H. Easterbrook  14,971
10. William L. Prosser  14,761
11. John Hart Ely  13,255
12. Roscoe Pound  12,446
13. Kenneth Culp Davis  12,287
14. Karl N. Llewellyn  11,814
15. Mark V. Tushnet  11,761
16. Bruce Ackerman  11,619
17. Charles Alan Wright  11,601
18. Akhil Reed Amar  11,375
19. Frederick Schauer  11,222
20. Herbert Wechsler  11,185
21. Erwin Chemerinsky  11,147
22. Daniel A. Farber  11,146
23. John C. Coffee, Jr.  10,731
24. Henry M. Hart, Jr.  10,556
25. Guido Calabresi  10,504
26. Robert H. Bork  10,464
27. Wayne R. LaFave  10,423
28. Daniel R. Fischel  10,359
29. Lon L. Fuller  10,260
30. Richard Delgado  9,925
31. Alexander M. Bickel  9,786
32. Frank I. Michelman  9,155
33. Eric A. Posner  9,101
34. Martin H. Redish  9,083
35. Lawrence Lessig  8,802
36. Lawrence M. Friedman  8,584
37. William M. Landes  8,538
38. Gerald Gunther  8,509
39. Antonin Scalia  8,498
40. Catharine A. MacKinnon  8,270
41. Harry Kalven, Jr.  8,267
42. Grant Gilmore  8,241
43. Felix Frankfurter  8,168
44. Duncan Kennedy  8,113
45. Deborah L. Rhode  7,944
46. Owen M. Fiss  7,890
47. Jonathan R. Macey  7,881
48. Thomas W. Merrill  7,878
49. Louis Henkin  7,736
50. Lucian A. Bebchuk  7,629

Table 2: Most-Cited Younger Legal Scholars (scholars born in 1970 or later)

1. Daniel J. Solove  4,656
2. Orin S. Kerr  3,875
3. Rachel E. Barkow  2,307
4. Brandon L. Garrett  2,263
5. Neal K. Katyal  2,110
6. Peter K. Yu  1,978
7. Oona A. Hathaway  1,798
8. Douglas Kysar  1,747
9. Timothy Wu  1,689
10. Samuel Bagenstos  1,674
11. Rebecca Tushnet  1,624
12. Orly Lobel  1,478
13. Michael B. Abramowicz  1,455
14. Oren Bar-Gill  1,455
15. Catherine M. Sharkey  1,417
16. Abbe R. Gluck  1,415
17. Derek P. Jinks  1,334
18. R. Polk Wagner  1,319
19. Neil Richards  1,298
20. Brannon P. Denning  1,278

Table 3: Most-Cited Corporate Law Scholars of All Time

1. Frank H. Easterbrook  14,971
2. John C. Coffee, Jr.  10,731
3. Daniel R. Fischel  10,359
4. Jonathan R. Macey  7,881
5. Lucian A. Bebchuk  7,629
6. Ronald J. Gilson  6,388
7. Reinier Kraakman  5,760
8. Larry E. Ribstein  5,306
9. Stephen M. Bainbridge  5,204
10. Melvin A. Eisenberg  5,138

Table 4: Most-Cited Critical Race Theory and Feminist Jurisprudence Scholars of All Time

1. Richard Delgado  9,925
2. Catharine A. MacKinnon  8,270
3. Deborah L. Rhode  7,944
4. Judith Resnik  6,722
5. Reva Siegel  6,443
6. Martha L. Minow  6,410
7. Derrick Bell  5,410
8. Carrie Menkel-Meadow  5,220
9. Kevin R. Johnson  4,882
10. Robin L. West  4,450

Table 5: Most-Cited International Law Scholars of All Time

1. Eric A. Posner  9,101
2. Louis Henkin  7,736
3. Myres S. McDougal  6,583
4. Jack L. Goldsmith  6,261
5. Harold Hongju Koh  5,539
6. Curtis Bradley  4,888
7. Abram Chayes  4,418
8. Thomas Franck  4,299
9. Anne-Marie Slaughter  3,584
10. Jordan J. Paust  3,514

Table 6: Most-Cited Law and Social Science Scholars of All Time (excluding economics and history)

1. Lawrence M. Friedman  8,584
2. Marc Galanter  6,836
3. Sanford Levinson  6,395
4. Robert C. Ellickson  5,226
5. Dan M. Kahan  5,052
6. Tom R. Tyler  4,726
7. Edward S. Corwin  4,712
8. Jeffrey J. Rachlinski  3,893
9. Frank B. Cross  3,626
10. Hans Zeisel  3,400

Table 7: Most-Cited Public Law Scholars of All Time (excluding constitutional law; including administrative law, environmental law, criminal law, and legislation)

1. Cass Sunstein  35,584
2. William N. Eskridge, Jr.  15,570
3. Kenneth Culp Davis  12,287
4. Herbert Wechsler  11,185
5. Daniel A. Farber  11,146
6. Wayne R. LaFave  10,423
7. Thomas W. Merrill  7,878
8. Stephen G. Breyer  6,711
9. Louis L. Jaffe  6,427
10. Joseph L. Sax  6,421

Some explanation of the lists above is in order. The “top fifty” list of scholars, based on searches performed in May 2020, is the all-time historical list. There is an inevitable bias toward people active in the last few decades, as the law review literature was smaller in earlier time periods and the opportunities for being cited were correspondingly fewer. It is striking that Justice Oliver Wendell Holmes Jr. ranks sixth despite being disfavored in this way. The placements of Professors Prosser, Roscoe Pound, Karl Llewellyn, Henry Hart Jr., and Justice Felix Frankfurter are also impressive. (Some of the earlier figures who may have failed to qualify because of the aforementioned bias are Justices Louis D. Brandeis, Benjamin N. Cardozo, and Joseph Story; Judge Jerome Frank; Professors Zechariah Chafee Jr., Arthur Corbin, J. Willard Hurst, James Landis, John Henry Wigmore, and Samuel Williston; and Charles Warren.)

Twenty years after my previous study of the fifty most-cited scholars, it is interesting to see who has appreciably scaled the ladder. Sunstein has ascended from fifteenth to second.26 Professor Richard Epstein has gone from twelfth to fifth.27 Professor William Eskridge Jr., who was not even included in the original top fifty, is now seventh.28 These three are protean authors who write prolifically about multiple fields. Professor Mark Lemley, also absent from the first listing, places eighth, evidencing the rising importance of intellectual property law.29

At the other end of the spectrum, there is a strong bias against younger scholars. Many thousands of citations were needed to make the all-time list—such a high bar that it would be virtually impossible for anyone not well into middle age to make the cut. To counter that bias, as well as to draw attention to the contemporary scholarly community, I have compiled the ranking of the twenty most-cited younger scholars.30 I define “younger scholar” as a person born in 1970 or later. It should be noted that, even with the younger list, the requisite citation count for inclusion makes it difficult for anyone now in their thirties to qualify.

The University of Chicago Law Review asked me to supply five specialized most-cited rankings based on areas of law.31 Two of the five fields chosen were straightforward: international law and corporate law. The others require some clarification. Critical race theory and feminist jurisprudence are combined because of the similar questions and themes they address. “Public law” encompasses administrative law, environmental law, criminal law, and legislation, but not constitutional law. “Law and social science” groups law and political science, law and sociology, and law and psychology, but not law and economics or law and history. Appearing on one of the specialized lists does not mean that that category has been the only subject of someone’s scholarship. Indeed, many of these listees have worked in more than one field. My classifications of which scholars belong in which category involved some close calls where I inevitably had to make judgments that others might disagree with.

B. Law Schools Taught at by Most-Cited Scholars

It may be of interest to examine the law schools where the fifty most-cited legal scholars of all time taught. The following are the schools at which the largest numbers of the fifty taught. If someone taught at more than one school, each school is credited.

Table 8: Law Schools Taught at by Most-Cited Scholars of All Time

University of Chicago  15
Harvard University  15
Yale University  13
Columbia University  7
Stanford University  7
University of Minnesota  5
University of California, Berkeley  4
Georgetown University  3
University of Illinois  3
Northwestern University  3
University of Pennsylvania  3
University of Texas  3
University of Virginia  3

The representation of University of Chicago faculty members is extraordinary, including among others the number one and two scholars (Judge Posner and Sunstein), the number five scholar (Epstein), the number nine scholar (Judge Frank Easterbrook), the number thirteen scholar (Professor Kenneth Culp Davis), and the number fourteen scholar (Llewellyn). Judge Richard Posner and Professor Eric Posner form a unique father-and-son “most-cited” team. Harvard, Yale, Columbia, and Stanford are predictably prominent in Table 8. The less famous law schools at the University of Minnesota and the University of Illinois also make strong showings.

C. Law School Degrees by Most-Cited Scholars

In terms of the law schools from which the top fifty scholars graduated (J.D. or LL.B. degree), the following have the most alumni. It appears, at least in this small sample of the highest citation counts, that the training of preeminent scholars is more concentrated in a few schools than is their employment.

Table 9: Law School Degrees by Most-Cited Scholars of All Time

Harvard University  17
Yale University  16
University of Chicago  6
University of California, Berkeley  2

Harvard was the dominant law school of the nineteenth and early twentieth centuries, and this is reflected in the temporal patterns among the top fifty listees. Among listees who earned their law degrees up to 1966, Harvard had twelve graduates and Yale had five; from 1967 on, Yale has had eleven and Harvard five. Many of the leaders of the iconoclastic and interdisciplinary movements of the last hundred years in U.S. law—such as legal realism, law and economics, law and society, critical legal studies, and feminist jurisprudence—were graduates of Yale or the University of Chicago.

D. Limitations on Compiled Data on Most-Cited Scholars

The individual names featured on my rankings, and the general patterns that might be inferred from them, may be skewed in various ways. I have already mentioned biases of chronology, hurting the chances of both earlier scholars and later scholars. Another bias relates to the subject areas about which legal scholars write. To quote my earlier study:

Some topics have a much larger scholarly literature than others. A reasonably prolific commentator on constitutional law will have far more opportunities to be cited than even the most important writer on wills. I call this “the Langbein factor.” John H. Langbein is a major scholar in the areas of trusts and estates, legal history, and comparative law, but none of these subjects is known for having a huge literature in American periodicals. Therefore, although Langbein has amassed over 800 [now over 6,000] citations, an impressive number for the fields in which he publishes, he falls short of any all-time citation rankings.32

Other examples of small-law-review-literature areas of law include family law, international law, labor law, and various subdivisions of business law.

Three topics that loom large on my all-time list are constitutional law, jurisprudence, and law and economics. Constitutional scholars who are included start with Professors Sunstein, Laurence Tribe, Eskridge, Ely, Mark Tushnet, Bruce Ackerman, Amar, Frederick Schauer, Herbert Wechsler, and Erwin Chemerinsky, and continue down. There were additional listees who contributed to constitutional law scholarship without it being their main focus. Jurisprudence is represented by Judge Posner, Dworkin, Justice Holmes, Pound, Llewellyn, and others. Law and economics scholars include Judge Posner, Sunstein, Epstein, Lemley, Judge Easterbrook, and Judge Calabresi, to name only those on the first half of the list.

The tabulation of most-cited younger legal scholars naturally reflects the emphases of recent times. One subject area clearly dominates here: the cluster of technology, intellectual property, and privacy, spilling over into First Amendment law and law and economics. At least nine of the twenty fall into this category, highlighted by the first two, Professors Daniel Solove and Orin Kerr. Another topic that stands out among the research and teaching of the younger scholars is criminal law, a specialization of numbers two through five (Professors Kerr, Rachel Barkow, Brandon Garrett, and Neal Katyal). Constitutional law is also prominent. International law, torts, and law and economics each have more than one adherent.

Another aspect of my rankings that may appear skewed is the representation of women. Only two of the fifty most-cited legal scholars of all time are women, Professors MacKinnon and Deborah Rhode. I attribute the low number of women scholars on that list to the historical scarcity of women in legal academia and the legal profession, prejudice against those women who did participate in law, and sociological factors such as the greater demands on women to juggle work and family obligations. There is, however, evidence of progress to be found in my list of most-cited younger legal scholars. Here, we see that six of the top sixteen are women.

It is highly likely that in the future the percentage of women among most-cited legal scholars will continue to increase. Over 52% of law students are now women.33 The most eye-opening statistic is that, in 2020, every one of the editors-in-chief of the flagship law reviews at the sixteen law schools highest-ranked by U.S. News and World Report was female.34

III. Ranking the Scholarly Impact of Law Schools: A Bridge Too Far

A. The Merits of Citation Counting

I mentioned at the outset of this Essay that citation counting is controversial as a tool for assessing scholarly quality or even scholarly influence. Citations may be made for a variety of purposes. Garfield, with citations in the hard sciences in mind, identified some of these purposes as

providing background reading, identifying methodology, paying homage to pioneers, identifying original publication or other work describing an eponymic concept, identifying original publications in which an idea or concept was discussed, giving credit for related work, substantiating claims, alerts to a forthcoming work, providing leads to poorly disseminated work, authenticating data and classes of fact—physical constants and so on—disclaiming works of others, and disputing priority claims.35

Citations in law have some similar characteristics to scientific citations but also possess distinctive features. Legal writers often cite sources of law such as cases and statutes and regulations, but that is not the kind of citation that I am focusing on. The kind that I am focusing on—citations to legal scholarship—may be made to provide the source of a quotation, to invoke an argument by a previous scholar, to repeat empirical evidence provided by a previous scholar, to point the reader to publications that are helpful for understanding the issues being discussed, to describe the history of an idea or legal development, to give credit to a previous scholar, to criticize a previous scholar, to buttress the credibility of the citing author by invoking a prestigious previous scholar, to impress the reader with the citing author’s erudition and thorough research, or to help colleagues or the citing author’s institution by drawing attention to them. Some of these motivations are intellectual in nature, while some are social or psychological.

If a legal scholar is frequently cited by other legal scholars, this may mean that his or her ideas have been persuasive, or it may mean that he or she is well-connected, or both. “Citedness” is the product of various intellectual and social factors, but it does appear to have a relation to informed subjective judgments of impact. As far back as 1957, Professor Kenneth Clark surveyed psychologists about their estimation of which of their colleagues had contributed most to the discipline.36 Clark compared the evaluations with six indicators of “eminence” and found that the variable showing the highest correlation with the peer ratings was the number of citations to the psychologist’s publications.37 In 1973, Professors Jonathan and Stephen Cole concluded from data in the Science Citation Index that “straight citation counts are highly correlated with virtually every refined measure of quality.”38 The Coles discovered that the citation totals of scientists were correlated with the number of awards garnered.39 Their research and that of others showed a correlation between large totals of citations and Nobel Prizes.40

By 1979, Garfield was able to point to seven major studies linking citedness with “peer judgments, which are widely accepted as a valid way of ranking scientific performance.”41 Six years later, librarian Stephen Bensman compared reputational ratings of university departments with total citation rates for the departments and found a correlation so high (R = .92) that Bensman remarked that “citations and peer ratings appear to be virtually the same measurement.”42
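For readers curious what a correlation like Bensman’s R = .92 measures, here is a minimal sketch of the Pearson coefficient computed between peer ratings and citation totals. The data points are invented for illustration; only the formula is standard.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Invented reputational scores and citation totals for five departments;
# roughly linear data yields an r close to 1.
peer_ratings = [4.8, 4.5, 4.0, 3.2, 2.9]
citation_totals = [9200, 8100, 6500, 3900, 3300]
r = pearson_r(peer_ratings, citation_totals)
```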

Recognition of the value of citation counting, coupled with its cost-effectiveness relative to more expensive procedures like peer review, has led to increased use throughout the academic world of citation measures as aids in evaluating scientists, scholars, journals, departments, schools, and even the intellectual output of entire countries. The popularity of citation methods has been accompanied by criticism of their overuse and of their underlying rationale. A recent review article stated that

the application of citation indicators has [ ] been criticized more generally, with respect to their validity as performance measures and their potentially negative impact upon the research system. . . . Seglen (1998) examined problems attached to citation analyses and concluded that “. . . citation rates are determined by so many technical factors that it is doubtful whether pure scientific quality has any detectible effect at all . . . .”43

The article concluded, though, that “[n]owadays, it is often taken for granted that citations in some way measure scientific impact, one of the constituents of the concept of scientific quality. More attention has been paid to methodological issues such as appropriate methods for normalizing absolute citation counts.”44

Most of my discussion above, like most of the discourse about citation analysis in general, has related to the biomedical and physical sciences. Are citations to legal scholarship fundamentally different from scientific citations, or are the issues similar? I believe that significant differences exist because in science there are strong norms about evidence and the acceptance of theories, and consensus usually develops relatively quickly. In law, on the other hand, evidence and acceptance are heavily politicized, and controversies are not easily resolved. Therefore, the meaning of high citation totals can be unclear and can be viewed very differently depending on what side of the political fence one is on. Conclusions about “quality” are elusive, and “impact” is the most one can hope to measure. There is also perhaps more of a class system in law than in science. Professors at prestigious law schools have easier access to prestigious journals than professors at nonelite schools, and their opinions are more likely to command respect. Gaining acceptance for scholarly legal arguments may be more of a social process than an intellectual one.

My conclusion about the utility of citation totals in legal scholarship is that they should not be regarded as affirmations of the correctness or quality of the scholar’s ideas. I keep coming back to the neutral word “impact.” Citations are indications that a scholar has commanded attention and has produced publications that have been useful to other scholars or provocative enough to inspire criticism. That kind of indication does not justify regarding “most-cited” lists as absolute proof of the listees’ greatness or brilliance. That kind of indication does not justify routinely employing citation counts as assessments of the scholarly merit of individuals or schools. The biases that affect these counts—those biases against younger scholars and against female scholars, for example—highlight the problems inherent in relying on these metrics as a proxy for scholarly value.

B. A Bridge Too Far

Recently, legal academia narrowly escaped exactly what I have just warned against: the routine employment of citation counts as assessments of the scholarly merit of individuals and schools. The U.S. News educational ratings behemoth announced in 2019 that it would issue “scholarly impact rankings” for law schools. U.S. News described this project, which was later cancelled, as follows:

U.S. News & World Report is expanding its Best Law Schools data collection with the goal of creating a new ranking that would evaluate the scholarly impact of law schools across the U.S. The intent is to analyze each law school’s scholarly impact based on a number of accepted indicators that measure its faculty’s productivity and impact using citations, publications and other bibliometric measures. U.S. News is collaborating with William S. Hein & Co. Inc., the world’s largest distributor of legal periodicals, to complete this analysis. To begin the process, U.S. News is asking each law school to provide U.S. News with the names and other details of its fall 2018 full-time tenured and tenure-track faculty. This information will be used to link the names of each individual law school’s faculty to citations and publications that were published in the previous five years and are available in HeinOnline. . . . This includes such measures as mean citations per faculty member, median citations per faculty member and total number of publications. U.S. News will then use those indicators to create a comprehensive scholarly impact ranking of law schools.45

The existing U.S. News law school rankings exert enormous influence on applicants, on employers, and throughout the legal-education ecosystem. Law schools that do not place well in these rankings may find their very existence endangered. That kind of clout is more weight than even a perfect citation-count methodology could reasonably bear. If the methods used are flawed, such ratings are hard to justify. And, although the projected U.S. News methodology was never revealed, it was certain to be flawed.

The principal flaw that we know would have marred the U.S. News data was that it would have ignored citations to scholarship in books. We know this would have been true because HeinOnline, the source of the U.S. News data, omits books in its citation totals. Another flaw was that the U.S. News data would have ignored citations to nonlegal articles because HeinOnline omits nonlegal articles in its citation totals.46 As I have stated, books are a substantial part of legal scholarship. Interdisciplinary work is increasingly important in law schools, so nonlegal articles are also significant in law schools’ scholarship. Therefore, U.S. News would not have created “a comprehensive scholarly impact ranking of law schools.”47 Schools with book-writing or interdisciplinary faculties would have been underestimated.

Another problem with the U.S. News data taken from HeinOnline was inequity based on the subject areas of publications. Because, as described above, some legal fields have fewer opportunities to be cited than others, law schools that emphasize less-cited fields would have been slighted in comparison to other schools. A school with a large tax program, for example, would likely have suffered in rankings based on citations.

Law schools that have younger faculties may be vibrant and innovative precisely because of that fact, but this would probably not have been reflected in citation counts. Older professors who have had decades of writings to be cited would likely have had many more citations to their work and would have strongly influenced the U.S. News compilation. The Society for Empirical Legal Studies has linked this issue to the question of diversity, arguing the following:

Law faculties for many years were mostly closed to women and members of marginalized minority groups. Under a HeinOnline-driven ranking system, law schools would go to great lengths to retain faculty members with long tenures and publication records, even those who have more recently become less productive. This in turn would reduce schools’ ability to hire and tenure junior faculty members, who increasingly hail from more diverse demographic backgrounds. Simply put, using HeinOnline is bound to negatively affect these groups and, therefore, to harm faculty diversity nationwide.48

Aspects of the HeinOnline system that are reasonable in themselves could result in serious distortions when enlisted for law school ratings. To give one alarming example, HeinOnline gives full citation credit to each coauthor of a multiauthor article. Fair enough—I have adopted the same policy for my “most-cited scholar” lists. But in the context of comparing the citation totals of schools, a law school could be incentivized to sign on many coauthors for its faculty’s articles in order to get a lot of “bang for the buck” as the articles feed into the U.S. News rankings via HeinOnline citation counting.

The threat of coauthor mania may seem farfetched, but the already-existing U.S. News ranking formulas have been known to inspire gaming. Some law schools have pursued tortured stratagems to increase median LSAT scores, grade point averages, and other admissions statistics.49 Some law schools have hired their own graduates in order to bolster the percentage of graduates who are employed.50 Other numbers, such as per capita expenditures on students and faculty-student ratios, have sometimes been manipulated.51 Such gaming increases the costs of legal education without any real academic benefit.52 The U.S. News scholarly impact rankings could have been expected to promote similar machinations and to draw resources away from other, less-measurable priorities like teaching and public service.

The prospects for legal citology are not all negative. If rankings of very highly cited legal authors are designed and implemented thoughtfully—with care to avoid the many possible biases, errors, and omissions—they can be meaningful records of scholarly impact. The pains that I have taken in the present study—to include books, to spotlight some subject areas that might be shortchanged in the “top fifty” list, and to create a separate roster of younger scholars—illustrate how one form of citation analysis can navigate away from serious problems. With these kinds of enhancements, “most-cited” lists can shine a light on the people who have been prominent in scholarship.

Can similar efforts succeed in producing a ranking of law schools’ scholarly impact (based on citation totals) that is meaningful and unbiased? I conclude with regret that the answer is “no.” U.S. News and Hein are very well-intentioned and dedicated to providing useful information to people who are consumers of or participants in legal education. Hein has a fabulous database of law review articles and took enormous and sophisticated measures to supply U.S. News with high-quality data. The task of using citation counts to rate schools, however, is fraught with difficulties, and I do not believe the difficulties were realistically soluble.

C. Specific Problems with U.S. News’ Methodology

Let’s look at the four specific problems that I have outlined. I was able to include citations to books in the data for my “most-cited” lists because the universe of scholars who might be candidates for inclusion on the lists was (barely) manageable with regard to the number crunching needed. To find citations to the books of every law professor in the country would involve several orders of magnitude more work. The problem of unequal citation opportunities—created by disparities among the different areas of law—could not be addressed without analyzing the subject areas, computing the relative size of each subject’s law-review literature, discovering the topics of every professor’s scholarship, and applying a normalizing relative-size adjustment to their respective citation totals.
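The relative-size adjustment just described can be sketched in a few lines of code. What follows is a hypothetical illustration only, not the method of U.S. News, Hein, or any actual study: the field names and mean citation rates are invented, and the adjustment simply scales a scholar’s raw count by the ratio of the overall mean citation rate to the mean rate in the scholar’s field.

```python
# Hypothetical sketch of field-size normalization. All figures are
# invented for illustration; a real study would compute each field's
# mean citation rate from its actual law-review literature.

# invented mean citations per article in each field's literature
FIELD_MEAN_CITES = {"constitutional": 40.0, "corporate": 25.0, "tax": 10.0}

# overall mean across the invented fields: (40 + 25 + 10) / 3 = 25.0
OVERALL_MEAN = sum(FIELD_MEAN_CITES.values()) / len(FIELD_MEAN_CITES)

def normalized_cites(raw_cites: int, field: str) -> float:
    """Scale a raw citation count by the field's relative citation density."""
    return raw_cites * OVERALL_MEAN / FIELD_MEAN_CITES[field]

# A tax scholar with 200 raw citations outranks a constitutional scholar
# with 400 once the sparse citation habits of tax literature are credited.
print(normalized_cites(200, "tax"))             # 500.0
print(normalized_cites(400, "constitutional"))  # 250.0
```

The point of the sketch is only that the arithmetic is trivial; the labor lies in the inputs—classifying every professor’s scholarship and measuring every field’s literature.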

The bias against younger scholars could only be combatted by creating cohorts of scholars based on their ages and number of years of teaching and, for every law professor in the country, applying a cohort-adjustment factor to their citation number. To avoid coauthor mania, all coauthored articles and books would have to be identified and the full-credit-for-each-coauthor approach used by Hein would have to be changed to give fractional citation credit to each individual cited.
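The fractional-credit alternative to Hein’s full-credit-for-each-coauthor rule is likewise easy to state: each citation to an n-author work is divided equally among the n authors, so adding coauthors no longer multiplies a school’s citation haul. The sketch below is a hypothetical illustration with invented works and counts, not any ranking system’s actual procedure.

```python
# Hypothetical sketch of fractional coauthor credit: a citation to an
# n-author work contributes 1/n to each author's total, rather than the
# full-credit-per-coauthor rule used by HeinOnline. Works and citation
# counts below are invented for illustration.
from collections import defaultdict

def fractional_credit(works):
    """works: list of (author_list, citation_count) pairs."""
    credit = defaultdict(float)
    for authors, cites in works:
        share = cites / len(authors)  # split each citation 1/n ways
        for author in authors:
            credit[author] += share
    return dict(credit)

works = [
    (["A"], 90),            # solo article: author A keeps full credit
    (["A", "B", "C"], 90),  # three coauthors: 30 apiece
]
print(fractional_credit(works))  # {'A': 120.0, 'B': 30.0, 'C': 30.0}
```

Under full credit, the second article would add 90 to each of three totals (270 in all); under fractional credit it adds 90 in total no matter how many names are on the byline, which removes the incentive for coauthor mania.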

Even if the amount of labor necessary to avoid those four vexing problems were not prohibitively huge, there would still be the issue of whether the citation ranking information was worth all the costs. The visibility of U.S. News ratings would force law schools to distort their core missions in order to do well under a particular definition of “scholarly impact.” There would also be specific expenses, such as outsized salaries paid to citation superstars. Like the Allied objective at Arnhem in 1944, an authoritative ranking built on citation totals is a bridge too far.

After two years of delays, U.S. News wisely decided to pull the plug on its proposed scholarship ranking.53 However, the lessons learned from this controversy are worth remembering in case similar misguided initiatives surface in the future. The costs of rigorously and objectively incorporating faculty scholarly impact into the ranking process are far too high, and schools and students will pay the price for an undisciplined approach. Though I have long been—and will continue to be—a proponent of legal citology, it is a field fraught with methodological quagmires and potential bias; only with the greatest caution should it be considered for incorporation in an already flawed system for ranking law schools. As I have endeavored to show here, however, with careful analysis, we can learn a great deal from citology about the legal academy’s past, present, and future.

  • 1 1 Robert Raymond, Baron Raymond, Reports of Cases Argued and Adjudged in the Courts of King’s Bench and Common Pleas 4 (n.p. 1743).
  • 2 See generally 1 Samuel Linn, An Analytical Index of Parallel Reference to the Cases Adjudged in the Several Courts of Pennsylvania (Philadelphia, Kay & Brother 1857).
  • 3 See Patti J. Ogden, Mastering the Lawless Science of Our Law: A Story of Legal Citation Indexes, 85 Law Libr. J. 1, 27–28 (1993).
  • 4 See generally Shepard’s Citations (1948).
  • 5 See Ogden, supra note 3, at 43.
  • 6 Eugene Garfield, Citation Indexing 7, 16 (1979).
  • 7 See id. at 16–18.
  • 8 Eugene Garfield, Introducing Citation Classics: The Human Side of Scientific Reports, in 3 Essays of an Information Scientist 1, 1–2 (Eugene Garfield ed., 1980).
  • 9 See generally Fred R. Shapiro, The Most-Cited Law Review Articles, 73 Calif. L. Rev. 1540 (1985).
  • 10 Id. at 1540.
  • 11 Paul M. Barrett, ‘Citology,’ the Study of Footnotes, Sweeps the Law Schools—Thank a Yale Librarian Who Got His Start as a Child Interested in Baseball Stats, Wall St. J., Jan. 22, 1997, at A1.
  • 12 David Rowan, Last Word: Glossary for the 90’s, The Guardian, May 10, 1997, at TT82.
  • 13 See generally Fred R. Shapiro, The Most-Cited Law Review Articles Revisited, 71 Chi. Kent L. Rev. 751 (1996).
  • 14 See generally William M. Landes & Richard A. Posner, Heavily Cited Articles in Law, 71 Chi. Kent L. Rev. 825 (1996).
  • 15 Id. at 827.
  • 16 Id. at 826.
  • 17 See Fred R. Shapiro, Response to Landes and Posner, 71 Chi. Kent L. Rev. 841, 841 (1996).
  • 18 Id. at 842.
  • 19 See Fred R. Shapiro, The Most-Cited Legal Scholars, 29 J. Legal Stud. 409, 411 (2000).
  • 20 See id.
  • 21 See generally id.
  • 22 See Joe Gerken, The Invention of HeinOnline, AALL Spectrum, Feb. 2014, at 17, 19–20.
  • 23 See Shapiro, supra note 9, at 1543.
  • 24 See Brian Leiter, Measuring the Academic Distinction of Law Faculties, 29 J. Legal Stud. 451, 455–57 (2000); Gregory Sisk, Valerie Aggerbeck, Debby Hackerson & Mary Wells, Scholarly Impact of Law School Faculties in 2012: Applying Leiter Scores to Rank the Top Third, 9 U. St. Thomas L.J. 838, 850 (2012); Gregory Sisk, Valerie Aggerbeck, Nick Farris, Megan McNevin & Maria Pitner, Scholarly Impact of Law School Faculty in 2015: Updating the Leiter Score Ranking for the Top Third, 12 U. St. Thomas L.J. 100, 117–18 (2015); Gregory Sisk, Nicole Catlin, Katherine Veenis & Nicole Zeman, Scholarly Impact of Law School Faculties in 2018: Updating the Leiter Score Ranking for the Top Third, 15 U. St. Thomas L.J. 95, 95 (2018); Gregory Sisk, Measuring Law Faculty Scholarly Impact by Citations: Reliable and Valid for Collective Faculty Ranking, 60 Jurimetrics 41, 44–45 (2019).
  • 25 See, e.g., Leiter, supra note 24, at 457.
  • 26 See Shapiro, supra note 19, at 424–25.
  • 27 See id.
  • 28 See id.
  • 29 See id.
  • 30 The searches for this table were conducted in July 2020.
  • 31 The searches for these tables were conducted in August 2020.
  • 32 Shapiro, supra note 19, at 413.
  • 33 See Am. Bar Ass’n, Where Do Women Go to Law School? Here Are the 2018 Numbers, ABA for L. Students (Feb. 28, 2019), https://perma.cc/Q9UR-E6VQ.
  • 34 See generally Karen Sloan, It’s a Sweep: Women Take Over Editor-in-Chief Positions at All Top 16 Law Reviews, Including Yale, Conn. L. Trib., Jan. 23, 2020.
  • 35 Dag W. Aksnes, Liv Langfeldt & Paul Wouters, Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories, 9 Sage Open, no. 1, Feb. 2019, at 4 (citing Eugene Garfield, Can Citation Indexing Be Automated?, in 1 Essays of an Information Scientist 84, 85 (Eugene Garfield ed., 1977)).
  • 36 See Kenneth E. Clark, America’s Psychologists: A Survey of a Growing Profession 31–32 (1957).
  • 37 Id. at 51–52.
  • 38 Jonathan R. Cole & Stephen Cole, Social Stratification in Science 35 (1973).
  • 39 See Jonathan R. Cole & Stephen Cole, Scientific Output and Recognition: A Study in the Operation of the Reward System in Science, 32 Am. Soc. Rev. 377, 379, 389–90 (1967).
  • 40 See Jonathan R. Cole & Stephen Cole, Measuring the Quality of Sociological Research: Problems in the Use of the “Science Citation Index”, 6 Am. Sociologist 23, 23–24 (1971); Eugene Garfield, The 250 Most-Cited Primary Authors, 1961–1975, in 3 Garfield, supra note 35, at 326, 337–47.
  • 41 Garfield, supra note 8, at 241, 251. The seven studies are the following: Grace M. Carter, Peer Review, Citations, and Biomedical Research Policy: NIH Grants to Medical School Faculty (1974); Alan E. Bayer & John Folger, Some Correlates of a Citation Measure of Productivity in Science, 39 Socio. Educ. 381 (1966); Charles L. Bernier, William N. Gill & Raymond G. Hunt, Measures of Excellence of Engineering and Science Departments: A Chemical Engineering Example, 9 Chem. Eng’g Educ. 94 (1975); E. Garfield, Citation Indexes for Studying Science, 227 Nature 669 (1970); Joseph P. Martino, Citation Indexing for Research and Development Management, 18 IEEE Transactions Eng’g Mgmt. 146 (1971); Irving H. Sher & Eugene Garfield, New Tools for Improving and Evaluating the Effectiveness of Research, in Research Program Effectiveness 135 (Marshall C. Yovits et al. eds., 1966); Julie A. Virgo, A Statistical Procedure for Evaluating the Importance of Scientific Papers, 47 Libr. Q. 415 (1977).
  • 42 Stephen J. Bensman, Journal Collection Management as a Cumulative Advantage Process, 46 Coll. & Rsch. Librs. 13, 22–23 (1985).
  • 43 Aksnes, supra note 35, at 2 (quoting Per O. Seglen, Citation Rates and Journal Impact Factors Are Not Suitable for Evaluation of Research, 69 Acta Orthopaedica Scandinavica 224, 226 (1998) (omissions in original)).
  • 44 Id.
  • 45 Robert Morse, U.S. News Considers Evaluating Law School Scholarly Impact, U.S. News & World Rep. (Feb. 13, 2019), https://www.usnews.com/education/blogs/college-rankings-blog/articles/2019-02-13/us-news-considers-evaluating-law-school-scholarly-impact.
  • 46 In the present study, I too excluded citations to nonlegal articles written by legal scholars, but I am defining my focus as legal scholarship, whereas U.S. News was purporting to measure the total scholarly activity of law schools.
  • 47 Morse, supra note 45.
  • 48 Open Letter to U.S. News & World Report, Soc’y for Empirical Legal Stud. (Oct. 28, 2019), https://perma.cc/8M2U-TZ55.
  • 49 See Darren Bush & Jessica Peterson, Jukin’ the Stats: The Gaming of Law School Rankings and How to Stop It, 45 Conn. L. Rev. 1235, 1251–53 (2013).
  • 50 See id. at 1253–54.
  • 51 See id. at 1255–57.
  • 52 See U.S. Gov’t Accountability Off., Higher Education: Issues Related to Law School Cost and Access 25 (2009) (explaining that competition among law schools for higher ranking increases cost of attendance because schools offer higher salaries to attract better faculty, hire more faculty to improve their faculty-to-student ratio, and increase expenditures per student generally); see also David Segal, Law School Economics: Ka-Ching!, N.Y. Times (July 16, 2011), https://perma.cc/P7PB-4CP3 (“Part of the US News algorithm is a figure called expenditures per student. . . . The more that law schools charge their students, and the more they spend to educate them, the better they fare in the US News rankings.”).
  • 53 Joshua Fischman & Michael A. Livermore, Rankings Shift Could Force Big Changes at U.S. Law Schools, Bloomberg Law (Aug. 19, 2021), https://perma.cc/4NJY-JUQ9.