Typical law review articles not only clarify what the law is, but also examine the history of the current rules, assess the status quo, and present reform proposals. To make theoretical arguments more plausible, legal scholars frequently use examples: they draw on cases, statutes, political debates, and other sources. But legal scholars often pick their examples unsystematically and explore them armed only with the tools of doctrinal analysis. Unsystematically chosen examples can help develop plausible theories, but they rarely suffice to convince readers that these theories are true, especially when plausible alternative explanations exist. This project presents methodological insights from multiple social science disciplines and from history that could strengthen legal scholarship by improving research design, case selection, and case analysis. We describe qualitative techniques rarely found in law review writing, including process tracing, theoretically informed sampling, and most similar case design. We provide examples of best practice and illustrate how each technique can be adapted for legal sources and arguments.

Introduction

For over a century, American legal scholars have participated in the realist project, understanding law not as an autonomous, independent system of rules, akin to geometry, but as the product of heated political, economic, and societal conflicts.1 When interpreting and evaluating the law, American legal scholars rarely limit themselves to doctrinal analysis of legal texts; they draw on diverse historical and contemporary examples to make theoretical claims more plausible. Legal scholars, however, do not usually approach this exercise as an empirical one. Indeed, legal academics often assume empirical techniques are useful only for statistical analyses.

Qualitative empirical methods common across the social sciences are not systematically applied to the study of law.2 This is surprising because qualitative methods are particularly well suited for analyzing the types of evidence, and developing the types of arguments, we typically see in law reviews. Court decisions alone offer unusually extensive and in-depth perspectives on law, on the actions of various stakeholders, and on the societal context in which both operate. Constitutions, statutes, administrative regulations, depositions, and interrogatories are among the many readily available sources on which lawyers draw. Moreover, the events embedded within legal processes that produce these pieces of evidence are interconnected. For example, rules of precedent link cases, making the sequence in which cases are decided very important. Qualitative analysis tools are specifically designed to study these interdependencies and are thus particularly useful for legal scholars. In this respect they differ from statistical techniques, which often require that observations be independent of one another.

Instead of drawing on qualitative techniques, legal scholars depend heavily on doctrinal analysis tools to conduct research. Doctrinal analysis and social science methods often lead scholars to choose and evaluate evidence in conflicting ways. For example, doctrinal tools prompt legal scholars to focus on cases in which the highest national court introduces a significant ruling that breaks from precedent. From a doctrinal analysis standpoint, focusing on such cases makes sense: higher courts can overrule lower courts, and it would be malpractice to ignore major changes in the law. As a result, many books and articles focus on US Supreme Court cases such as Brown v Board of Education of Topeka3 and Roe v Wade.4 For making sound generalizations about law and society, however, an emphasis on pathbreaking cases is often inappropriate, because such cases are idiosyncratic.5

To illustrate our approach, we identify exemplary law review articles that apply qualitative methodologies effectively. Unfortunately, such articles are rare. A simple search in HeinOnline shows that, while over 84 percent of the articles published in the last fifteen years use the word “example,” only 7 percent reference qualitative and quantitative techniques used in the social sciences.6

Indeed, as Table 1 shows, quantitative methodologies are more commonly referenced in law reviews than are qualitative approaches. For example, while 4,284 articles mention random sampling, a common technique in quantitative work, only 281 articles refer to purposive sampling, a common qualitative technique.7 And even when a methodological technique is referenced, it is often applied incorrectly.8
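To make the contrast concrete, the short sketch below (in Python) draws a random sample and a purposive sample from a small, invented set of cases. The case list, the selection criterion, and the sample size are hypothetical; the sketch is meant only to illustrate how the two techniques select cases differently, not to reproduce any particular study’s design.

```python
# Minimal sketch (hypothetical data): contrasting random and purposive sampling
# of a small corpus of cases. The case attributes and the theoretical criterion
# are invented for illustration only.
import random

cases = [
    {"name": "Case A", "court": "state supreme", "overrules_precedent": False},
    {"name": "Case B", "court": "federal appellate", "overrules_precedent": True},
    {"name": "Case C", "court": "federal district", "overrules_precedent": False},
    {"name": "Case D", "court": "state appellate", "overrules_precedent": False},
    {"name": "Case E", "court": "US Supreme Court", "overrules_precedent": True},
    {"name": "Case F", "court": "federal appellate", "overrules_precedent": False},
]

# Random sampling: every case has an equal chance of selection, which supports
# generalizing from the sample to the full population of cases.
random_sample = random.sample(cases, k=3)

# Purposive (theoretically informed) sampling: cases are chosen deliberately
# because they bear on the theory under study -- here, a hypothetical interest
# in decisions that break from precedent outside the highest court.
purposive_sample = [
    c for c in cases
    if c["overrules_precedent"] and c["court"] != "US Supreme Court"
]

print("Random sample:", [c["name"] for c in random_sample])
print("Purposive sample:", [c["name"] for c in purposive_sample])
```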

Such limited use of qualitative methods is surprising because legal scholars are deeply concerned about the problems these methods address. Cherry-picked evidence, for example, troubles legal academics, yet few use the qualitative sampling and case selection techniques designed to guard against it. As Table 1 indicates, legal scholars are even less familiar with qualitative techniques used to test and analyze theories. For example, only 136 articles reference “process tracing,” a common method for testing causal propositions.

Table 1.  Search for Methods Terms in HeinOnline (2000–2015)9

In this Essay, we direct legal scholars to qualitative techniques appropriate for distinct research goals. We draw on Professor Martha Minow’s categorization to identify legal scholarship archetypes.10 A major category of legal projects focuses on doctrine. Some seek to restate doctrine, often by organizing case law and focusing on new developments.11 Others recast doctrine, revealing similarities among seemingly different cases.12 Many doctrinal research projects suffer from selection bias; authors emphasize examples that confirm their typologies and ignore cases that do not fit.13 Sampling methods are particularly helpful for these projects and allow legal scholars to generalize beyond the specific cases they analyze in depth.

Another set of legal projects aims to establish causal connections between the law and political, societal, or economic developments. Some use historical analysis to explain developments in the law and legal institutions; others engage in policy analysis, identifying legal problems and proposing solutions.14 These projects more closely resemble social scientific inquiries. To make strong causal claims, legal scholars must systematically identify and eliminate plausible alternative explanations of the outcome. To do so, we recommend two qualitative techniques. First, careful case selection can help legal scholars identify circumstances in which their theories can be effectively tested. Second, careful within-case analysis helps bolster these conclusions. Within-case analysis requires researchers to derive multiple empirical implications from their preferred explanations; if many of these implications prove true, the researcher’s argument becomes more plausible. We describe a variety of case selection and case analysis techniques in the pages that follow.

Figure 1.  Qualitative Methods Appropriate for Different Claims15

Figure 1 above can help legal scholars locate the most appropriate methods for their projects. Doctrinal analysis tools should suffice for scholars who wish only to describe a few cases in depth. When, however, scholars wish to generalize these descriptive claims to a broader population of cases, sampling techniques are needed. And all causal claims require careful thinking about counterfactuals. In forming counterfactuals, scholars imagine plausible alternative outcomes to the one that occurred, or alternative mechanisms to the one commonly assumed, and identify the factors that produced the observed outcome rather than the alternatives.

In the pages that follow, we start with some thoughts on identifying puzzles. We then turn to sampling and case selection: we detail how scholars can use random and theoretically informed sampling to increase the generalizability of their arguments, and we discuss case selection techniques. We then introduce process tracing, describing the importance of interdependent observations and detailing how to use process tracing effectively when observations are linked temporally and in a path-dependent manner.

  • 1. See William W. Fisher III, Morton J. Horwitz, and Thomas A. Reed, eds, American Legal Realism 232–33 (Oxford 1993).
  • 2. See Table 1.
  • 3. 347 US 483 (1954).
  • 4. 410 US 113 (1973).
  • 5. See generally, for example, Gerald N. Rosenberg, The Hollow Hope: Can Courts Bring About Social Change? (Chicago 2d ed 2008). But see Part II.B.1 for an analysis of The Hollow Hope as an example of most difficult case design.
  • 6. See Table 1.
  • 7. It is important to note that the information in Table 1 has several limitations. First, when we conducted the HeinOnline search, we recorded the number of articles that contained a given keyword. Due to time and resource constraints, we were not able to systematically check the context in which each of these keywords was used. This means, for example, that an article that mentioned “random sampling” could actually have used random sampling as part of its research design, or it could merely have been discussing this sampling method’s use in a cited article. Similarly, articles that mentioned “example” may have used this word in ways other than presenting examples in support of their arguments. Despite these limitations, we believe that these searches provide key insight into the pervasiveness of these different methods in legal scholarship; indeed, the number of articles that either discuss or actually implement these methods in some fashion is indicative of whether these methods are common in the field. As such, the information presented in Table 1 illustrates the motivation for this Essay.
  • 8. For example, we found that many legal scholars who referenced purposive sampling went on to identify respondents who were simply the easiest to access, in effect substituting convenience sampling for purposive sampling. Social scientists argue that, while appropriate for hard-to-reach populations, convenience sampling raises significant concerns about bias. See Krista J. Gile and Mark S. Handcock, Respondent-Driven Sampling: An Assessment of Current Methodology, 40 Sociological Methodology 285, 286, 321–23 (2010).
  • 9. To develop this list, we consulted methods syllabi, textbooks, and colleagues from law, sociology, political science, anthropology, economics, and history. Each row reports the total number of articles using a term or a closely related term stemming from the same root. For example, the total for “process tracing” includes “process trace,” “process-tracing,” and “process tracing.” A search in Westlaw yielded similar results. We also conducted this search for articles published before 2000 to see whether the use of qualitative methods increased over time, but found that growth was at best moderate.
  • 10. See generally Martha Minow, Archetypal Legal Scholarship: A Field Guide, 63 J Legal Educ 65 (2013).
  • 11. See id at 65.
  • 12. See id at 66.
  • 13. Julia H. Littell, Evidence-Based or Biased? The Quality of Published Reviews of Evidence-Based Practices, 30 Children & Youth Serv Rev 1299, 1300 (2008) (describing the potential for confirmation bias in research and reviews).
  • 14. See Minow, 63 J Legal Educ at 66 (cited in note 10).
  • 15. We thank Professor Kevin Quinn for this figure.