Tech platforms serve as private courthouses for disputes about speech, lodging, commerce, elections, and reputation. After receiving allegations of defamatory content in top search results, Google must decide between protecting one person’s public image and another’s profits or speech. Amazon adjudicates disputes between consumers and third-party merchants about defective or counterfeit items. For many small businesses, layoffs and bankruptcy hang in the balance. This Article begins to uncover the processes that these platforms use to resolve disputes and proposes reforms. Other important businesses that intermediate, such as credit card companies ruling on a disputed charge between a merchant and consumer, must by federal law provide timely notice, a reasonable investigation, and other procedural minimums. In contrast, platforms have almost unfettered discretion. Under intense public pressure, Facebook recently began building an independent oversight board that can overrule content moderation decisions. But whether other platforms will follow is unclear, and Facebook’s oversight board has significant limits. If the largest platforms face limited competition while serving as the primary arbiters of disputes in the information age, they warrant mandated procedures as did financial institutions before them. The procedures would aim to improve the administration of justice through public accountability and separation of at least one of platforms’ executive, legislative, and judicial powers.

Introduction

In the fall of 2017, the world’s largest social network put hundreds of women in “Facebook jail,” indefinitely suspending their accounts for posting “men are scum.”1 Such incidents have contributed to a growing realization that internet platforms are the equivalent of the modern public square for speech.2 But the societal impact extends beyond speech. By suspending an account (deplatforming), Facebook, Twitter, and other social networks can sever isolated or vulnerable populations from their support networks.3 Amazon has swiftly destroyed many entrepreneurs’ livelihoods by delisting them.4 Despite being the leading referee of reputation, Google has traditionally declined to edit search results,5 thereby ensuring that one law student’s interviewer saw her through the lens of false accusations that she slept “her way into Yale.”6 The platform ecosystem wields devastating sanctions beyond silencing speech.

Additionally, the scholarly emphasis on governance obscures another institutional dimension: dispute resolution. The widespread posting of “men are scum” originated as one woman’s response to sexist comments.7 Amazon often delists one seller based on another seller’s or a consumer’s accusations, sometimes fabricated for self-serving purposes.8 From the perspective of one of these small merchants, “Amazon is the judge, the jury, and the executioner.”9 Although scholars analogize platforms to “sovereign states,” their focus is not on how these entities handle disputes.10 Since a platform is a site “where interactions are materially and algorithmically intermediated,”11 the inattention to dispute resolution has left one of the information age’s vital organs of power underappreciated.

This Article begins to fill that gap by illuminating the inner workings of what has arguably become society’s most important private judicial system. Drawing on official company policies, unofficial complaint forums, interviews, and other sources,12 it provides case studies of the dispute processes designed by Airbnb, Amazon, Facebook, and Google. It then offers a framework for reforming those internal civil procedures.

This project builds upon and integrates three remarkably distinct strands of scholarship. The most directly in dialogue is that on platform governance—one of the most vibrant, visible, and vast bodies of literature over the past few decades.13 Scholars in this area have emphasized that platforms such as Twitter, Facebook, Google, and Amazon exert quasi-sovereign influence over commerce, speech, elections, and myriad other spheres of activity.14 Those analogies between platforms and governments provide normative foundations for procedural regulation because the Due Process Clause constrains state actors’ rulings.15 The comparisons also implicitly show why platform dispute resolution merits greater attention. The U.S. Constitution divides authority among three branches. To focus only on the state as a whole, with passing references to the judicial branch, would insufficiently explicate how the state governs. If the pervasive analogies between platforms and governments are to be taken seriously, platforms’ judicial role must be taken seriously as well.

Platform procedure also speaks to a second, less visible body of scholarship: alternative dispute resolution (ADR). ADR scholars have begun to use technology to operationalize procedural justice in businesses—especially cross-border marketplaces like Amazon and eBay.16 But they have mostly emphasized how companies can voluntarily adopt informal, nonadjudicatory mechanisms for improving dispute resolution—such as online mediation, which helps the parties work out disputes themselves.17 They have paid less attention to how the law might require firms to improve formal adjudicatory processes.18

The ADR literature sits in tension with the third foundational strand of scholarship: procedural privatization. Both strands share a broadly defined goal of access to justice.19 But whereas ADR proponents embrace alternatives to courts, procedural privatization proponents—mostly from the perspective of civil procedure and contracts—tend to critique the inability to access courts.20 They have filled volumes documenting the problems surrounding a particular type of ADR: mandatory arbitration.21 In contrast to ADR scholars’ emphasis on confidentiality, privatization scholars decry arbitration’s lack of transparency.22 Procedural privatization scholars have also painted binding arbitration as “a clandestine effort to tilt the scales of justice,”23 resulting in “an unconstitutional deprivation of litigants’ property and court access rights.”24 As this Article shows, platform procedures raise related concerns.

An integration of those three literatures is more than academic. Creating a judicial system with predictable procedures was pivotal to establishing a government built on laws: “The constitutional guarantee of liberty implies the existence of an organized society maintaining public order, without which liberty itself would be lost in the excesses of anarchy.”25 Tech platforms have created a judicial system that plays an increasingly centralized role in maintaining public order. In significant contexts where financial platforms serve as gatekeepers for vital participation in society, federal laws regulate dispute resolution.26 For instance, for-profit credit bureaus—whose credit reports determine whether someone can obtain employment, receive a loan, or rent an apartment—must provide timely notice to each party and conduct reasonable investigations.27

It is worth considering analogous rules for large online platforms. A more imaginative legislative agenda would go beyond the current calls for transparency to consider a broader array of procedures, such as user class actions and an independent appeals board. These mandates would provide an accountability structure for platforms’ formidable power to punish.

Part I surveys how tech platforms resolve disputes. Part II provides reference points by examining existing mandates on platform procedure for credit card companies, credit bureaus, and online publishers. Because the literature has devoted the least attention to designing solutions, Part III comprises the bulk of this Article’s discussion. It begins to sketch a system for platform dispute resolution and provides options for specific rules, such as limitations on platforms’ termination of accounts.

Before turning to the main discussion, several points of clarification are in order. This Article’s core question—how to reform private dispute resolution—requires weighing economic, social, and moral factors. There is no rigorous or uniformly embraced equation for determining whether additional expenditures on an extra layer of procedure are worth the added equity or impartiality. Moreover, competition can pressure businesses to advance procedural justice, as I have argued elsewhere.28 The dynamic nature of platform procedure makes intervention more precarious.

These and other analytic constraints are revisited in greater depth in the final Section on objections. They are important and not to be dismissed lightly. The indeterminacy means that readers will, perhaps based on their priors, inevitably come to different, defensible conclusions on the best path forward. At the same time, there is reason to think that markets have failed in the case of the largest online platforms.29 Moreover, it is valuable to recognize that the reservations about legal action are in many ways universal to the challenge of regulating new industries, faced before in oil, banking, transportation, and elsewhere. And the difficulty of knowing the value of an extra layer of procedure surely existed at the founding of the U.S. judicial system as well.

This Article aims to highlight some of those similarities and provide a richer institutional account, which will inform the coming construction of a regulatory architecture for platforms. The focus on dispute resolution rules should not be read to imply that such procedural regulations are sufficient by themselves to solve the diverse issues raised by platforms, including racially biased algorithms, anticompetitive conduct, and disinformation.30 Procedural changes can have ripple effects on substance, but others have tackled those substantive problems and laws more directly. Moreover, I have elsewhere discussed complementary regulation of platforms, such as antitrust breakups,31 regulatory monitoring,32 mandated data access,33 a technology meta-agency,34 and increased liability.35 Dispute resolution rules are only part of a much larger project.

Finally, although I believe that the law should require at least some procedural reforms to address market failures, platforms could in the alternative adopt them voluntarily. The structures and rules below thus offer a menu of options that both private sector designers and public policy makers can use to build a more effective system of platform justice.

I. Punishment by Platform

The roots of platform sanctions lie not in public courts, but in private ordering. Even in rare instances when platforms’ arbitration clauses do not block access to courts, the time and expenses involved make the formal legal system “effectively unavailable to all but wealthy individuals and businesses.”36 Moreover, courts defer to platforms’ internal rules for controlling user conduct.37 Below are case studies of those internal rules at four leading tech platforms, each representing a category: Amazon and other marketplaces join buyers and sellers; Facebook and other social media companies connect users and followers; Airbnb and other sharing economy enterprises pair customers and servicers; and Google and other search engines link information seekers to publishers.

A. Marketplace Platforms: Amazon

Third-party merchants account for more than half of Amazon’s sales.38 Initially, Amazon strove to minimize its involvement in disputes between sellers and consumers. In response to numerous early complaints about offensive book listings, the company announced that “Amazon believes it is censorship not to sell certain books simply because we or others believe their message is objectionable.”39 As another example, when buyers clicked on the link for filing a refund claim or leaving negative feedback, Amazon provided a pop-up notice saying, “You must contact the seller before filing a claim.”40 Now, however, book banning by Amazon has become common.41 And Amazon has developed an extensive and largely automated internal adjudicatory system that handles millions of disputes annually, more than all U.S. federal courts combined.42

The stakes can be high for these adjudications. The company’s main sanctions are product bans and account terminations.43 Many merchants have built their entire operations around Amazon’s promise of providing access to mass markets.44 When merchants suddenly lose access, it can leave them scrambling. Three-quarters of Amazon’s third-party sellers have between one and five employees.45 As one former Amazon employee—whose full-time job is now helping merchants navigate the Amazon appeals process—put it: “If they don’t get their Amazon account back, they might be insolvent, laying off 10, 12, 14 people, maybe more. I’ve had people begging me for help. I’ve had people at their wits’ end. I’ve had people crying.”46 The point here is to illustrate the impact of Amazon’s sanctions, rather than to imply that Amazon is doing anything improper with them.

How are these suspensions and terminations determined? An algorithm typically flags an account for deplatforming, but a human is often involved later in the process, particularly once someone appeals the suspension. Amazon identifies problematic sellers partly through an algorithm that monitors “defective” orders: it flags sellers with a high order defect rate (ODR), and Amazon may deactivate accounts with ODRs above 1%. A buyer makes an order defective by (1) leaving negative feedback, (2) filing a claim with Amazon, or (3) requesting that the credit card company reverse the transaction.47
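
The ODR described above reduces to a simple threshold rule: defective orders divided by total orders in some window, with deactivation possible above 1%. The sketch below is a minimal illustration of that calculation; the class, the defect labels, and the per-window application of the cutoff are assumptions for illustration, not Amazon’s actual implementation.

```python
from dataclasses import dataclass

# Reasons a buyer can render an order "defective," per Amazon's public policy.
DEFECT_REASONS = {"negative_feedback", "a_to_z_claim", "chargeback"}

@dataclass
class Order:
    order_id: str
    defect_reason: str | None = None  # None means a clean order

def order_defect_rate(orders: list[Order]) -> float:
    """Share of orders in the window with at least one defect."""
    if not orders:
        return 0.0
    defective = sum(1 for o in orders if o.defect_reason in DEFECT_REASONS)
    return defective / len(orders)

def flag_for_review(orders: list[Order], threshold: float = 0.01) -> bool:
    """Flag the seller if the ODR exceeds the 1% policy threshold."""
    return order_defect_rate(orders) > threshold

# Example: 2 defective orders out of 100 yield an ODR of 2%, exceeding 1%.
window = [Order(str(i)) for i in range(98)] + [
    Order("98", "negative_feedback"),
    Order("99", "chargeback"),
]
assert flag_for_review(window)  # seller would be flagged for possible deactivation
```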

For negative feedback, ratings of one or two stars out of five will increase the merchant’s ODR. Merchants can immediately challenge feedback with a few clicks on Amazon’s Feedback Manager portal.48 An Amazon bot then takes the first pass at determining whether to remove the feedback. For instance, the bot erases feedback containing profane language—including, in at least one case, the word “damn.”49 The bot also looks for certain words that indicate the review may be valuable, and thus more important to preserve even if a seller complains.50 Merchants can respond to the feedback, so that anyone viewing the post will get both sides of the story.51 However, merchants’ formal options for removal are limited, and the site is plagued by fake reviews.52
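
The account above suggests that the bot’s first pass is keyword driven: profanity triggers removal, while other terms mark a review as valuable enough to preserve. A minimal keyword-triage sketch follows; the word lists and the three-way outcome are invented for illustration and do not reflect Amazon’s actual rules.

```python
# Hypothetical word lists; Amazon's actual lists are not public.
PROFANITY = {"damn", "hell"}  # triggers automatic removal
VALUE_SIGNALS = {"defective", "counterfeit", "broken"}  # marks review as worth keeping

def triage_feedback(text: str) -> str:
    """First-pass decision on a merchant's request to remove feedback."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & PROFANITY:
        return "remove"        # profane feedback is erased outright
    if words & VALUE_SIGNALS:
        return "keep"          # likely valuable to buyers; preserve despite complaint
    return "human_review"      # escalate ambiguous cases

print(triage_feedback("This damn product never arrived"))  # -> remove
print(triage_feedback("Item arrived broken and late"))     # -> keep
```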

A separate dispute resolution process unfolds when buyers request a refund. Under its guarantee program, Amazon provides refunds to buyers if an item does not arrive within three days of the maximum estimated delivery date, the buyer received the wrong item, or the buyer returned the item to the merchant without receiving a refund.53 If the item meets one of these criteria, Amazon deducts the funds from the seller’s account. The seller receives an email detailing the buyer’s grievance and must respond within three days. Based on a review of this information, Amazon decides whether to rule for the buyer or seller. If the platform decides to uphold the refund request, merchants can appeal within thirty days by providing further evidence.54 Although a high ODR is a primary avenue for account suspension, a seller can file a complaint about another seller by clicking a “Report abuse” link.55 Amazon is quick to freeze accounts at the first sign of an issue.56
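
The guarantee criteria and deadlines reported above amount to a small set of mechanical eligibility rules. The sketch below encodes them for illustration only; the parameter names and the date arithmetic are assumptions, not Amazon’s actual logic.

```python
from datetime import date, timedelta

def guarantee_eligible(
    max_estimated_delivery: date,
    today: date,
    arrived: bool,
    wrong_item: bool,
    returned_without_refund: bool,
) -> bool:
    """Apply the three public criteria for the refund guarantee."""
    late = (not arrived) and today > max_estimated_delivery + timedelta(days=3)
    return late or wrong_item or returned_without_refund

# Example: item still missing four days after the latest estimated delivery date.
print(guarantee_eligible(
    max_estimated_delivery=date(2024, 3, 1),
    today=date(2024, 3, 5),
    arrived=False,
    wrong_item=False,
    returned_without_refund=False,
))  # -> True; Amazon would deduct the refund from the seller's account
```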

That readiness to suspend accounts allows sellers to exploit the dispute resolution process to sabotage competitors. Sellers create fake glowing reviews on competitors’ product pages, aiming to trigger Amazon’s automated policing system, which continually monitors for suspicious entries. Even merchants who have recognized the ploy, and attempted to alert Amazon to the presence of such fake positive reviews on their product pages, have still found their accounts suspended.57 In one case, Amazon delisted a small seller because its rival, Snuggle Pet Products, alleged that the seller’s puppy sleep aid infringed on patents.58 The claim was spurious, based on one unenforceable patent from 1895 and another for an unrelated Japanese “combustion device.”59 But the puppy sleep-aid vendor had to go to federal court to make its case, and in the meantime lost considerable sales from the suspension of its best-selling item.60 Amazon responded in 2019 by launching a patent adjudication system relying on patent lawyers as third-party adjudicators.61

Although some terminations are permanent, sellers have the opportunity to appeal. The burden rests on the suspended account holder to satisfy Amazon regarding the alleged behavior by submitting a “plan of action to reinstate selling privileges.”62 To help navigate this process, law firms and consulting practices have sprung up dedicated to Amazon “seller account[ ] reinstatement.”63 For some categories of disputes—such as allegedly fake reviews—critics say that the company’s process rewards sellers who admit guilt and explain how they will rectify the behavior moving forward, like a convicted criminal offering a reentry plan.64 In the words of one former Amazon employee who now represents merchants, “it is a system of guilty until proven innocent.”65

It is difficult to know the truth and representativeness of any particular depiction of an erroneous or unfair outcome. Either way, Amazon’s and other high-volume marketplaces’ dispute systems are worthy of attention due to their magnitude, transparency limits, and ruinous sanctions.66

B. Social Platforms: Facebook

Suspending accounts and taking down content are everyday events in major social networks. In the 2018 “Grab Them by the Ballot” campaign, organizer Dawn Robertson sought to increase voter turnout by posting unedited images of women of all ages who were nude except for, say, a small balloon covering a private part.67 The campaign took off but also received intense criticism and complaints from large numbers of users on Facebook and Instagram.68 Facebook responded by suspending Robertson’s accounts.69 Because Facebook can “engineer” elections and increasingly influences decisions such as whether to vaccinate children,70 there are few more pressing tasks for the country than figuring out how to govern access and misinformation on social networks.

In addition to cutting off vital avenues for sharing information and for speech, account termination may deprive a user of valuable property. One Facebook user’s account was “permanently disabled” after his brother passed away.71 Because he used his Facebook account to save most of the pictures he had of his brother, he lost access to them.72 In another case, a writer in New York slowly built the pieces of her book on Instagram only to have her account suspended indefinitely due to an alleged copyright violation for only a tiny portion of her photos.73

In its early days, Facebook (which today owns Instagram) relied heavily on users flagging questionable behavior to moderate content and suspend accounts.74 Now it leans on a “classifier,” or bot, trained on employees’ removal practices to flag problematic content and accounts.75 Once suspended or terminated, the user faces obstacles to rejoining even with a fake account. Facebook has what it calls “advanced detection systems” that swiftly deactivate a new account linked to a previously suspended party even if opened with a different email, on a different computer, and in a different location.76
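
A “classifier” in this sense follows the standard supervised-learning pattern: past moderator decisions serve as labeled training data, and the resulting model scores new content. The sketch below shows that general pattern using scikit-learn; the training examples and threshold are invented and imply nothing about Facebook’s actual models.

```python
# A generic text classifier trained on (content, moderator_decision) pairs,
# illustrating the supervised-learning pattern described above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data standing in for historical moderator removal decisions.
posts = [
    "have a great day everyone",
    "check out my vacation photos",
    "men are scum",
    "all of you people are scum",
]
removed = [0, 0, 1, 1]  # 1 = moderators removed the post

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, removed)

# New content receives a removal probability from the trained model.
prob = model.predict_proba(["everyone here is scum"])[0][1]
print(f"removal probability: {prob:.2f}")
# A deployment would flag the post for review above some threshold, e.g. 0.5.
```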

Traditionally, Facebook gave, at best, a vague explanation for suspending an account. It might have simply stated the reason as “suspicious activity.”77 However, the company’s approach shifted beginning in 2018, as the social network came under intense public and bipartisan congressional criticism for censorship, election influence, and privacy missteps.78 To improve transparency in the wake of those challenges, the company made the unusual decision to publish its content takedown procedures, which it contended had “long been in place.”79 Facebook committed to publishing any changes in a searchable archive. Further, Facebook said it would notify the poster of any removed content. The poster then has the option of challenging that decision, at which point, within twenty-four hours, the original decision will be reviewed by a human.80 Thus, fourteen years into its existence, Facebook began offering an internal appeals system.81

The company similarly resisted addressing misinformation at first, but under pressure pursued a middle ground. Facebook prefers not to remove content. Instead, it limits the distribution of spurious posts and of all content by accounts repeatedly found to share fake news.82 The company is thus the ultimate “arbiter of truth and falsity in the practical sense that it chokes off distribution of purportedly false content.”83 But it bears emphasis that it makes those determinations as a third party—not only because users help to identify material in need of a closer look, but also because Facebook has partnered with independent fact-checkers, including the Associated Press.84 Those fact-checkers are analogous to court-appointed neutrals allowed by the Federal Rules of Evidence.85

Facebook acknowledges that its “enforcement isn’t perfect.”86 And those subject to the ultimate punishment—expulsion—have often found the processes inadequate. That inadequacy has driven desperate users to seek alternatives. One suspended user went to Facebook’s careers website but, instead of submitting a job application, petitioned for account reinstatement.87 A human resources employee responded to clarify that job applications were inappropriate for such a request—but the employee still reactivated the account.88

Those desperate for a second look at their case now have another option. Originally described by Mark Zuckerberg as a kind of “Supreme Court,” the new Oversight Board has the authority to overrule content moderation decisions by applying the company’s policies and weighing the “public interest.”89 Facebook users can request that the Board review the company’s decisions, putting the Board in a dispute resolution position.90 The Board will also issue public explanations for its rulings and value the precedent set by its prior decisions.91 Because it remains in its infancy and is the first of its kind, the Board’s ultimate contribution to Facebook’s platform procedure is unknown. But it is one of several reforms that have moved Facebook toward procedural justice in the “age of alternative facts.”92

C. Sharing Platforms: Airbnb

Airbnb leverages the threat of account suspension to maintain quality control, stating that it suspends accounts if a host rejects too many reservations, responds too slowly, or receives low ratings.93 The grounds for suspension also include complaints from guests about specific incidents.94

Although most cases are not so clear-cut, Airbnb states that it does not need to justify its suspensions.95 The lack of explanation frustrates hosts, many of whom have shared their stories on a website for grievances by guests and hosts, airbnbHELL.96 Sometimes the company mentions a vague rationale for locking an account, such as “security reasons,” without providing further explanation.97 Because many people rely on Airbnb income to pay their bills, and some purchase homes depending on that income to pay the mortgage, mistakes can lead to missed payments and even foreclosure.98 More so than Facebook’s, Airbnb’s adjudications implicate property interests analogous to those in traditional constitutional due process proceedings.99

Airbnb’s sanctions also include marking the host’s account. For instance, users will see a notification on a listing if the host has previously canceled a reservation within twenty-four hours.100 These procedures punish the host for conduct assumed to have caused a prospective guest discontent, even in the absence of a complaint.101

Guests have also found their accounts suspended. Cadence Lux, an adult performer who used the site to find a safe place to sleep while traveling, had her account closed unexpectedly. Airbnb also barred her from opening an account under her legal name, stating that her identity was “associated with activities that pose a risk to the Airbnb community.”102 Similar interference with people’s ability to travel affordably and safely—if adopted widely in the travel industry—could lessen for certain groups the freedom of movement and equal treatment that many take for granted.

Lux’s termination illustrates a predictive dispute-prevention strategy. An Airbnb patent describes artificially intelligent technology that scans people’s online lives to “determine a trustworthiness score or compatibility score of the person based on the behavior and personality trait metrics using a scoring system.”103 The tool can lower a score if the user has “authored online content with negative language[ ] or has interests that indicate negative personality or behavior traits.”104 Airbnb explains how it uses this technology on its website: “We use predictive analytics and machine learning to instantly evaluate hundreds of signals that help us flag and investigate suspicious activity before it happens.”105 Another way of viewing this technology is as a means of preventing others from having negative experiences—or preventing disputes from ever arising. An individual classified as risky has no recourse or even visibility into the grounds for that determination.
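
The patent language quoted above describes a weighted scoring system over behavioral and trait signals. The sketch below illustrates that general idea; every signal name, weight, and threshold is invented for illustration and reflects neither the patent’s claims nor any deployed Airbnb system.

```python
# Hypothetical trait signals extracted from a person's online presence,
# illustrating the weighted-scoring idea in the patent language above.
HYPOTHETICAL_WEIGHTS = {
    "negative_language": -0.4,
    "verified_identity": +0.3,
    "account_age_years": +0.05,
}

def trustworthiness_score(signals: dict[str, float]) -> float:
    """Combine trait metrics into a single score via a linear weighting."""
    base = 0.5  # neutral starting score
    score = base + sum(
        HYPOTHETICAL_WEIGHTS[name] * value
        for name, value in signals.items()
        if name in HYPOTHETICAL_WEIGHTS
    )
    return max(0.0, min(1.0, score))  # clamp to [0, 1]

# A user flagged for "negative language" with an unverified, new account.
score = trustworthiness_score(
    {"negative_language": 1.0, "verified_identity": 0.0, "account_age_years": 0.2}
)
print(f"score={score:.2f}")  # 0.11, below a hypothetical 0.5 booking threshold
```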

D. Search Platforms: Google

The results of search engines such as Google exert a tremendous influence on people’s reputations and speech visibility.106 That responsibility requires Google to intermediate disputes between seekers, providers, and subjects of information. For instance, fans disgruntled with the final season of Game of Thrones manipulated Google search results so that an image of the two lead writers would appear among the top results when anyone searched for “bad writers.”107 The tactic, known as “Google bombing,” has had numerous high-profile successes, such as yielding President George W. Bush’s official biography as the top listing when anyone entered “miserable failure.”108 Google admits in its official blog that it sometimes intervenes directly to squash these efforts, although it has not disclosed how it reaches those decisions.109

Google bombing illustrates a point of differentiation from Amazon, Facebook, and Airbnb. In many instances, only one of the parties has established a contractual relationship with Google related to the dispute. Amazon and Facebook certainly implicate third parties external to their platforms, through the sale of counterfeit goods or postings about nonusers.110 But the core conflicts for those other platforms directly result from both parties voluntarily participating in the platform. Inclusion in a Google search requires no such consensual participation. Game of Thrones writers cannot exempt themselves from being discussed on web pages, and small businesses have no say in whether Google allows users to rate them on a five-point scale.

One of the fundamental tensions giving rise to search disputes is between website publishers seeking prominence and the subjects of those sites wanting privacy—or at least accuracy. Google’s ruling on what to leave prominent determines commercial success, dating prospects, and hiring decisions.111 Its quasi-judicial role is prominent in Europe because lawmakers have created a “right to be forgotten,” requiring search engines to decide whether each request to delist a web page satisfies the statutory conditions.112 Less well understood from a dispute resolution perspective is that, even in the United States, Google similarly intermediates. Upon request, the company delists explicit content and sensitive information, including financial and medical data.113 It also considers petitions to expunge sites that engage in “exploitative removal practices,” such as requiring people to pay to take down mugshots.114

Google has made requests to take down approved categories of information relatively easy for users. The user clicks through a series of online forms with straightforward, multiple-choice questions—such as: “Have you contacted the site’s website owner?” and “The information I want removed is”—followed by a list of categories of information.115

Although filing requests is relatively seamless, the company has also wholly withdrawn itself from adjudicating large categories of disputes. Google does not delist business review websites, such as RipoffReport.com, even if they engage in exploitative behavior.116 Thus, websites can force a mom-and-pop shop to pay to avoid having its reputation tarnished. The search engine has similarly refused to involve itself in disputes related to its ubiquitous Google business review pages, which allow anyone to rate businesses on a five-star scale.117

As with Amazon reviews, competitors and consumers have weaponized Google reviews.118 Some who are not even customers of the business have demanded payment to refrain from leaving negative feedback, and many small businesses have found their revenues plummet upon the appearance of allegedly fake Google reviews.119 For instance, Gee McCracken built a web-based weight reduction company that grossed over a million dollars in sales annually, but her customers were scared away by what she insists were inauthentic reviews. After contacting Google, she summarized her experience by saying, “I could not get anyone to listen to me.”120 Her company dissolved, along with her life savings.121 Google’s official policy pages reinforce McCracken’s observation by announcing that “Google doesn’t get involved when merchants and customers disagree about facts, since there’s no reliable way to discern who’s right about a particular customer experience.”122

Shielded by the Communications Decency Act,123 the search engine has also long declined requests to delist defamatory statements, hate speech, and misinformation.124 Google thus has attempted to stay neutral in the face of an emerging “post-truth society” in which “what is true matters less than what we want to be true.”125 For instance, following public backlash because the top result “for a search of ‘jew’ was the URL jewwatch.com, a site featuring anti-Semitic content,” Google responded that “it does not ‘remove a page from [its] search results simply because its content is unpopular or because we receive complaints concerning it.’”126

The company has softened that stance somewhat in recent years and now demotes hate speech and related content, but still strives to avoid involvement in conflicts among seekers, subjects, and publishers of information.127 Indeed, even when courts have ordered Google to take down content, the company reviews each court order and declines to comply if it believes the order is “false.”128 The appropriateness of ignoring court orders that Google deems false is subject to debate, but as a descriptive matter Google essentially operates as a higher authority, reviewing de novo the accuracy and desirability of state and federal court defamation rulings.129

Notably, Google’s early content architects described their operations in legal terms. The original takedown policies arose organically but erred heavily on the side of free speech and accessibility. Over time, users and content moderators—sometimes pressured by the public—flagged issues that challenged existing policies, and the team would then escalate to a set of content decision makers.130 For instance, by the late 2000s, the ultimate content determinations for YouTube video takedowns went to a committee of three of the company’s most senior executives.131 That committee would opine on specific cases and rule on general policy—such as whether videos extolling weed should be allowed.132 Although the people have changed and layers have been added, that basic internal appeals structure still exists, beginning with frontline content moderators and escalating up through bosses to—in some extreme cases—the CEO. The rest of the organization “adjudicate[s]” those updated policies aiming for “consistency” and “creating precedent.”133 The process evokes images of the founding of a legal system.

In at least one way, however, Google’s process is very different from the U.S. court system: it provides limited visibility for those punished. Unlike a merchant on Amazon, a business owner listed in Google’s rating pages (or whose website appears in its searches) has no account with Google unless it is an advertiser. But Google has maintained a sharp separation between its search and advertising arms, and for purposes of search, treats those with advertising accounts no differently.134 Thus, unlike users subjected to sanctions by other major platform categories, those demoted in search results will not learn about that development upon signing into their account. A business must search for itself to learn of its demotion.135 Moreover, for a company that suddenly finds itself on page four of the results instead of page one, Google’s silence makes it impossible to know whether it dropped on the merits or as punishment.

In short, Google is more than a neutral provider of search results. Its status as the world’s most important information gatekeeper thrusts it into the middle of disputes. Compared to Amazon, Facebook, and Airbnb, Google more extensively avoids dispute resolution by exempting whole categories of conflict and often cutting accused parties out of the process. However, even when refusing to adjudicate, Google is in a court-like role. After all, some courts can refuse to hear cases. When lower-level employees have subject matter jurisdiction, they apply rules established by chief-level executives and clarified by internal case history. The complainant may wait, hoping for the demotion of degrading content, unaware that moderators already denied the petition. Those whose speech has disappeared may not know that they were even part of a secretive adjudicatory process until after Google renders a verdict.

* * *

Most large platforms have developed intricate and extensive procedures for adjudicating disputes—or for declining to do so. These online systems implicate real-world livelihoods—the ability to work, travel, socialize, speak publicly, and stay reputable. Unlike federal courts’ procedural rules, however, platforms’ rules are influenced by an economic analysis that prioritizes profit. In part for this reason, platforms originally sought to cut costs by letting parties handle disputes themselves—and to the extent possible, continue to prefer that approach today.

From those laissez-faire origins, large online intermediaries have developed organizational tools to act swiftly and decisively, sometimes after only one event. Their sanctions—as with courts’—may be either monetary or injunctive. Marketplace platforms choose between these two sanctions and can immediately debit merchants’ accounts to reflect the outcome of consumer claims. Platforms that do not as routinely process transactions, such as Google and Facebook, have more limited monetary remedies. They and marketplace platforms instead wield the ability to block access to the commercial world or a means of public speech and visibility in the digital age.

There is no guarantee that a human will hear a case. Platforms can adjudicate through algorithmic assessments of current and past behavior—including unrelated behavior from myriad external data points collected by other tech companies, such as social media posts. Akin to an accused criminal receiving notice that a prosecutor has pressed charges, users may suddenly receive initial notifications that they have violated an Amazon, Airbnb, or Facebook rule. In the best-case scenario, the users may then have the opportunity to respond. Often, however, the first communication is more like a trial court judge’s initial ruling. At that point, the only avenue may be an appeal—whether formal or informal—with an assumption of guilt rather than innocence. The expanded privatization of U.S. justice through platforms’ internal dispute systems deserves scrutiny.

II. Existing Platform Procedure

One of the key policy decisions moving forward is the extent to which platform decision-making will remain private. This Part lays the foundations for that inquiry by reviewing the existing federal laws imposing procedural oversight for credit card billing errors, credit report mistakes, and copyright violations. Existing procedural directives are not limited to the contexts discussed in this Part. Numerous other dispute resolution laws exist, requiring insurers, airlines, cable companies, and other businesses to take specific steps in resolving conflicts with their customers.136 Since the case studies below involve platforms, they are more relevant than many other contexts, but are nonetheless merely a sample of a larger universe that can inform the design of new platform procedures.

A. Credit Card Companies as Adjudicators

Like Amazon and Airbnb, many financial institutions operate as platforms in that they facilitate transactions between two independent parties. For instance, when Los Angeles resident Elah Feder found a new apartment, the landlord asked her to transfer the $1,500 deposit through Venmo, a mobile payment app. Thirty minutes later, she received a message: “[P]retty sure you have the wrong person.”137 Feder had accidentally spelled her landlord’s name as Stephen, instead of Steven. However, Stephen refused to return the money unless instructed to do so by Venmo. Venmo would not intervene, despite the written admission from the recipient stating that it was a mistake.138 Mistaken transfers are common on Venmo, but the company adopted a policy of telling the transferor to “send a message through the app requesting the money’s return.”139 Ultimately, getting money back on Venmo depends on a lawsuit or the kindness of strangers who received an unexpected windfall.140

Mistakes are also common with credit card purchases. Through the 1960s, credit card companies often ignored consumers’ protests about merchant billing errors or fraud.141 That response left the consumer simultaneously fending off a credit card company demanding payment and an antagonistic retailer. Now, however, credit card users can fix problems like the one that Feder faced easily and immediately.

The Fair Credit Billing Act of 1974142 mandated that any consumer who uses a credit card be able to challenge an erroneous charge, for reasons including not receiving the goods, receiving goods that did not conform, or being the victim of fraud.143 Because credit card companies have existed for considerably longer than tech platforms, they provide examples of one type of platform’s developed dispute resolution systems. Normally by pressing a button on the credit card’s website and filling out a few online forms, the consumer initiates that process, known as a “chargeback” because the card issuer immediately subtracts the disputed balance from the amount owed by the consumer.144 The process ultimately requires the credit card issuer to rule for one side.

Before ruling against a consumer, the credit card issuer must at least conduct a “reasonable investigation” within ninety days and explain to the consumer the reasons for rejecting the claim.145 Moreover, upon request, the credit card issuer needs to provide documentary evidence for why it rejected the claim.146 Although mandated, the credit card company’s adjudication gets its authority not from any public law, but from the contract, in which the consumer and merchant agree to subject themselves to the chargeback process.147

Courts have interpreted these statutory requirements as imposing minimal burdens on both disputing parties. In Burnstein v. Saks Fifth Avenue,148 the plaintiff submitted a chargeback claim because she believed that Saks had billed her twice for the same jackets and pants.149 She had flagged the duplicate transactions on the phone to Saks Fifth Avenue, but the credit card issuer argued that she had failed to specify the exact dollar amounts in her formal letter to the company. The court rejected that argument: “The utility of the [statutory] dispute resolution scheme would be greatly diminished if a creditor could simply throw up its hands and opt out of the statutory process upon encountering any ambiguity or lack of specificity in a consumer’s claim.”150 The ease of triggering the statutory process increases access to dispute resolution for unsophisticated consumers.

Credit card issuers also receive considerable leeway in how they fulfill their requirements. Courts typically decline to second-guess the substantive outcomes of chargeback investigations—or as the Burnstein court put it, “[t]here is . . . no penalty for ‘wrong guesses’ made in good faith.”151 Federal law “establishes only the procedural framework for dispute resolution, and does not concern itself with the substantive outcome of this process.”152 Following that influential ruling, courts have interpreted the requirement that credit card issuers undertake a “reasonable investigation” as requiring only “a reasonable attempt to investigate.”153

ADR scholars have criticized chargebacks as insufficient because of limited consumer awareness and the lack of opportunity for amicable settlement.154 Some have also taken issue with the remedies, which do not allow for damages beyond a refund.155 There are inevitable abuses in providing consumers with an “undo button,” as demonstrated by one married couple who initiated a chargeback after their wedding because the colors on their wedding cake were “too bright.”156 But the baker ultimately received payment after submitting photos of the couple and guests laughing and eating the cake, along with screenshots of guests raving about the dessert.157 The process thus ensures that both sides have the chance to respond to baseless accusations.

Scholars in fields outside of ADR, especially consumer advocates, view chargebacks more positively. One of the main goals for chargebacks was providing a mechanism for consumer protection.158 Among consumers who use chargebacks, satisfaction is high—they rarely complain about the process or bring suits.159 Part of this satisfaction stems from financial institutions’ strong legal incentive to rule in favor of consumers, which allows them to avoid investigating.160 An estimated 79–90% of consumers are successful upon bringing a chargeback.161

Regardless of whether consumers win, however, chargebacks provide them leverage through a dispute resolution mechanism that is free, accessible, and fast.162 The law prohibits questionable practices, such as responding to the chargeback by submitting or threatening to submit a negative report about the consumer’s credit record.163 And the immediate reversion of funds into the consumer’s account serves as a kind of temporary injunction, preventing a cash-strapped borrower from having to pay a crushing and inaccurate debt—or be subject to collection efforts—until resolution of the matter.164 Chargebacks also give a voice to consumers who otherwise would have no plausible avenue for being heard.165

Nor have those benefits to consumers necessarily come at the expense of merchants. Another principal goal of the system is facilitating commerce by fostering trust. Because consumers feel secure in using credit cards, merchants benefit from increased sales and financial institutions earn revenue from a greater number of transactions.166 Moreover, issuers automate much of the process of resolving chargeback disputes, using artificial intelligence, which lowers costs compared to the earlier prevailing option of disputing a canceled check.167

Thus, although chargebacks may fall short of the relationship-oriented processes embraced by ADR scholars, they are quick and efficient while still allowing both sides to submit evidence. The case of credit cards illustrates how a category of platforms has successfully implemented a private dispute resolution system because of government directive.

B. Credit Bureaus as Adjudicators

The big three credit bureaus—Equifax, Experian, and TransUnion—make decisions about disputes between third parties.168 These three companies provide credit reports, accompanied by a FICO score, for almost every adult in the United States. The reports consist of information mostly from financial institutions, including credit card companies’ details about late payments or maxed-out card limits—either of which would drive someone’s credit score down.169 The bureaus mediate conflicts when a third party reports information that the consumer believes is inaccurate and wants removed from the record. The subjects of their disputes most closely resemble those of search engines: misinformation and reputation.170

The stakes of these disputes are immense. Credit reports inform decisions including whether someone qualifies for loans, credit cards, and bank accounts.171 About half of employers also pull credit reports before hiring someone, and landlords check them before renting to a tenant.172 For these reasons, observers have remarked that someone who has lost their good credit navigates society with a scarlet letter and is “dead to the world.”173 That characterization is sometimes literally true. In one case, a court upheld James McKeown’s lawsuit against Equifax for failing to adequately process his dispute about information on his record provided by the department store Sears.174 Sears had reported McKeown as deceased, which made it difficult for him to obtain a loan.175

In part to ensure that such disputes “function fairly, accurately, and efficiently,”176 Congress passed the Fair Credit Reporting Act of 1970177 (FCRA). Under the act, consumers have the right to inspect their reports.178 Upon request, credit bureaus must disclose “key factors” that may have negatively affected a consumer’s score,179 essentially requiring an explanation of the potential reasons for any credit denial. Upon receiving a consumer complaint, they also must “follow reasonable procedures to assure maximum possible accuracy of the information concerning the individual about whom the report relates.”180 The statute thus sets in motion a compulsory dispute resolution process with the credit bureau as an intermediary between the consumer and the furnisher of credit information, such as a bank reporting the nonpayment of a loan.

Once the consumer has produced information contradicting the credit bureau, courts have interpreted the statutory investigation mandate as necessitating additional verification beyond the original source.181 In Dennis v. BEH-1, LLC,182 Experian’s credit report described a prior debt collection lawsuit brought against Jason Dennis by his landlord as successful.183 That description matched the court register’s erroneous initial description of the case, but the court clerk later correctly filed the final entry as “Dismissal Without Prejudice.”184 Dennis informed Experian that its listing was incorrect, and Experian obtained the final stipulation between Dennis and his landlord through a contractor, who described Experian’s information as accurate.185 The court held that Experian fell “far short” of reasonable diligence because, instead of looking at the file in its possession, it relied on the contractor’s top-level, mistaken assertion.186 When consumers take the unusual step of filing a lawsuit, courts have proven willing to uphold claims of unreasonable procedures.187

The resulting system is far from perfect. A high proportion of consumers’ files—26% in one study—contain material errors that could influence the credit score.188 Furthermore, observers have argued that the procedures required of credit bureaus provide inadequate transparency and fail to impose liability sufficient to discourage bureaus from conducting a rubber-stamp investigation.189

Despite these flaws, the procedural mandates may still be helping if the error rate and procedural injustices would otherwise be even higher. Moreover, the incentives of credit bureaus differ from those of credit card companies and most tech platforms. Most notably, whereas users can leave most online networks,190 they cannot opt out of having credit reports collected about them. As a result, like Google, credit bureaus have weaker incentives than either credit card companies or Amazon, Airbnb, and Facebook to design dispute resolution processes that appeal to consumers.

Notwithstanding these differences, the case of credit bureaus is instructive in weighing analogous mandates for online platforms. The discussion below draws on shortcomings in the credit bureau system to inform the design of online platforms’ dispute procedures.191 Additionally, credit bureaus provide another example of congressional willingness to impose procedures on platforms that play a central role in many spheres of human activity. The FCRA’s legislative history reveals that the act’s drafters intended “to protect an individual from inaccurate or arbitrary information.”192 Online platforms are susceptible to related challenges.193 Congress was particularly concerned that the increasing speed of information transfer—already, in 1970—raised the potential for injustice and significant injuries. Credit bureau regulations demonstrate that an information gatekeeper’s harsh mistakes can drive procedural legislation.

C. Platforms as Copyright Adjudicators

Businesses lose billions of dollars annually because websites share copyrighted materials without payment.194 Congress did not want the fear of copyright violations to have a chilling effect on the internet, so instead of punishing a platform when a third party posts illegal content, the Digital Millennium Copyright Act of 1998195 (DMCA) established a detailed process to resolve disputes between the alleged copyright holder and alleged copyright infringer.196

When a platform, or other online publisher of third-party information, receives a compliant notice claiming that material on its site violates copyright law, to be protected from liability it must remove the material “expeditiously.”197 Disney sent out a barrage of these takedown requests to Etsy, Vulture, and other sites for wildly popular Baby Yoda merchandise and content posted shortly after the beloved character’s initial appearance in The Mandalorian.198 After receiving these takedown notices, the platform is instructed to notify the alleged infringer of the takedown and send any “counter notice” from that party to the alleged copyright holder.199 The online service provider then can place the material back on the internet and still avoid liability if the accuser does not file a lawsuit within ten days.200 The DMCA thus puts online service providers into the role of a private adjudicatory system, coordinating communications between the two parties and ultimately administering a ruling on whether to delete the content and terminate accounts for repeat infringement.201
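
The statutory sequence operates like a small state machine: notice, takedown, counter notice, forwarding, and restoration unless the accuser sues within the ten-day window. The sketch below encodes that sequence for illustration; the class and method names are invented, and the timing simplifies the statute’s business-day rules.

```python
from datetime import date, timedelta

class DMCADispute:
    """Tracks one notice-and-takedown dispute through the statutory sequence."""

    def __init__(self):
        self.content_up = True
        self.counter_notice_date: date | None = None

    def receive_takedown_notice(self):
        # To keep its safe harbor, the platform removes material "expeditiously."
        self.content_up = False

    def receive_counter_notice(self, today: date):
        # The platform forwards the counter notice to the alleged copyright holder.
        self.counter_notice_date = today

    def resolve(self, today: date, lawsuit_filed: bool):
        # If the accuser does not sue within ten days of the counter notice,
        # the platform may restore the material and keep its liability shield.
        if self.counter_notice_date is None:
            return
        window_closed = today >= self.counter_notice_date + timedelta(days=10)
        if window_closed and not lawsuit_filed:
            self.content_up = True

d = DMCADispute()
d.receive_takedown_notice()
d.receive_counter_notice(date(2024, 5, 1))
d.resolve(date(2024, 5, 12), lawsuit_filed=False)
print(d.content_up)  # True: material restored after the window lapsed
```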

The law has had the intended effect of shielding publishers from litigation, as “the vast majority of [takedown] notices likely are never subject to the scrutiny of a court.”202 Instead, the system emphasizes efficiency.203 Most large companies automate the takedown process, creating chaotic “algorithmic law enforcement.”204 Online publishers must respond to large numbers of automated notices by other companies, with Google alone receiving half a billion takedown requests in 2015.205 Companies holding significant copyrights also often err on the side of challenging content, such as one movie studio’s automated system sending a takedown request for a student’s book report about Harry Potter posted online.206

From a dispute resolution standpoint, the copyright regime is flawed.207 The online publisher’s default response is to take down content to avoid liability without, for example, stopping to analyze whether it should instead leave up an original forty-five-second stop-action Lego movie produced by a ten-year-old boy.208 That take-down-by-default safe harbor has encouraged the internet’s growth by lessening the likelihood that online content publishers will be sued for third-party copyright violations. But empirical evidence indicates that large companies abuse the takedown process, often causing the removal of perfectly legal content.209

One of the DMCA’s major shortcomings is the counter-notice provision, which aims to protect content posters by allowing them to respond to the takedown request.210 A large-scale survey found that parties rarely send counter notices, in part because “the typical target of a DMCA complaint has little or no knowledge of copyright law, and little capacity to make informed estimates of the risks attendant on filing a counter notice.”211 The few who do use the counter-notice provision may be copyright pirates—from locations such as Russia and Ukraine—who know the copyright holder will not file a lawsuit in their foreign jurisdiction.212

As with credit card chargebacks,213 the DMCA incentivizes a particular outcome. Specifically, to qualify for safe-harbor liability protection, the platform should resolve the dispute in favor of the last party to comply with the statutory sequence of back-and-forth communications.214 The copyright holder has the last shot because if it files a lawsuit, the platform must keep the material down to benefit from the liability shield.215 The platform could decide to leave the material up if it thought the case was invalid, but it would be taking a risk in doing so.

Although flawed, obligatory copyright dispute resolution expands the sphere of mandated procedures beyond financial institutions. The DMCA demonstrates that large online platforms already must comply with a federally mandated procedural system for at least one type of dispute. The act also indicates how a failure to consider power and information asymmetries—particularly how wealthy firms might co-opt the system—can undermine the ideals of balanced dispute resolution.

The array of examples discussed in this Part normalizes a policy intervention that might otherwise seem strange and extreme: treating a private company like a public entity forced to follow detailed compulsory procedures in resolving customer disputes. The discussion below will draw on these examples in exploring the design and normative foundations for a broader set of federal rules.

III. Enhanced Platform Procedure

Two main questions frame the path forward: First, what are the normative foundations for new mandated online platform procedures? Second, what might such mandates entail? In exploring these questions, this Part relies heavily on the model of the U.S. civil trial as the starting point. Attempting to recreate that flawed institution within private platforms would certainly be a mistake. However, it would also be a mistake to ignore the principles, mechanisms, and lessons offered by a centuries-old experiment in justice. Indeed, arguably as much can be learned from the failures of the civil trial as its successes. Probing the civil trial for ideas is not the same as saying that it provides all the necessary blueprints. Moreover, to work in cyberspace, any judicial blueprints used would need to be transformed rather than copied.

For these reasons, this Part also moves beyond the U.S. civil trial in a number of ways. Most importantly, administrative agency oversight of mandated rules would be preferable to a private enforcement model, as outlined in the section on enforcement. Moreover, the discussion draws on a non-exhaustive set of alternative dispute resolution contexts, including the case studies from Part II and examples from abroad. Nonetheless, the civil trial is the inevitable reference point for U.S. jurists. The structure and procedure of the civil trial thus provides a vehicle for exploring a dispute resolution architecture for the information age that will look very different from its courthouse analogue.

A. Normative Foundations for Mandating Platform Procedure

The discussion so far offers normative foundations by analogy. In response to preliminary evidence of flawed dispute resolution in other important platform contexts, lawmakers have imposed procedural minimums.216 With their opacity, provocation of discontent, and crushing sanctions, Amazon, Facebook, Google, and other online platforms arguably offer insufficient dispute resolution.217 Some might find it persuasive to impose procedural mandates on tech platforms in light of their similarities to financial platforms. Although the policy case could rest on that analogy alone, the mere fact that the government passed such legislation does not make it right.

The decision whether to intervene would benefit from a normative framework. The leading normative foundation for regulation is to address a market failure.218 There is reason to believe some large platforms face insufficient competition, due to factors such as high switching costs and network effects.219 They also produce troubling externalities, such as harm to the reputation or business of someone who may not even be on the platform.220 In such circumstances, existing markets will not provide the level of procedure that more competitive markets would.221 For instance, because of the concentrated nature of the cable industry, customer service was notoriously poor. Congress consequently passed legislation leading to rules that required basic minimums, such as human beings available to answer calls during business hours.222 Procedural mandates can be seen as aiming to correct a market’s failure to provide the competitive level of customer service.

Constitutional law offers another potential lens, through due process.223 To be clear, recent case law suggests that as a matter of constitutional law, due process protections do not apply because platforms are not state actors.224 The doctrine nonetheless supplies a framework for providing minimum procedural safeguards before depriving someone of liberty or property.225 Specifically, the Supreme Court has established three factors to weigh in such instances:

First, the private interest that will be affected by the official action; second, the risk of an erroneous deprivation of such interest through the procedures used, and the probable value, if any, of additional or substitute procedural safeguards; and finally, the Government’s interest, including the function involved and the fiscal and administrative burdens that the additional or substitute procedural requirement would entail.226

In terms of the first factor, commercial platforms serve as gatekeepers to markets, deciding which individuals and companies gain or maintain access.227 To be delisted by Google is to become “invisible to the general public.”228 The Court has elsewhere acknowledged that social networks function as the “modern public square” because they “provide perhaps the most powerful mechanisms available to a private citizen to make his or her voice heard.”229 Platform procedure clearly implicates substantial private interests.

The second factor in the due process framework is the risk of erroneous decisions.230 Given the private nature of platform conflicts, too little information exists about the overall performance of these dispute resolution processes to quantify mistakes. Nor is it clear where the line should be drawn. However, anecdotal, judicial, and empirical information paints a bleak picture of arbitrariness and discrimination.231 Academics have demonstrated that Google systematically shows lower-paying job advertisements to women than to men,232 and Facebook settled lawsuits by the ACLU and others based on similar evidence.233 As one former Amazon employee describes the company’s dispute resolution process, the result is “very inconsistent and hit or miss. You’re at the mercy of a different person each time. And that person’s performance assessment is based on the number of cases completed, not the quality or consistency of the decision.”234 Absent contrary evidence from the platforms, there is a basis for proceeding on the assumption that their error rates are unacceptably high.

The motivation for mandating credit report procedures speaks to the first two due process factors. According to a congressional sponsor of the FCRA, which imposed dispute resolution on credit reporting companies: “We certainly would not tolerate a Government agency depriving a citizen of his livelihood or freedom on the basis of unsubstantiated gossip without an opportunity to present his case. And yet this is entirely possible on the part of a credit reporting agency.”235 In the platform context, Amazon deprives some small business owners of their livelihoods by delisting them without allowing them to present their cases.236 A modern form of “unsubstantiated gossip”—product reviews—often drives these suspensions.237 Processes designed to minimize this unsubstantiated gossip would decrease the likelihood of error.

The final factor, the platform’s interests, infuses the due process analysis with a practical limitation. It would be unrealistic to require a full trial for every account suspension, even if the risk of error would decrease. We must therefore examine the burden of imposing a given procedure. Costs to the platform are not solely monetary. If the law prohibited Amazon from suspending the account of a merchant selling defective products, consumers and the platform could be harmed by the procedural delay—and thus by the imposition of additional procedures. This third factor may limit mandates that unreasonably restrict the platform’s interest in acting expediently against harmful users.

The more straightforward application of this third factor, however, is the cost of administering additional procedures. As a starting point, platforms already have extensive systems in place.238 Depending on the new procedures that would be adopted, the costs could range from minimal to substantial. A rule requiring extensive discovery would be costly. But since most of these processes are already automated, allowing a party to submit information in an online form to be provided to the adjudicator would be low cost.239 Again, the cases of credit agencies, credit card chargebacks, and copyright takedowns speak to the third factor. Financial institutions and online platforms have thrived despite the costs of compulsory procedures for large-volume disputes.240

Additionally, in competitive markets, there is strong evidence that the added trust and legitimacy gained from effective dispute resolution systems improves a company’s profitability due to better customer retention and increased customer engagement.241 Yet even in competitive markets, businesses sometimes focus so excessively on the short term, growth, or their core products that they ignore the importance of dispute resolution.242 The potential monetary gains from improved dispute resolution lessen the costs in the due process analysis. Thus, while the final factor helps determine which procedures to adopt, it does not defeat a proposal for mandating at least some.

Again, a due process analysis is unnecessary for lawmakers to order platforms to change their behavior. As mentioned above, the state has ordered financial and online platforms to play the role of courthouse in other contexts, without relying on a due process justification.243 The necessity of solving an important problem thus typically dictates the policy, rather than an explicit normative framework.244 Nonetheless, competition and the due process analysis provide frameworks for considering proposals for legislation mandating minimum platform procedures and inform the harder question of which specific mandates to adopt.

B. New Structural Checks and Balances

This Section and the next begin to sketch a menu of options for federal rules of platform procedure (“platform rules”). These rules could be required based on the normative case outlined above—to the extent allowed by the First Amendment.245 If so, a size threshold for the platforms subject to any such mandates would need to be set to avoid unduly burdening smaller or emerging platforms before they have the chance to establish themselves. It would also be necessary to decide which of these rules to require at what time, a decision that should respond to the evolving public dispute resolution options available, including administrative agency oversight. As an alternative to mandates, tech executives could use this set of ideas to adopt voluntary dispute resolution.

A threshold issue is how to define success. The due process doctrine provides guidance by emphasizing that the extent of the procedures should grow with the gravity of the potential injustice—as long as the corresponding burden of implementing those procedures is not too high.246 However, due process sets the constitutionally acceptable floor,247 and thus has lower ambitions than would a designer seeking to build a platform dispute system that maximizes either effectiveness or legitimacy. For instance, due process allows an administrative agency to assign an appeal to an adjudicator who is a peer of the original adjudicator working in the same office.248 This arrangement would be unacceptable to a designer wanting a more neutral arbiter to maximize procedural justice, which consists of voice, respect, speed, trustworthiness, and neutrality.249

Instead of using the constitutional floor as the standard, this Article’s procedural references go beyond due process to draw on two projects with more relevant aspirations. The first is the federal effort to build a judicial system. The second is ADR proponents’ efforts to create an entirely online dispute resolution system.250 It would be misguided to impose a private version of the cumbersome Federal Rules of Civil Procedure (“Federal Rules”) on platforms. It would also be a mistake to adopt perfection as the standard for a mandated private system, since every existing judicial system has shortcomings. Ultimately, the goal is to balance efficiency, innovation, and procedural justice to improve the mass dispute resolution process of the information age.

1. Platform common law.

Some, if not all, platforms already value precedent. The founding documents for Facebook’s Oversight Board specify that “any prior board decisions will have precedential value and should be viewed as highly persuasive when the facts, applicable policies, or other factors are substantially similar.”251 That precedential value is limited to the Board’s subsequent decisions, rather than Facebook’s. Nonetheless, it represents a small step toward building a limited sphere of common law for content moderation.

Other platforms would also benefit from moving toward greater consistency in the application of their internal policies. After all, precedent adds predictability and fairness to the law, while lessening the arbitrariness of excessive adjudicatory discretion.252 Since a large platform today can deliver divergent rulings when faced with the same set of facts,253 anchoring verdicts in prior cases would improve the administration of justice.

Of course, some level of transparency would be essential for users to be able to predict likely outcomes and argue their cases based on prior decisions.254 And platforms must not be restrained from improving their substantive decisions simply because of what they have done in the past. But a norm of precedent would not block innovation. Instead, once a substantive policy is established, the idea would be for the platform to apply rules to users consistently.

A more difficult question is how different platforms’ decisions should influence one another. Should Twitter’s resolution of an identical content moderation question have any bearing on how Facebook decides a given case? Clearly, different categories of platforms—such as marketplaces and social networks—will generally require different platform laws. Therefore, they must have some leeway to adopt context-specific decrees and tailored procedures.255 That need for customized substantive rules does not, however, prevent interconnections.

It is currently hard to imagine cross-platform precedent proving desirable. There are dangers in pushing platforms toward homogeneity and virtues in allowing them to compete by trying to solve similar problems in different ways. But if policy makers were to seek greater uniformity, common law courts offer a potential model. Decisions in the same jurisdiction on the same topic carry the greatest weight, but other jurisdictions’ cases on the same topic can be influential. By analogy, platforms, like states, may adopt their own substantive rules. Twitter may then consider the decisions of another platform, such as Facebook, in making its own decision, without having its autonomy infringed. The degree of relevance would be influenced by the similarity of the platform and the particular issue being decided.256

2. Platform appeals boards.

Pushing platforms toward greater internal consistency—and facilitating the use of other companies’ related decisions as persuasive authority—could be valuable. However, a platform could still perpetuate its own unsound decisions. TripAdvisor’s recent missteps illustrate the potential downsides. The company has long been the leading source of online travel information.257 It hosts reviews of hotels, restaurants, and other travel services through a user-generated five-star rating system, accompanied by written reviews.258 But the company takes down reviews without public disclosure or notification to the reviews’ authors.259 In many instances, users posted about being the victims of crimes only to have their reviews removed.260 For example, one woman posted that she had been sexually assaulted by a security guard at a highly rated resort.261 Because TripAdvisor erased the review, the resort remained highly rated, causing subsequent travelers to choose it and meet a similar fate.262

The assaulted users who had their posts removed were not without alternatives. Upon exhausting platforms’ internal dispute processes, many users look to informal avenues. Amazon buyers ask their credit card companies to cancel the purchase,263 and e-mails to Amazon CEO Jeff Bezos have yielded refunds for buyers and reinstatement for sellers.264 Facebook users take to Twitter to complain or leave “profane comments” on CEO Mark Zuckerberg’s Instagram account.265

Although those options sometimes produce results, they have limits. An assault victim should not have to take to social media and reveal a very private and painful event to the world to get a response. Moreover, users with few followers have less social media influence. Appealing to the CEO may go nowhere. Like many platforms, TripAdvisor has perverse incentives to keep the reviews positive, because it depends on advertising revenue.266 No hotel would pay to advertise so that people can learn about recent assaults on its premises.

In the face of such misaligned incentives, and the difficulty in bringing a public lawsuit in such situations, platform rules that impose internal precedent and persuasive authority would be insufficient. An independent appeals process, staffed by judges with sufficient salaries, was crucial for the development of U.S. common law.267 An independent party would also be more likely to overturn a platform’s profitable but misguided precedent.

Congress should thus consider mandating that each large platform fund an external oversight board composed of salaried judges. One model would be for the board to focus solely on users’ appeals of individual cases, more reflective of the U.S. courts of appeals and the Facebook Oversight Board. Another model would allow the body to initiate or consider broader challenges to platforms’ procedures, without any specific case.

To minimize the risks that industry would capture the oversight boards, the platform must not control either the level of funding or selection of judges. In some ways, a company-specific appellate body would simply be a more independent version of what credit card companies and credit bureaus are required to do for dispute resolution—since those processes require firms to pay a group of employees to adjudicate chargebacks.268

An alternative would be industry-specific appeals bodies, funded in proportion to each large platform’s share of industry revenues. A single private board could hear cases for all social media companies (such as Facebook, Twitter, and TikTok), another for all marketplaces, and so on. Under this model, by analogy, the district court would be the platform’s internal adjudicatory process, with the appeals board serving as a circuit court.

Weighing in favor of a circuit model is the similarity of issues across multiple content-sharing networks, such as harassment and misinformation. Additionally, users are increasingly posting material across platforms. Finally, grouping multiple platforms’ appeals within a single court improves economies of scale, adjudicator expertise, and platform law consistency. Weighing against a circuit model is the potential to smother dispute resolution innovation by pushing uniformity.

Under either model, appeals courts could be required to overturn a platform’s decision when it is inconsistent with that platform’s precedent, or when the platform’s precedent is inconsistent with the broader set of cross-platform policies and laws.269 In this regard, even a case-focused appeals process would go beyond Facebook’s Oversight Board, which cannot change Facebook’s policies or future decisions.270 Returning to the example of TripAdvisor, in deciding an appeal by users posting about crimes, the appeals court might look to other platforms’ policies about removing reported crimes, or prior cases on that subject. If the case were one of first impression, the court could look to related cases about removing noncrime information, as well as broader societal norms of advertisement accuracy, informational completeness, and public safety.

The appeals board’s status as a nongovernmental entity would enable it to adopt more streamlined and innovative processes than the public court system. For instance, one option for staffing these courts would be to adopt a proposal, made in the context of content moderation, for the platform to “create a process that relies on a community, either of regional experts or [ ] serious users.”271 Alibaba and eBay have experimented with similar crowdsourced adjudication.272 In its early years, eBay India established a “Community Court” of twenty-one randomly selected users to whom sellers could appeal if they disagreed with buyer feedback.273 A majority vote of the Community Court’s members would remove the challenged feedback. Largely automated processes for crowdsourcing adjudication may be the best chance for reaching scale with an independent appeals process.
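
To make the crowdsourcing mechanic concrete, the following minimal sketch shows how a randomly seated user panel could resolve a feedback challenge by majority vote, loosely modeled on the eBay India Community Court described above. The panel size mirrors eBay India’s twenty-one-member court, but the interfaces and names are illustrative assumptions, not any platform’s actual implementation.

```python
import random

def crowd_adjudicate(challenged_feedback, user_pool, panel_size=21):
    """Resolve a feedback challenge by majority vote of a random user panel.

    `user_pool` is a list of eligible community members, each assumed to
    expose a vote(case) method that returns True to remove the feedback.
    """
    panel = random.sample(user_pool, panel_size)   # randomly seat the panel
    votes_to_remove = sum(1 for juror in panel if juror.vote(challenged_feedback))
    return votes_to_remove > panel_size // 2       # a simple majority removes
```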

Regardless of the appeals board design, public checks are appropriate to avoid the extreme “privatization of process” created by arbitration.274 Public courts or administrative agencies could provide a check by preserving parties’ ability to challenge a given platform court of appeals’ decision. The interface between these public and private entities deserves further attention, but by default only a small percentage would appeal to public courts, given the existing obstacles. That option could be encouraged, or in the interests of promoting efficiency and subject matter expertise, public courts may defer to platform appeals courts, similar to public courts’ deference to administrative agency adjudication.275

The task of building a private appeals structure for billions of disputes is daunting. But tech platforms already handle that high volume annually.276 Moreover, high-volume online appeals systems have proven workable elsewhere. Israeli insurers, for instance, successfully use an online system, Benoam, to handle customers’ appeals of “fender-bender” property claims.277 Benoam is private, but it has implemented some jurisprudential norms by publicly posting (anonymized) major decisions and clarifying rules.278

Facebook’s decision to staff its Oversight Board with at most 40 judges is informative.279 In 2019, Facebook alone “took action on” 20.7 million hate speech posts and disabled more than 6.5 billion fake accounts.280 The small size of its Oversight Board indicates that at least one large-scale platform believes that it can operate with a lean external judicial force.281 Nor is the scale of federal circuit courts particularly vast, with a typical court of appeals consisting of about 14 judges.282

Like the federal system, the platform system would need to emphasize settlement, negotiation, and mediation.283 Users would only be able to externally adjudicate the most significant and novel cases. A sensible rule, already applied in other mandated dispute resolution contexts, would be to require parties to exhaust direct negotiation and internal procedures before initiating an external appeal—except in limited circumstances, such as harassment, in which direct contact among users is problematic.284 Only a small subset would appeal, and the platform appeals board would only take an in-depth review of a fraction of those appeals.

Nonetheless, the few precedential decisions could reverberate throughout the rest of the disputes handled by platforms, ideally aided by algorithms that identify similar cases to which the ruling is relevant.285 As a result, even randomly selecting a portion of the appeals petitions for a hearing could gradually reshape platform procedures. The alternative (or complementary) approach of bypassing individual cases to consider procedural reforms would further enable a small oversight board to have a big impact. Either model offers some promise to improve legitimacy, encourage the development of platform common law, and increase the likelihood of socially beneficial outcomes.
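
One deliberately simplified version of such a similar-case algorithm is sketched below: a bag-of-words similarity ranking that surfaces prior rulings resembling a new dispute. A production system would rely on far richer signals (policy tags, outcomes, platform category); every name and measure here is an illustrative assumption.

```python
import math
from collections import Counter

def _bag_of_words(text):
    return Counter(text.lower().split())

def _cosine(a, b):
    dot = sum(a[term] * b[term] for term in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def similar_precedents(new_case_summary, decided_cases, top_n=5):
    """Rank prior (case_id, summary) rulings by textual similarity to a new case."""
    query = _bag_of_words(new_case_summary)
    scored = [(case_id, _cosine(query, _bag_of_words(summary)))
              for case_id, summary in decided_cases]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]
```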

3. A platform supreme court.

A major design choice is whether there should be a central private platform adjudicator above the appeals boards. One reason to prefer a centralized court is that one case may implicate many different platforms. By way of example, pediatrician Nicole Baldwin produced a playful TikTok dance music video to “Cupid Shuffle” about how vaccines prevent measles, polio, influenza, and other diseases, ending with a punchline of “Vaccines DON’T CAUSE AUTISM.”286 The video went viral across multiple platforms, including Twitter.287 She was subsequently barraged not only with death threats and other attacks on social media, but also with fake Yelp and Google reviews accusing her, among other things, of “drugging vaccine injured autistic boys with transgenic pills.”288 When a single incident requires intervention across multiple platforms, a harassment victim would ideally not need to go to multiple platforms for redress.

Additionally, the functional distinctions across categories are beginning to blur. Google’s search foundations buttress its fast-growing shopping marketplace and its recent piloting of a social network called Shoelace.289 Facebook facilitates many commercial transactions, allows users to link a bank account, and has taken steps to establish a currency.290 Amazon regulates speech by increasingly banning “offensive books”291 and filtering which reviews it will allow on its product review sites—reviews that are sometimes viewed by millions.292 If Baldwin had written a book prior to the vaccination video, reviews on Amazon could have easily been weaponized against her, as has happened to other authors.293 A single incident could therefore implicate similar adjudicatory issues across many large platforms.

Thus, a central appeals court—a platform supreme court—is worth considering at the very least to provide effective remedies. This terminal platform court might hear mostly appeals of the toughest and most novel cases, while having original jurisdiction for pressing cross-platform cases requiring immediate injunctive relief. As with the oversight boards, it also might have the ability to go beyond individual cases to hear more systemic complaints about platform procedure.

There are myriad ways to staff and implement a platform supreme court. The judges could be selected with intersectoral input, possibly allowing users, a regulator, and platforms to each select a subset of judges. Public oversight into the judge appointment process would help, such as an independent administrative agency signing off on the structure and staffing.

The concentration of such great power in the hands of so few is a fraught undertaking. But greater power is in the hands of even fewer today—a handful of CEOs. They currently control the executive, legislative, and judicial functions in platforms offering increasingly crucial services for participation in society. Mandating a nongovernmental appeals system would provide a separation of at least one of those powers.

Many difficult details would remain to be determined, like the standard of review for the private appeals hearings—whether deferential, de novo, or some other standard. It would be necessary to determine how, beyond the salary and the appointment process, to insulate the appeals bodies from undue influence by industry or a self-serving president. The linkage of multiple public and private courts would also require more nuanced connective arrangements.

However, the goal is not complete harmonization. The design must not, for instance, push social media toward a uniform speech ecosystem. Just as the U.S. judicial system allows for heterogeneity in some areas of law, such as contracts and torts, a central platform court could allow for variation in the substantive rules while imposing a procedural floor, such as by requiring notice.

The downsides of a centralized private court structure may outweigh the benefits, especially if the alternative is a workable public process. But as a practical matter, the most desirable platform private court structure would reflect the norm in governance: polycentricism, defined as a system “characterized by multiple governing authorities,” both public and private, in which “each unit . . . exercises considerable independence to make norms and rules within a specific domain.”294 With an independent appeals structure in place, most platform disputes would continue to unfold internally, shaped by localized community norms. Those internal processes would, however, become embedded in a robust external public and private accountability structure.

C. New Platform Federal Rules

With or without an independent appeals structure, mandated internal procedures like those required of financial platforms could improve adjudication. Not all disputes will merit the same legal protections. For instance, it would be hard to justify procedural safeguards that might slow down Facebook’s and Twitter’s terminations of Russian operatives’ election-oriented fake accounts. Legislatures must write, and agencies must administer, platform rules with sensitivity to the diffuse third parties—including U.S. conspiracy-theory groups like QAnon—attempting to weaponize platforms at scale.

Although the project facing platforms is unprecedented, the public court system and administrative agencies have designed mechanisms for resolving large-volume, heterogeneous claims.295 As characterized by the Supreme Court, “[t]he basic purpose of the Federal Rules is to administer justice through fair trials, not through summary dismissals.”296 Eighty-six Federal Rules exist, many with intricate subsections. It would be impractical to enumerate all of the possible corresponding rules of platform procedure here, and many would be unwise to adopt in their current form. Instead, the discussion below highlights several examples to illustrate the potential for transforming the Federal Rules into principles of platform procedural justice.

1. Standing and equal access to human adjudicators.

Access to justice is fundamental to democracy. As platforms plant themselves at the center of public discourse, access to their justice becomes integral to democracy. Two components are particularly important: a procedurally level playing field and standing for nonusers.

One of the necessary principles in platform procedure, and indeed a driving force behind many of the Federal Rules, is equal access.297 Public support for legal aid services, as well as procedural reforms such as class actions, aims to expand access.298 However, even with these and other mechanisms, reliance on courts still means “the ‘haves’ come out ahead.”299 Initiating a lawsuit typically requires a lawyer, or at least the ability to pay court fees and navigate a labyrinth of procedural and substantive rules.300

Left unregulated, platform justice risks exacerbating that considerable access inequality. Firms generally prioritize higher-profit customers in resolving disputes, as demonstrated by a Bank of America patent for software allowing it to gauge whether to waive a fee depending, in part, on the amount of money a customer’s family has in their bank accounts.301 Similarly, credit reporting agencies provide VIP treatment to complaints about inaccurate records from a “judge, senator, congressman, government official, attorney, paralegal, professional athlete, actor, director, member of the media or a celebrity.”302 Like other businesses, some platforms prioritize their most valuable users, and can relegate the least valuable to justice by algorithm.303

Equal access is also relevant to external parties harmed by platforms. Social networks and search engines can, for instance, determine election results or tarnish a nonuser’s reputation. Online marketplaces can sell counterfeit goods, thereby harming the original producers even if they do not sell online.304

Those external parties may have little if any influence because platforms prioritize their own users’ complaints. For instance, many publishers and authors have found counterfeits of their books sold on Amazon, which accounts for over half of all books sold in the United States.305 In one case, Amazon sold thousands of counterfeit copies of The Sanford Guide to Antimicrobial Therapy, which provides formulations for drugs used to combat pneumonia and other infections.306 Besides the considerable lost revenues for the author and publisher, the copies posed a health risk: the low-quality copies’ formulas obscured minor print distinctions like that between a “7” and a “1” in the dosage.307 The publisher wanted to remain independent of Amazon, but the pirated books became so pervasive that its only viable solution was to allow Amazon to become its wholesaler—thereby giving the platform an incentive to police the counterfeits.308

To address these access barriers, the dispute resolution system should be easy to navigate, free for individuals and small businesses, and afford all litigants comparable procedures. (Credit card companies, similarly, are required to offer chargebacks free of charge.) Comparable procedures would include the ability to have a human adjudicator at some point in the process for sufficiently important or nuanced cases.309 Something akin to the appointment of neutral experts or special masters, as federal rules allow in courts,310 could help address imbalances in users’ ability to navigate platform procedures. A navigable process, with straightforward explanations, would broaden access regardless of party sophistication. Equal access would also mean prohibiting favoritism based on status as a social media influencer or number of followers. Furthermore, nonusers harmed by platforms should have standing in these private dispute processes—or at least before the external appeals boards over them. Particularly when platforms have monopoly power, people should not have to join that platform to stop the harm.

2. Timeliness and transparency.

Two of the most fundamental dispute resolution characteristics are speed and transparency. When eBay analyzed why most buyers and sellers continued using the platform after entering its dispute resolution process, the surprising answer was not whether they won or lost.311 Instead, it was having the conflict resolved in a timely manner.312 Those findings have encouraged ADR scholars focusing on online dispute resolution to embrace reliance “on the intelligence and capabilities of machines.”313 Mandated dispute resolution systems for credit reports, credit card chargebacks, and online copyright takedowns impose time constraints, such as ninety days to investigate and correct a credit card billing error.314

Given the value parties place on timeliness, large platforms that fail to act with the speed characteristic of competitive markets should likewise face imposed time limits. Time limits are particularly relevant to suspended accounts and content moderation. But such limits would need to be adjustable based on the procedural complexity of the case and the appeals level. If a case affects many users—for example, because it is a class action or a precedent-setting higher court ruling—it could call for a longer timetable.
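
As a rough illustration of how adjustable time limits might be codified, the sketch below scales a baseline deadline by appeals level and case breadth. The ninety-day entry echoes the credit card billing-error rule noted above; the remaining numbers and names are hypothetical.

```python
from datetime import date, timedelta

# Baseline deadlines in days; the 90-day entry echoes the credit card
# billing-error rule, while the others are purely illustrative.
BASELINE_DAYS = {
    "content_removal": 7,
    "account_suspension": 14,
    "billing_error": 90,
}

def resolution_deadline(dispute_type, filed_on, appeal_level=0, class_wide=False):
    """Compute a resolution deadline that grows with procedural complexity."""
    days = BASELINE_DAYS[dispute_type]
    days *= 1 + appeal_level   # each appeals tier receives more time
    if class_wide:
        days *= 2              # aggregated or precedent-setting cases
    return filed_on + timedelta(days=days)

# Example: a first-level appeal of a suspension filed March 1 is due 28 days later.
deadline = resolution_deadline("account_suspension", date(2021, 3, 1), appeal_level=1)
```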

In terms of transparency, an irony of platform justice is that the companies epitomizing the information age often provide almost no information upon adjudicating user disagreements or suspending privileges. The general opacity of algorithms and platforms is a common source of concern among scholars.315 Transparency in dispute resolution, however, is not explored in those discussions.316

To be clear, platforms communicate many substantive rules for what constitutes a violation. Amazon, Apple, and Facebook provide thousands of pages on their expectations for community standards and app developer conduct.317 However, those rules sometimes omit key details. There is no list of books banned by Amazon.318 Nor does Airbnb publish all of the reasons why accounts may be suspended.319 More importantly, the reasoning for a decision remains largely secretive. Sellers shuttered by Amazon may never know what accusations were made against them or whether those accusations were falsely leveled by another merchant seeking a competitive edge.320 Nor do users know how or why the platform suspended their account.321 The platform has access to considerable information, but the parties often do not.

The drafters of the Federal Rules prioritized information exchange.322 At least in the most important contexts, platforms should be required to provide what credit agencies must: an inspection of the case file when someone alleges a mistake.323 Automation can significantly lower the costs of this information exchange through online forms, such as those used for credit card chargebacks.324
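
A minimal sketch of such a structured intake record follows: because the record is machine-readable from the outset, producing the file a disputant may inspect becomes a cheap serialization step rather than a manual review. All field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DisputeRecord:
    """Hypothetical structured intake record for a platform dispute."""
    dispute_id: str
    accused_party: str
    alleged_violation: str                       # cites the substantive platform rule
    evidence_ids: List[str] = field(default_factory=list)
    decision: str = "pending"
    reasoning: str = ""                          # disclosed to the parties on request

    def disclosure(self) -> dict:
        """Return the case file a party is entitled to inspect."""
        return {
            "violation": self.alleged_violation,
            "evidence": self.evidence_ids,
            "decision": self.decision,
            "reasoning": self.reasoning,
        }
```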

Transparency is not a cure-all, and those with power can abuse it.325 Information-sharing requirements would need to be relaxed in some contexts—particularly in cases involving harassment. And the visibility must reckon with inevitable resistance to revealing trade secrets.326 But transparency plays a crucial role in both individual disputes and at a systems level. Besides due process rationales, requiring the publication of some decisions—at least in an anonymized, summarized format—ensures that future parties benefit from past cases at that same platform.327 Parties can use prior rulings to plead their case to Amazon decision makers who lack the time or motivation to review past cases. Publicly available rulings thereby contribute to closer scrutiny of legal principles that might otherwise remain stagnant.328 Transparency thus would improve not only the quality of individual adjudications within a platform, but also the development of a platform common law.

3. User class actions.

Many small harms may not be worth individuals’ time even if collectively they amount to societally harmful transfers of rights or resources from individuals to platforms.329 To address related problems, the Federal Rules provide for class actions.330 As I have argued elsewhere, administrative agencies provide efficiency and other advantages over courts for enforcing procedural laws against companies.331 Nonetheless, private rights of action can complement administrative oversight, and it is possible that platform processes could improve significantly upon the civil justice system’s aggregation shortcomings.

ADR scholars have begun to lay the foundations for leveraging automation to make aggregate online dispute resolution more feasible for private companies, with lower costs as a leading motivator.332 When a user systematically harms others—or when the platform does so itself—an outlet for initiating collective grievances deserves consideration. Rather than a class action, it would be a user action.

There are many possible ways to aggregate claims among users, and space constraints do not allow for exploring them all or addressing their many critiques. Briefly, aggregation is another area where a special master or neutral party as the users’ representative could further procedural justice. Some of the biggest obstacles in courts have been locating all class members, collecting relevant information, and processing distinctions among mass consumers.333 Because platforms would have such information readily available through their extensive monitoring, communications, and analytic tools, they could automate aggregation in ways not possible through traditional court actions.334 User actions could even relax the requirement in the Federal Rules that members be similarly situated, if the platform’s artificial intelligence can create ways to both group and tailor claim adjudication.335
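
As an illustration of how such automated aggregation might work, the sketch below groups individual grievances into candidate “user actions” by shared policy and harm type. The keys and threshold are assumptions for exposition, not a description of any platform’s actual systems.

```python
from collections import defaultdict

def aggregate_user_actions(complaints, min_class_size=50):
    """Group individual grievances into candidate collective 'user actions'.

    `complaints` is an iterable of dicts with "user_id", "policy", and
    "harm_type" keys, standing in for the richer signals platforms already
    log. Groups smaller than `min_class_size` proceed as individual disputes.
    """
    groups = defaultdict(list)
    for complaint in complaints:
        key = (complaint["policy"], complaint["harm_type"])
        groups[key].append(complaint["user_id"])
    return {key: members for key, members in groups.items()
            if len(members) >= min_class_size}
```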

4. Injunctions and bans.

Facebook, Amazon, and Airbnb are quick to suspend or terminate accounts at the first sign of an issue, sometimes with severe consequences for small businesses, property ownership, and participation in democracy.336 The Supreme Court has applied the Due Process Clause to analogous contexts. In Fuentes v. Shevin,337 a Florida resident challenged a state law that allowed the retailer Firestone—without any hearing—to enlist the sheriff to seize her gas stove purchased on credit.338 The Court held that the “possessory interest in the goods, dearly bought and protected by contract, was sufficient to invoke the protection of the Due Process Clause.”339 Even when acting on behalf of a business, the state could seize property without a hearing only under “truly unusual” circumstances.340

To address platforms’ analogous account deprivations prior to hearings, platform rules could establish boundaries for suspending a legitimate personal or small business account until the dispute is resolved. The more essential the platform service, the more procedural protections are relevant before cutting off access. Whatever the boundaries, it is imperative that platforms have some ability to constrain problematic users.341 Additionally, platforms must have the flexibility to act quickly in unusual circumstances. For instance, when harassment, hate speech, or other abusive behavior is involved, the immediate blocking of the accused from interacting with the accuser and deletion of posts makes sense. Similarly, when a product sold on Amazon threatens consumer safety, an immediate suspension of the product is justified.

However, many categories of harm will not present as compelling a counterinterest. When one merchant points out that another small business has suspiciously glowing reviews on its Amazon site, it is excessive to delist the products without first investigating and giving the accused a chance to explain.342 The writer who built her book material on Instagram should not permanently lose all access to her account due to allegations by one copyright holder.343 When the interest on one side is preventing assault at a resort, blocking posts without at least a rapid follow-up investigation is also inappropriate.

Despite the delicate balancing act and context-specific nature of these inquiries, parameters are possible. For instance, the federal rules could consider the relative power dynamics of the groups—large businesses versus consumers, or harassers versus harassed. Injunction-related procedures imposed on credit card companies clearly favor what is typically the less powerful party, the consumer, by blocking collections on a disputed debt until the matter is resolved.344

The rules should also disfavor extreme punishment when lesser sanctions exist. When removal of content or individual products would suffice, such as when there are no repeated offenses, the platform should face procedural hurdles in suspending accounts. And the procedural bar for permanent bans should be higher. Public adjudicators mostly impose permanent bans only for extreme conduct, such as fraud and Ponzi schemes depriving investors of millions.345 The overarching procedural ideal is to minimize punishment inflicted before it is clear that a wrong has occurred.
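
The principle that harsher sanctions should demand more process can be stated almost mechanically. The sketch below encodes a hypothetical sanctions ladder in which each escalation unlocks only after additional procedural steps are completed; the specific steps and names are illustrative assumptions.

```python
# A hypothetical sanctions ladder: each harsher remedy requires that
# additional procedural steps have been completed first.
SANCTIONS_LADDER = [
    ("remove_content",  {"notice"}),
    ("suspend_account", {"notice", "investigation", "chance_to_respond"}),
    ("permanent_ban",   {"notice", "investigation", "chance_to_respond",
                         "independent_review", "repeat_or_extreme_conduct"}),
]

def permissible_sanctions(completed_steps):
    """Return the sanctions available given the process completed so far."""
    done = set(completed_steps)
    return [sanction for sanction, required in SANCTIONS_LADDER
            if required <= done]

# Example: with only notice and an investigation completed, a suspension
# and a permanent ban remain off the table.
print(permissible_sanctions({"notice", "investigation"}))  # ['remove_content']
```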

5. Reputational accuracy and completeness.

A leading legislative sponsor of the FCRA explained the legislation by observing, “The loss of one’s good name is beyond price and makes one poor indeed.”346 Yet Amazon delists sellers based on ratings;347 Airbnb uses third-party data, such as from social media, to block guests before they have done anything wrong, reminiscent of the preemptive crime fighting in the dystopian future portrayed in The Minority Report;348 and Google allows victims’ reputations to be tarnished by refusing to demote clearly false websites even in the face of complaints.349 Platforms’ philosophies regarding reputation sit in tension with federal law in other areas. Most notably, the Federal Rules of Evidence block hearsay testimony and withhold information about a defendant’s prior convictions from jurors.350

Platforms’ nonchalance regarding reputation also allows for inaccuracies—the animating issue behind regulation of credit reports.351 But accuracy is not the only crucial goal. Information mistakenly omitted can deprive someone of a job or loan.352 In Haro v. Shilo Inn,353 a banquet services company decided to promote Robert Haro to manager, and ran a background check before doing so.354 The background check correctly reported that Haro had a charge of failing to register as a sex offender dismissed.355 That information implied that Haro may have had his case dismissed after properly registering as a sex offender. Instead of promoting Haro, the banquet company terminated him.356 However, the court had dismissed the charge of nonregistry because it was a case of mistaken identity—Haro had never been accused of the original crime, only erroneously thought to need to register after someone else was convicted.357 A full report would have clarified that vital missing detail.

The interests in reputational accuracy and completeness reflect a procedural justice emphasis on trustworthiness of the process.358 But they present a delicate balancing act, particularly when combined with other procedural interests such as caution in issuing temporary injunctions. TripAdvisor’s erasure of guest reviews about being assaulted underscores the need to limit the platform’s ability to self-servingly provide incomplete reputational profiles.359 But a wrongly accused party also may suffer if unsubstantiated information persists.

There will be hard cases and no set of rules will solve all problems. It is tempting to respond by leaving reputational issues, and indeed misinformation more broadly, out of any platform rules. Omission would be preferable to letting the issue of reputation derail the larger project. Again, however, it would be a mistake to allow the inevitable complexity and imperfection of new federal rules to perpetuate an even more problematic set of existing private rules. At a minimum, the above proposals for allowing parties to inspect internal adjudications should extend to being able to learn how reputation factored into platform punishment above a certain threshold—such as the termination of an account.360 Similarly, imposing a reasonable investigation requirement on accuracy and completeness of reputational profiles could help simply by prompting the platform to take greater care. Consumers should also have some means to challenge Google or Facebook when they allow extreme speech harms, such as denials of the Holocaust, or fabrications of a student’s promiscuity, to persist at the top of search results.361

Scholars have recognized that the Federal Rules of Evidence’s allowance for court-appointed experts offers a solution to judges’ struggles in the “world of alternative facts.”362 An analogous means of accessing an independent fact-checker could help to address related problems in platforms. Given the discriminatory nature of online reputational ratings, restrictions on the use of such information—and heightened accountability for inaccuracy—is warranted.363 One of the chief targets for restrictions should be the big data predictive analytics that platforms such as Airbnb deploy to block access before the individual has done anything wrong.364

The difficulty of line drawing highlights the challenges facing platforms as the default procedural rule writers. Public laws could bring third parties, whether public courts or private appeals processes, into these thorny decisions. One pragmatic path forward would be to begin with specific rules for the clearest issues, such as transparency, notification, and independent appeals, alongside default rules and standards that platform common law can shape for harder issues. Ultimately, the most powerful intervention of platform rules may simply be to provide a writ of habeas corpus for the information age by allowing an outside entity to check the currently unfettered power that platforms wield.

D. Enforcement of Rules

For mandated procedures to work, they must be enforced. Enforcement could come through two avenues: courts and administrative agencies. Of the two, the administrative agency is the more important.

Private rights of action would allow individuals to bring lawsuits for procedural violations. Such a lawsuit would be distinct from a right to pursue a public court appeal of the substantive decision made by a private platform or appeals board.365 Consumers have private rights of action to sue credit card companies and credit reporting agencies for failing to comply with statutory procedural requirements.366 These provisions allow the consumer to recover not only attorney’s fees and actual damages, but also in some cases punitive damages for willful procedural violations.367

Two design features would improve the effectiveness of a private right of action. First, parties must be able to challenge systemic failures, rather than solely individual cases. Courts could reward the complaining party with punitive damages for identifying systemic issues affecting a larger group of people.368 Such lawsuits would enable large numbers of users to bring a common procedural action even if their substantive grievances differ greatly.

Second, private rights of action are only worthwhile if the punishment is substantial enough to deter. The FCRA has proven inadequate to prevent inaccuracies in credit scores in part because the consequences are minimal if the credit agency is merely negligent.369 The company must only pay for damages incurred—such as a higher interest rate on a loan—and attorney’s fees.370 Not only are those actual damages arduous to prove, but they are capped at $1,000.371 Few consumers will sue, and the credit agency therefore faces minimal and unlikely damages.372 And it is exceedingly rare to prove willful violations giving rise to substantial punitive damages.373 Thus, credit report procedures are structured to under-deter: a credit bureau will not pay significantly for flawed resolution of inaccuracies, but will benefit from saving the costs that more accurate verification would require. Unsurprisingly, credit reports have remained “riddled with inaccuracies” even after the procedural mandates.374 To increase the likelihood of compliance, procedural rights of action should enable damages commensurate with platform size for negligent as well as willful violations.

The judicial system has an important role to play in public oversight, but it has limits. The time and energy required to exhaust a complex private process is already great, making it unlikely that parties would pursue the next, more resource-intensive step of appealing to a public courthouse. Moreover, even in the private platform appeals system, whole categories of disputes will never surface. As with credit reports, it “may prove practically impossible for consumers, when dealing with big-data scoring systems that potentially integrate thousands of variables, to verify the accuracy of their scores and reports or to challenge decisions based on alternative models.”375 Public courts alone will provide suboptimal accountability if they rely solely on individuals to initiate cases.

Administrative agencies can address those shortcomings by conducting regulatory audits of platforms’ dispute resolution systems.376 The most likely agency in the existing regulatory framework is the FTC, which has a broad cross-industry mandate. For audits to work, the platform must be required to keep records of the entire process—thereby creating an “audit trail.”377 Similar record-keeping requirements are imposed in other areas of dispute resolution. For example, federal law instructs airlines to retain passenger complaints for government audit of procedural compliance with consumer protection laws.378 In one incident, after reviewing complaints, the Department of Transportation fined Delta Air Lines $750,000 for bumping passengers from flights without first seeking volunteers and offering adequate compensation—in other words, for inadequate adjudicatory processes.379

The regulator’s role would be to occasionally sample the platform’s dispute records and analyze aggregate complaint statistics, such as categories of complaints, rationales, and remedies deployed. This information would help identify grounds for regulatory prosecution of the platform for rule violations. Also, the regulator should have rulemaking authority to adapt compulsory procedures to fast-moving industries. Monitoring is the norm for most large industries—from banking to food manufacturing to pharmaceuticals.380 Chargebacks’ overall success likely benefits from the Consumer Financial Protection Bureau’s close monitoring (examination) of credit card companies, including for effective chargeback systems.381 Regulatory monitoring of platform federal rules would thus be consistent with the overall modern framework for promoting compliance, as well as the existing governance of financial platform dispute resolution.
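
A stylized sketch of such an audit follows: the regulator samples the platform’s audit trail and computes aggregate compliance statistics, flagging late resolutions and missing notices. The record fields and thresholds are assumptions for illustration, not any agency’s actual methodology.

```python
import random
from collections import Counter

def audit_sample(dispute_records, sample_size=500, deadline_days=30):
    """Sample an audit trail and compute aggregate compliance statistics.

    Each record is assumed to be a dict with "category", "days_to_resolve",
    and "notice_given" keys. Returns summary figures a regulator could use
    to identify grounds for prosecuting systemic rule violations.
    """
    sample = random.sample(dispute_records, min(sample_size, len(dispute_records)))
    return {
        "complaints_by_category": Counter(r["category"] for r in sample),
        "late_rate": sum(r["days_to_resolve"] > deadline_days for r in sample) / len(sample),
        "missing_notice_rate": sum(not r["notice_given"] for r in sample) / len(sample),
    }
```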

E. Objections

In addition to the localized counterpoints addressed throughout this Article, several broader objections merit consideration. One objection views skeptically the willingness to leave dispute resolution mostly in private hands (and algorithms). If reforms are needed, why not instead improve the public court system? For cases involving speech, keeping the core decisions in private hands avoids concerns about authoritarian state censorship—and possibly allows some oversight that satisfies the First Amendment.382 For other types of conflicts, a public alternative could be part of the solution, but would require a massive and unlikely overhaul of the judicial system. Federal courts are already overburdened, managing over 1.5 million cases annually.383 Handling even a hundred million platform disputes annually—a fraction of the total—would require massive increases in resources and technology deployment to the public judicial system or to administrative law judges.384

Large companies whose core competency is technological sophistication are better situated to repurpose algorithms and big data to sift through large numbers of cases at low cost. As operators of the platforms, they also have the ability to automate the collection of information—and to police problematic behavior—more efficiently.385 Unless and until unprecedented institutional capacity is built in the public sector, platforms must handle the vast majority of disputes internally. This Article’s proposals simply provide public oversight of those inevitably private systems.

Others will have the opposite concern: that public involvement may prove inefficient or even detrimental to platform justice by blocking innovation.386 Indeed, U.S. financial regulation is among the most costly in the world,387 so drawing on the practices of credit card companies and credit reporting agencies may be unwise. Why not leave it to markets to self-adjust in response to user demand?

Such skepticism is appropriate not only for procedural mandates, but for almost all regulation. Nobel Prize–winning work has debunked the notion that markets will solve every problem by establishing pervasive market failures, most famously the “lemons” problem: used car dealers persisted in selling defective cars even though laissez-faire economics suggested that word-of-mouth would drive such sellers out of the market.388 Just as it is hard to know beforehand whether someone has bought a lemon, it is too onerous for consumers to assess the quality of a platform’s dispute system before joining. Those challenges help explain how platforms could persist with flawed dispute resolution even in contexts with numerous competitors.

Throughout history, policy makers initially believed that regulation was unnecessary in most industries. From Upton Sinclair’s exposure of the meatpackers, to banks’ risky behavior preceding the financial crisis of the 2000s, to the Deepwater Horizon oil spill in the Gulf of Mexico, to Boeing pressuring regulators to ease off its fatal 737 MAX design, legislators have ultimately concluded that originally lax regulation posed a societal threat and greater public oversight was necessary to complement private autonomy.389 Similarly, concerns that regulation would “kill the internet” drove early scholarly examinations, the legacy of which persists today.390 In light of other industries’ histories of failed self-regulation, platforms’ early missteps, and the evidence of competition shortcomings, expecting markets to solve all platform governance problems would be unrealistic.

It is methodologically difficult to establish that any particular legal intervention is justified, or even to identify the right time to intervene so as not to cut short innovation that would have mitigated the problem. In particular, it is impossible to know the counterfactual: without regulation, companies may over time yield to public pressure, as Facebook has done in creating its Oversight Board.

Yet public pressure fades, and the level faced by Facebook is unusually intense and thus unlikely to be applied to every large platform whose dispute resolution is in need of improvement. It also bears emphasis that existing oversight of the world’s largest companies, including mandated dispute resolution procedures for credit conflicts and online copyright violations, has not kept those firms from being highly profitable global leaders in their industries.391 Finally, procedural rules can be designed to still allow for dispute resolution innovations on the part of platforms above the floor required by law—as was the case for chargebacks with credit card companies.392 Indeed, a regulator authorized to write procedural rules could always begin with transparency and more general mandates and then move to more specific requirements if it becomes apparent that platforms are not advancing as would be expected in competitive markets.

In short, one option is to risk trusting platforms alone to police market entry, preserve reputation, and protect the “endangered species”393 of objective facts. However, given the signs of market failure, mandated procedural rules have the potential to improve efficiency.394 From a societal perspective, mandates may thus prove more promising because they provide platforms with public partners in their difficult adjudicatory tasks.

Another potential objection focuses on the distinction between dispute resolution and governance. Namely, a dispute arises whether Amazon bans a merchant because of complaints from other traders or because Amazon identifies an issue on its own. What is gained by viewing platform decisions about users from a judicial perspective, rather than treating the platform as an executive or legislative entity?

Conceptual precision is valuable. Conflicts are central to platforms because intermediation defines them.395 Internal policies and algorithmic adjudications develop through an iterative feedback loop, informed heavily by those conflicts.396 A rigorous institutional analysis of dispute resolution is therefore crucial to examining platform policies, and ultimately to understanding the nature of platforms and their place in society.

From a public policy standpoint, adopting a dispute resolution perspective can help the networked society flourish. To illustrate, in early 2020, Facebook and Twitter rejected House Speaker Nancy Pelosi’s request to delete a heavily edited video, posted by President Trump, implying that she ripped up his State of the Union speech as he honored one of the last surviving Black pilots of World War II who integrated the U.S. Army Air Forces.397 In considering Pelosi’s claim, Facebook applied a policy that it had rolled out a month before: content would be removed if it “would likely mislead someone into thinking that a subject of the video said words that they did not actually say.”398 Because Pelosi had in fact torn up the speech, the video did not meet that standard.399 Prior to its recent policy, the company had long resisted the idea of removing manipulated media, but reversed its “anything goes” position after receiving intense criticism from users and the public.400

The Pelosi-Trump dispute illustrates not only how new cases are litigated under platforms’ established policies, but also the extraordinary power involved. Responsiveness to public pressure offers a form of accountability. But an independent appeals process would help insulate the platform’s judicial function from undue influence. It would be easier for the president or the Speaker of the House to pressure a CEO, who holds considerable stock and has much to lose from increased government scrutiny, than a large independent pool of appeals judges with guaranteed salaries.

To be clear, this Article’s thesis does not require seeing platforms as closer to courts than to administrative agencies or governors. Indeed, depictions of platforms’ executive function underscore the need for a judicial check on that authority. If administrative agencies are the governmental analogy of choice for platforms, it bears emphasis that the Administrative Procedure Act sets forth rules for formal administrative agency hearings, including cross-examination, apprising parties of material facts, and the agency’s power of appeal.401 Agencies have detailed published rules for administrative law judge hearings, modeled after the Federal Rules.402 Those formal rules are mostly missing from existing law and technology conversations about process.403 Given the many similarities between those administrative rules and the Federal Rules, choosing either agencies or courts as the governmental analog would yield policy implications similar to those put forth in this Article.404

A final potential source of pushback is that the focus on process is at best inadequate and at worst detrimental to the most important substantive issues needing regulatory attention. In this view, focusing on process provides a safe way to intervene without actually prohibiting any specified bad conduct. At its worst, lawyers’ misguided “faith in procedure” could create an excessively complex and costly process navigable only by sophisticated parties.405 A procedural focus thereby risks buttressing existing institutions and legitimizing the harms they cause.

Admittedly, procedural reforms will not solve all of platforms’ problems. Most substantive decisions would remain in private hands. But improving procedural quality leads to better substantive outcomes.406 In particular, a well-designed independent appeals body could make decisions that are unprofitable for the platform but in society’s best interests.

Furthermore, for much of modern history, the prevailing view was that the substantive outcome drove people’s perception of justice.407 A set of experiments in the 1980s changed that narrative.408 Through survey instruments designed to assess people’s perceptions of a judicial process, psychologists demonstrated that the procedure for reaching an outcome influences people’s perception of its legitimacy as much as, if not more than, the substantive outcome.409 The reforms above incorporate that research on what matters to people.

Additionally, dispute resolution interventions offer a means of taking substantial steps toward justice as consumers perceive it. They are an essential part of any comprehensive solution for regulating platforms. Moreover, mandating procedures is more politically viable than imposing substantive interventions. Businesses have increasingly realized that effective and legitimate dispute resolution improves profits.410 And for those who oppose regulation on the grounds that it impinges on private autonomy, especially for matters involving speech, procedural interventions are more appealing because they largely preserve the private sector’s ability to make substantive decisions.411

To be clear, these and other objections raise valid concerns that can inform and improve platforms’ procedural design. The strategy is for the dispute architecture to leverage the strengths of both the public and private sectors. Ultimately, whether the system is public, private, or hybrid, and whatever substantive laws exist, a set of procedures will govern the resolution of platform disputes. Those procedures will influence outcomes and the quality of the justice administered. The design of those rules, whether mandated or voluntary, would ideally reflect norms not only of private-sector efficiency and innovation, but also of the public value of procedural justice.

Conclusion

This Article has begun to sketch the contours of a necessarily much larger project. With or without intervention, tech platforms play a court-like role in society, resolving disagreements between merchant and consumer, driver and passenger, or two interlocutors in the modern public square. Financial platforms—most notably credit card companies and credit reporting agencies—by their nature serve as intermediaries in private adjudicatory processes. Unlike those of tech platforms, however, financial platforms’ dispute resolution proceedings are subject to minimum legal standards, such as conducting reasonable investigations and notifying parties. A central challenge in platform governance moving forward is determining how to shape the ongoing mass, secretive trials that can define people’s identities and banish them from communities.

In their vision for a new country’s judicial system, the framers began not with due process but with Article III: “The judicial Power of the United States, shall be vested in one supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish.”412 As societal power migrates to platforms in the information age, something like a constitutional convention—but with a diverse array of stakeholders—would help to design a system of checks and balances. A fundamental part of that enterprise would be deciding whether Congress should ordain and establish a platform court system and federal rules of platform procedure for billions of disputes currently relegated to sometimes brutish colonial-style justice.

  • 1Simon van Zuylen-Wood, “Men Are Scum”: Inside Facebook’s War on Hate Speech, Vanity Fair (Feb. 26, 2019), https://perma.cc/AR9Q-A89B.
  • 2See, e.g., Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598, 1611, 1625–49 (2018) (describing how moderators, acting in “a private self-regulatory system to govern online speech,” enforce the rules they create for their users); Kyle Langvardt, Regulating Online Content Moderation, 106 Geo. L.J. 1353, 1366–70 (2018) (analyzing the constitutional concerns with content moderation by online platforms and advocating for congressional action to limit the reach of this content moderation); Andrew Tutt, The New Speech, 41 Hastings Const. L.Q. 235, 278 (2014) (“Digital speech intermediaries possess and exercise a new kind of control over the speech of individuals, associations, groups, and communities.”).
  • 3See, e.g., infra Part I.B.
  • 4See infra Part I.A.
  • 5See infra Part I.D (discussing Google’s role in reputation markets).
  • 6Caitlin Hall, Swimming Downstream: Battling Defamatory Online Content via Acquiescence, 19 Yale J.L. & Feminism 287, 287–88 (2007) (describing how, after the author’s acceptance to Yale Law School became known in online admission boards, she was subjected to harassment by strangers that a job interviewer later referenced).
  • 7See van Zuylen-Wood, supra note 1.
  • 8See Josh Dzieza, Prime and Punishment, The Verge (Dec. 19, 2018), https://perma.cc/8BNT-HU3R (describing a fake review set up by a competitor that succeeded in suspension).
  • 9Id.
  • 10See, e.g., Julie E. Cohen, Law for the Platform Economy, 51 U.C. Davis L. Rev. 133, 199 (2017) (analyzing how platforms’ “role in the international legal order increasingly resembles that of sovereign states”). A related analogy paints platforms as administrative agencies—which, like states, have executive, legislative, and adjudicatory functions. See, e.g., Hannah Bloch-Wehba, Global Platform Governance: Private Power in the Shadow of the State, 72 SMU L. Rev. 27, 29 (2019) (arguing that “platforms are acting as regulators” and “are performing quintessentially administrative functions”). See also generally Rory Van Loo, Rise of the Digital Regulator, 66 Duke L.J. 1267 (2017) [hereinafter Van Loo, Rise of the Digital Regulator] (discussing platforms’ regulatory and quasi-legislative functions); infra Part III.E (discussing the analogy to administrative agencies). These and other scholars discuss dispute resolution by platforms along the way to larger projects. See generally Amy J. Schmitz, There’s an “App” for That: Developing Online Dispute Resolution to Empower Economic Development, 32 Notre Dame J.L. Ethics & Pub. Pol’y 1 (2018) (discussing dispute resolution as a tool for economic development); Rory Van Loo, The Corporation as Courthouse, 33 Yale J. on Reg. 547, 567 (2016) [hereinafter Van Loo, The Corporation as Courthouse] (examining innovative dispute resolution mechanisms and considering how regulators should respond). See also Julie E. Cohen, Between Truth and Power: The Legal Constructions of Informational Capitalism 143 (2019) (exploring “the design of dispute resolution systems and institutions for the era of informational capitalism”); Aluma Zernik, The Invisible Hand, the Regulatory Touch, or the Platform’s Iron Grip? 21–22 (unpublished manuscript) (on file with author) (discussing the dispute-resolution capabilities of platforms as part of the potential for replacing government regulators with private platforms).
  • 11Cohen, supra note 10, at 136.
  • 12Given the limits of interviews and desire of most to remain confidential, wherever possible a publicly available source was used instead.
  • 13For a review and categorization of early works in this vein, see Lawrence B. Solum, Models of Internet Governance, in Internet Governance: Infrastructure and Institutions 48, 56–57 (Lee A. Bygrave & Jon Bing eds., 2009). For more recent examples, see supra note 10.
  • 14See, e.g., K. Sabeel Rahman, The New Utilities: Private Power, Social Infrastructure, and the Revival of the Public Utility Concept, 39 Cardozo L. Rev. 1621, 1632, 1641 (2018) (observing that platforms like the large corporations of the Progressive Era can “exercise[ ] quasi-sovereign authority and influence over not only workers but the economy and society as a whole” (citing Dalia Tsuk, From Pluralism to Individualism: Berle and Means and 20th-Century American Legal Thought, 30 Law & Soc. Inquiry 179 (2005))); Ganesh Sitaraman, Regulating Tech Platforms: A Blueprint for Reform, Great Democracy Initiative 5 (Apr. 2018), https://perma.cc/EDJ7-EHNQ (comparing tech platforms providing “essential services” to public utilities); Jonathan Zittrain, Engineering an Election, 127 Harv. L. Rev. F. 335, 336 (2014) (warning of platforms’ ability to shape elections); see also supra note 10. Professor Evelyn Douek’s valuable work on this topic is the most relevant, albeit focused on the context of speech. See generally Evelyn Douek, Verified Accountability: Self-Regulation of Content Moderation as an Answer to the Special Problems of Speech Regulation (Aegis Series Paper No. 1903, 2019).
  • 15Professor Danielle Keats Citron’s groundbreaking call for technological due process showed how constitutional principles could broadly be applied to technology. See generally Danielle Keats Citron, Technological Due Process, 85 Wash. U. L. Rev. 1249 (2008). That concept will be explored in greater depth below. Although Citron’s original work can be distinguished because it was not focused on tech platforms or dispute resolution and relies on administrative agencies as the government analogue, it nonetheless provides valuable foundations on which this Article builds. See id. at 1301–13 (arguing that administrative agencies’ use of technology should be subjected to due process); Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Wash. L. Rev. 1, 23 (2014) (concluding that due process is needed for automated scores produced by those creating credit scores, without discussing dispute resolution functions or online platforms). One of this Article’s contributions is extending the due process analysis to platforms’ dispute resolution systems. See infra Part III.A.
  • 16See generally Tom Tyler, Why People Obey the Law (2006) (summarizing the procedural justice literature suggesting that process heavily influences perception of legitimacy). See also Orna Rabinovich-Einy & Ethan Katsh, Technology and the Future of Dispute Systems Design, 17 Harv. Negot. L. Rev. 151, 198 (2012) (arguing for “novel approaches” to integrating technology into dispute resolution).
  • 17See, e.g., Schmitz, supra note 10, at 43; Heather Scheiwe Kulp & Amanda L. Kool, You Help Me, He Helps You: Dispute Systems Design in the Sharing Economy, 48 Wash. U. J.L. & Pol’y 179, 216 (2015). For an early take on online dispute resolution, see David A. Hoffman & Salil K. Mehra, Wikitruth Through Wikiorder, 59 Emory L.J. 151, 170–74 (2009). The literature on private administration is vast and has parallels to this project. See, e.g., Nathaniel Donahue & John Fabian Witt, Tort as Private Administration, 105 Cornell L. Rev. 1093, 1170 (2020) (“Private administration is the architecture within which the law alternately vindicates and obstructs the basic goals of deterrence and corrective justice.”).
  • 18But see Van Loo, The Corporation as Courthouse, supra note 10, at 567, 595–97 (describing the internal dispute processes of American Express, Amazon, and other platforms and proposing regulatory oversight).
  • 19See, e.g., Amalia D. Kessler, Arbitration and Americanization: The Paternalism of Progressive Procedural Reform, 124 Yale L.J. 2940, 2942 (2015).
  • 20See, e.g., David Horton, Arbitration About Arbitration, 70 Stan. L. Rev. 363, 370 (2018) (arguing that the combination of arbitration clauses and delegation clauses, which allow arbitrators to delegate whether arbitration should proceed, has allowed “corporations [to] draft[ ] around [the] prophylactic layer of judicial review”); Judith Resnik, Diffusing Disputes: The Public in the Private of Arbitration, the Private in Court, and the Erasure of Rights, 124 Yale L.J. 2804, 2936 (2015) (arguing that the new reliance on a private, arbitral judicial system works as “an unconstitutional deprivation of litigants’ property and court access rights”).
  • 21Mandatory arbitration refers to the practice of businesses inserting clauses into their form contracts that require consumers to use arbitration for any disputes. See AT&T Mobility LLC v. Concepcion, 563 U.S. 333, 339 (2011) (“[C]ourts must place arbitration agreements on an equal footing with other contracts and enforce them according to their terms.” (citation omitted)). There have been numerous symposia and collections on the topic. See generally Stephan Landsman, ADR and the Cost of Compulsion, 57 Stan. L. Rev. 1593 (2005) (publishing as part of a Stanford Law Review symposium emphasizing class actions and arbitration); Kessler, supra note 19 (writing as part of a Yale Law Journal collection on mandatory arbitration). See also Roger H. Trangsrud, Class Actions and Access to Justice, 82 Geo. Wash. L. Rev. 595, 596 (2014) (referring to a symposium at the George Washington University Law School focused on class actions).
  • 22See Judith Resnik, The Privatization of Process: Requiem for and Celebration of the Federal Rules of Civil Procedure at 75, 162 U. Pa. L. Rev. 1793, 1821 (2014) (“[T]he promise of confidentiality is a linchpin of ADR’s appeal.”).
  • 23David Horton, Arbitration as Delegation, 86 N.Y.U. L. Rev. 437, 440 (2011).
  • 24Resnik, supra note 20, at 2936.
  • 25Cox v. Louisiana, 379 U.S. 536, 554 (1965).
  • 26See infra Part II.A–B (outlining the procedures mandated for credit card companies and credit bureaus).
  • 2715 U.S.C. §§ 1681, 1681g; see infra Part II.B.
  • 28See Van Loo, The Corporation as Courthouse, supra note 10, at 555–58, 561 (arguing that with the right competitive pressures, some large companies’ private dispute processes offer people voice, speed, and often better outcomes than the law would provide).
  • 29See infra Part III.A (discussing normative foundations for intervention).
  • 30Indeed, there is a risk that regulations focused solely on procedure will help legitimate injustice. This concern is important and addressed infra Part III.
  • 31See generally Rory Van Loo, In Defense of Breakups: Administering A “Radical” Remedy, 105 Cornell L. Rev. 1955 (2020).
  • 32See Rory Van Loo, The Missing Regulatory State: Monitoring Businesses in an Age of Surveillance, 72 Vand. L. Rev. 1563, 1620 (2019).
  • 33See generally Rory Van Loo, Helping Buyers Beware: The Need for Supervision of Big Retail, 163 U. Pa. L. Rev. 1311 (2015) (proposing that Amazon and other retailers share machine-readable data). See also Rory Van Loo, Digital Market Perfection, 117 Mich. L. Rev. 815, 871–73 (2019) (arguing that general and customer-account data access would provide digital intermediary market benefits).
  • 34See Rory Van Loo, Technology Regulation by Default: Platforms, Privacy, and the CFPB, 2 Geo. L. Tech. Rev. 531, 545 (2018).
  • 35See Rory Van Loo, The Revival of Respondeat Superior and Evolution of Gatekeeper Liability, 109 Geo. L.J. 141, 189 (2020).
  • 36Gillian K. Hadfield, Innovating to Improve Access: Changing the Way Courts Regulate Legal Markets, 143 Daedalus 83, 84 (2014).
  • 37See, e.g., Prager Univ. v. Google LLC, No. 19-cv-340667, 2019 Cal. Super. LEXIS 2034, at *15–16 (Cal. Super. Ct. Nov. 19, 2019) (concluding that Google had no obligation to provide equal access because YouTube’s “Restricted Mode” and advertising service “are nothing like a traditional public forum”); see also Anupam Chander, Facebookistan, 90 N.C. L. Rev. 1807, 1820–22, 1844 (2012) (“United States law permits a large measure of freedom for Facebook to set the terms of [its platform].”).
  • 38Joshua Fruchter, Amazon Takes Aim at Patent Infringement in Its Marketplace, Nat’l L. Rev. (July 12, 2019), https://perma.cc/9NPF-U5H5 (putting the figure at 58%).
  • 39David Streitfeld, In Amazon’s Bookstore, No Second Chances for the Third Reich, N.Y. Times (Feb. 9, 2020), https://perma.cc/7LZX-7HGX.
  • 40See Tara Johnson, How to Deal with Amazon A-to-Z Claims, Tinuiti (Aug. 31, 2018), https://perma.cc/NFT8-WDTR.
  • 41See Streitfeld, supra note 39.
  • 42Compare Ethan Katsh & Orna Rabinovich-Einy, Digital Justice: Technology and the Internet of Disputes 79 (2017) (describing the millions of disputes handled by e-commerce platforms), with Federal Judicial Caseload Statistics 2019, U.S. Cts., https://perma.cc/QV99-ZLK3 (noting that 376,762 cases were filed in federal district courts in 2019).
  • 43See Jane K. Winn, The Secession of the Successful: The Rise of Amazon as Private Global Consumer Protection Regulator, 58 Ariz. L. Rev. 193, 200–02 (2016).
  • 44See Scott Shane, Prime Mover: How Amazon Wove Itself into the Life of an American City, N.Y. Times (Nov. 30, 2019), https://perma.cc/G2SC-9XWM.
  • 45Fruchter, supra note 38 (putting the figure at 73%).
  • 46Dzieza, supra note 8.
  • 47Order Defect Rate, Amazon Seller Cent., https://perma.cc/4MRP-N7X8.
  • 48About Feedback Manager, Amazon Seller Cent., https://perma.cc/V2HL-SZ4K.
  • 49See ReedsDoItBestHdw, Feedback Removal Win!, Amazon Servs. Seller Fs., https://perma.cc/K8X4-PECE.
  • 50Can Amazon Remove Buyer Feedback?, Amazon Seller Cent., https://perma.cc/Q3YD-XPNH (noting that Amazon will remove feedback upon merchant request if it is profane, solely about the product rather than the purchase experience, or uses personally identifiable information).
  • 51See About Feedback Manager, supra note 48.
  • 52Gregory Magana, Amazon Is Beset by False Product Reviews, Bus. Insider (Apr. 17, 2019), https://perma.cc/HN2U-HX5Q.
  • 53About A–Z Guarantee, Amazon Help & Customer Serv., https://perma.cc/F92W-ARGR.
  • 54Pooja Vishant, An Amazon Seller’s Guide to A-to-Z Guarantee Claims, Medium (Sept. 15, 2016), https://perma.cc/K9UR-9HU6.
  • 55Community Guidelines, Amazon, https://perma.cc/XQA2-428A.
  • 56See Dzieza, supra note 8.
  • 57See id.
  • 58See Complaint at 2, Wanna Play Prods. Inc. v. Emery, No. 20-cv-00010-AT (N.D. Ga. filed Jan. 2, 2020).
  • 59Id. at 26 & n.30 (citing U.S. Patent No. 5,445,522).
  • 60Id. at 38.
  • 61See generally Kaity Y. Emerson, From Amazon’s Domination of E-Commerce to Its Foray into Patent Litigation: Will Amazon Succeed as “The District of Amazon Federal Court”?, 21 N.C. J.L. & Tech. 71 (2019).
  • 62Create a Plan of Action to Reinstate Selling Privileges, Amazon Seller Cent., https://perma.cc/WB5T-WLTR.
  • 63E.g., Amazon Seller Suspension Att’ys, https://perma.cc/ZZ3B-HVR9.
  • 64See Dzieza, supra note 8.
  • 65Telephone Interview with Chris McCabe, Former Amazon Emp. (Feb. 20, 2020).
  • 66Amazon, like eBay and Alibaba, is one of many online marketplaces operating such systems. See, e.g., Rob Enderle, EBay vs. Amazon: An Interesting Lesson in Customer Care, IT Bus. Edge (Jan. 19, 2012), https://web.archive.org/web/20120414103615/https://www.itbusinessedge.com/cm/blogs/enderle/ebay-vs-amazon-an-interesting-lesson-in-customer-care/?cs=49557&page=2 (recounting an experience of eBay freezing a buyer’s account).
  • 67Barbara Ortutay, Does the Naked Body Belong on Facebook? It’s Complicated, Associated Press (Jan. 15, 2020), https://perma.cc/R9WL-FVU5.
  • 68Id.
  • 69Id.
  • 70See Zittrain, supra note 14, at 336; see also Brandy Zadrozny, Drowned Out by the Algorithm: Vaccination Advocates Struggle to Be Heard Online, NBC News (Feb. 26, 2019), https://perma.cc/KB6F-WAS5.
  • 71Kashmir Hill, Many Are Abandoning Facebook. These People Have the Opposite Problem., N.Y. Times (Aug. 22, 2019), https://perma.cc/G22G-7PBQ.
  • 72See id.
  • 73Telephone Interview with Julie Stone (Jan. 21, 2021) (describing her experience with Instagram accounts @obviousbutamazing and @obviousbutamazing2).
  • 74See, e.g., van Zuylen-Wood, supra note 1.
  • 75Mike Schroepfer, Community Standards Report, Facebook AI (Nov. 13, 2019), https://perma.cc/98YQ-9FV9.
  • 76See, e.g., Hill, supra note 71.
  • 77Id.
  • 78See, e.g., Philip M. Napoli, What If More Speech Is No Longer the Solution? First Amendment Theory Meets Fake News and the Filter Bubble, 70 Fed. Commc’ns L.J. 55, 75–76 (2018) (describing the congressional response to Cambridge Analytica’s use of social media data to construct voter profiles).
  • 79Monika Bickert, Publishing Our Internal Enforcement Guideline and Expanding Our Appeals Process, Facebook Newsroom (Apr. 24, 2018), https://perma.cc/JX28-V5E4.
  • 80See id.
  • 81See Bickert, supra note 79.
  • 82See, e.g., Dawn Carla Nunziato, The Marketplace of Ideas Online, 94 Notre Dame L. Rev. 1519, 1538–49 (2019). Facebook has also begun providing links to alternative perspectives. See Bickert, supra note 79.
  • 83Sarah C. Haan, Facebook’s Alternative Facts, 105 Va. L. Rev. Online 18, 25 (2019).
  • 84See Nunziato, supra note 82, at 1539.
  • 85See Fed. R. Evid. 706.
  • 86Bickert, supra note 79.
  • 87See Hill, supra note 71.
  • 88See id.
  • 89Professor Douek has led scholarly analysis of this topic. See, e.g., Evelyn Douek, Facebook’s “Oversight Board”: Move Fast with Stable Infrastructure and Humility, 21 N.C. J.L. & Tech. 1, 16–17 (2019).
  • 90Facebook, Oversight Board Charter 4–5 (2019).
  • 91See id. at 5–6.
  • 92Allison Orr Larsen, Constitutional Law in an Age of Alternative Facts, 93 N.Y.U. L. Rev. 175, 234 (2018).
  • 93Why Was My Listing Paused or Suspended?, Airbnb, https://www.airbnb.com/help/article/1303/why-was-my-listing-paused-or-suspended.
  • 94See, e.g., Racist Airbnb Host to Black Guests: ‘Which Monkey Is Going to Stay on the Couch?’, NewsOne (June 1, 2019), https://perma.cc/X4QD-DMMS (describing how one host’s account was blocked after she made racist remarks to her Black guests).
  • 95See, e.g., Kelly Kampen, AirBNB Why Did You Terminate My Account? – An Open Letter to AirBNB, Medium (Aug. 20, 2015), https://perma.cc/4FLU-CTMN.
  • 96See, e.g., Account Deleted After Guest Used Dodgy Credit Card, airbnbHELL (May 29, 2019), https://perma.cc/7XBG-MVEF; Rikster, Comment to Superhost Account Removed for “Security Reasons”, airbnbHELL (June 2, 2019), https://perma.cc/Z9HB-YMPB.
  • 97See Rikster, supra note 96.
  • 98James Dobbins, Making a Living with Airbnb, N.Y. Times (Apr. 7, 2017), https://perma.cc/A4Y9-WHS2 (describing New York residents who depend on supplemental income from Airbnb to cover living expenses, home upkeep, and retirement savings).
  • 99Cf. Fuentes v. Shevin, 407 U.S. 67, 96–97 (1972) (finding a due process violation for a sheriff’s seizure of consumer goods without a hearing).
  • 100See, e.g., Nick88, Host Cancelled Less Than 24 Hours Before Check In, Airbnb cmty. (Oct. 14, 2016), https://perma.cc/L9SM-4R5F.
  • 101See Violet Blue, Your Online Activity Is Now Effectively a Social ‘Credit Score’, engadget (Jan. 17, 2020), https://perma.cc/FCC5-AZJE.
  • 102Id.
  • 103U.S. Patent No. 9,070,088 (filed Sept. 16, 2014) (issued June 30, 2015).
  • 104Id.
  • 105What Does It Mean When Someone’s ID Has Been Checked?, Airbnb, https://www.airbnb.com/help/article/2356/what-does-it-mean-when-someones-id-has-been-checked.
  • 106See Frank Pasquale, Rankings, Reductionism, and Responsibility, 54 Clev. St. L. Rev. 115, 127 (2006).
  • 107Jon Porter, Angry Redditors Are Trying to Google Bomb Game of Thrones Writers, The Verge (May 15, 2019), https://www.theverge.com/2019/5/15/18624480/game-of-thrones-google-bomb-db-weiss-david-benioff.
  • 108Noam Cohen, Google Halts ‘Miserable Failure’ Link to President Bush, N.Y. Times (Jan. 29, 2007), https://perma.cc/X3QF-9JC8.
  • 109Ryan Moulton & Kendra Carattini, A Quick Word About Googlebombs, Google Webmaster Cent. Blog (Jan. 25, 2007), https://perma.cc/DRQ9-JKRE.
  • 110See infra Part III.C.1 (offering procedural rules to address harm to nonusers).
  • 111See, e.g., Pasquale, supra note 106, at 127.
  • 112See Eldar Haber, Privatization of the Judiciary, 40 Seattle U. L. Rev. 115, 120–29 (2016) (showing how in Europe the right to be forgotten has put search engines in a quasi-judicial role).
  • 113Remove Your Personal Information from Google, Google Search Help, https://perma.cc/772X-EW9K.
  • 114Remove Content About Me on Sites with Exploitative Removal Practices from Google, Google Search Help, https://perma.cc/UZJ5-4ZBB.
  • 115Request to Remove your Personal Information on Google, Google Search Help, https://support.google.com/websearch/troubleshooter/9685456#ts=2889054%2C2889099.
  • 116See Remove Content About Me on Sites with Exploitative Removal Practices from Google, supra note 114.
  • 117Ratings go through an approval process, applying considerations such as use of inappropriate language. Flag and Fix Inappropriate Content, Maps User Contributed Content Pol’y Help, https://perma.cc/U5UV-J4Z9.
  • 118See supra Part I.A.
  • 119See “I’ve Lost Everything”: Small Businesses Claim Fake Online Reviews Killing Them, A Current Aff., https://perma.cc/LZJ8-RW9N.
  • 120Id.
  • 121See id.
  • 122Flag and Fix Inappropriate Content, supra note 117.
  • 12347 U.S.C. § 230 (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”).
  • 124See, e.g., Bennet v. Google, LLC, 882 F.3d 1163, 1165–67 (D.C. Cir. 2018).
  • 125Cf. Larsen, supra note 92, at 175 (discussing the “‘post-truth’ society”).
  • 126Danielle Keats Citron & Helen Norton, Intermediaries and Hate Speech: Fostering Digital Citizenship, 91 B.U. L. Rev. 1435, 1471–72 (2011) (alteration in original).
  • 127Deirdre K. Mulligan & Daniel S. Griffin, Rescripting Search to Respect the Right to Truth, 2 Geo. L. Tech. Rev. 557, 568–69 (2018) (summarizing Google’s approach to anti-Semitic content).
  • 128Government Requests to Remove Content, Google Transparency Rep., https://perma.cc/86EC-LW5Y.
  • 129Google’s reasons for not following a court order include that the order lacked specificity as to the information that should be removed. See id.
  • 130Telephone Interview with Former Google Employee (Feb. 25, 2020).
  • 131See id. (describing the committee for YouTube issues as including YouTube’s General Counsel, Director of Product Management, Head of User Operations, and a member of the Policy Abuse Legal Team).
  • 132See id. (explaining that lower-level employees convinced the committee that YouTube videos involving weed were an important part of the internet).
  • 133Id.
  • 134Cf. Pasquale, supra note 106, at 125–29 (analyzing the structure of Google).
  • 135See Allyson Haynes Stuart, Google Search Results: Buried If Not Forgotten, 15 N.C. J.L. & Tech. 463, 500 (2014).
  • 136See, e.g., Daniel Schwarcz, Redesigning Consumer Dispute Resolution: A Case Study of the British and American Approaches to Insurance Claims Conflict, 83 Tul. L. Rev. 735, 761 (2009) (“[M]ost states mandate relatively well-developed internal grievance processes for health insurers.”); 14 C.F.R. § 250.9 (2020) (requiring airlines to provide a written explanation when denying a passenger boarding).
  • 137See Alexi Horowitz-Ghazi, The Cost of Getting Your Money Back, NPR: Planet Money (June 26, 2019), https://perma.cc/F2YM-G9ZR.
  • 138Id.
  • 139Telis Demos, You Accidentally Sent $149 to a Stranger on Venmo? Good Luck Getting It Back, Wall St. J. (July 12, 2018), https://www.wsj.com/articles/you-accidentally-venmoed-149-to-a-stranger-good-luck-getting-it-back-1531411133. The company may also send a message to the transferee on behalf of the transferor. See Horowitz-Ghazi, supra note 137.
  • 140Cf. Demos, supra note 139 (discussing the reliance on goodness of others).
  • 141Jan Logemann, Different Paths to Mass Consumption: Consumer Credit in the United States and Germany During the 1950s and ’60s, 41 J. Soc. Hist. 525, 539 (2008).
  • 142Pub. L. No. 93-495, 88 Stat. 1511 (codified at 15 U.S.C. §§ 1666–1666i).
  • 14315 U.S.C. § 1666(b).
  • 14415 U.S.C. § 1666(a)(3)(B).
  • 14515 U.S.C. § 1666(a)(3)(B) (requiring an investigation and notice within the lesser of ninety days or two billing cycles); 12 C.F.R. § 226.13(f) (2020) (requiring a reasonable investigation).
  • 14612 C.F.R. § 226.13(f)(2) (2020).
  • 14715 U.S.C. § 1666(b).
  • 148208 F. Supp. 2d 765 (E.D. Mich. 2002).
  • 149Id. at 767–68.
  • 150Id. at 774–75.
  • 151Id.
  • 152Id. at 775.
  • 153See, e.g., Pierce v. JP Morgan Chase Bank, N.A., No. 11–00102–KD–M, 2012 WL 3610776, at *6 (S.D. Ala. Aug. 21, 2012).
  • 154See, e.g., Schmitz, supra note 17, at 16–19.
  • 155Louis Del Duca, Colin Rule & Zbynek Loebl, Facilitating Expansion of Cross-Border E-Commerce—Developing a Global Online Dispute Resolution System (Lessons Derived from Existing ODR Systems—Work of the United Nations Commission on International Trade Law), 1 Pa. St. J.L. & Int’l Affs. 59, 70–72 (2012).
  • 156Horowitz-Ghazi, supra note 137.
  • 157See id.
  • 158Henry H. Perritt, Jr., Dispute Resolution in Cyberspace: Demand for New Forms of ADR, 15 Ohio St. J. on Disp. Resol. 675, 691 (2000) (observing that chargebacks add customer protection by equalizing bargaining power between customer and merchant).
  • 159Id. at 687 (noting that consumers “stop with [company complaint mechanisms or chargebacks] in the vast majority of cases rather than going on to file lawsuits or complaints with administrative agencies”). However, at least one study of AAA consumer arbitrations found that 8.4% were related to credit card chargebacks. Christopher R. Drahozal & Samantha Zyontz, An Empirical Study of AAA Consumer Arbitrations, 25 Ohio St. J. on Disp. Resol. 843, 924 (2010).
  • 16015 U.S.C. § 1666(a)(3)(B).
  • 161Gregory Karp, Consumers Have Powerful Tool in Credit Card Chargebacks, NerdWallet (Feb. 16, 2017), https://perma.cc/SV4N-DK3R (putting the figure at 90% in the United States); Sravan Kumar, The Murky World of Chargebacks, Chargebee Blog (Aug. 4, 2019), https://perma.cc/T73T-JMEL (putting the figure at 79% globally).
  • 162See Perritt, supra note 158, at 691–92.
  • 16312 C.F.R. § 226.13(d)(2) (2020).
  • 16412 C.F.R. § 226.13(d)(1) (2020).
  • 165See Perritt, supra note 158, at 692.
  • 166John Rothchild, Protecting the Digital Consumer: The Limits of Cyberspace Utopianism, 74 Ind. L.J. 893, 976–77 (1999).
  • 167See, e.g., Visa, Visa Claims Resolution: Efficient Dispute Processing for Merchants 6 (2017).
  • 168See, e.g., Stephen Gardner, Credit Reports: Basic Rights and Responsibilities of Creditors and Consumers, 59 Consumer Fin. L.Q. Rep. 248, 253–55 (2005).
  • 169See id. at 253.
  • 170See supra Part I.D.
  • 171John R. Fonseca, Handling Consumer Credit Cases § 11:1, at 446 (3d ed. 1986).
  • 172See id. at 448–50; Gary Rivlin, The Long Shadow of Bad Credit in a Job Search, N.Y. Times (May 11, 2013), https://perma.cc/EEK2-H5RX.
  • 173Fonseca, supra note 171, at 445 (quotation marks and citation omitted).
  • 174McKeown v. Sears Roebuck & Co., 335 F. Supp. 2d 917, 943 (W.D. Wis. 2004).
  • 175Id. at 924–25.
  • 176Fair Credit Reporting Act: Hearing Before the Subcomm. on Consumer Affairs and Coinage of the Comm. on Banking, Finance and Urban Affairs, 102d Cong. 20 (1991) (statement of David Medine, Assoc. Dir. for Credit Practices, FTC).
  • 177Pub. L. No. 91-508, 84 Stat. 1128 (codified at 15 U.S.C. §§ 1681–1681x).
  • 17815 U.S.C. § 1681i(a)(2)(B) (mandating access to “all relevant information regarding the dispute”).
  • 17915 U.S.C. § 1681g(f)(1)(C).
  • 18015 U.S.C. § 1681e(b). The consumer reporting agency is expected to notify the consumer of the results of the investigation within thirty days. 15 U.S.C. § 1681s-2(a)(8)(E)(iii); 15 U.S.C. § 1681i(a)(1).
  • 181See, e.g., Pinner v. Schmidt, 805 F.2d 1258, 1262 (5th Cir. 1986); Bryant v. TRW, Inc., 689 F.2d 72, 75, 79 (6th Cir. 1982).
  • 182520 F.3d 1066 (9th Cir. 2008).
  • 183Id. at 1068.
  • 184Id.
  • 185Id.
  • 186Id. at 1070–71.
  • 187See, e.g., Miller v. Wells Fargo & Co., No. 05-CV-42-S, 2008 WL 793683, at *6 (W.D. Ky. Mar. 24, 2008) (holding that a long delay in removing information “may indicate a failure to employ reasonable procedures to assure maximum possible accuracy”).
  • 188Mary Spector, Where the FCRA Meets the FDCPA: The Impact of Unfair Collection Practices on the Credit Report, 20 Geo. J. on Poverty L. & Pol’y 479, 485 (2013).
  • 189See, e.g., id. at 486–88.
  • 190Switching costs can be high. See Jack M. Balkin, Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation, 51 U.C. Davis L. Rev. 1149, 1199 (2018) (arguing that “exit from a platform may be costly because of network effects”).
  • 191See infra Part III (drawing on credit rating legislation to design platform procedure).
  • 192Pinner v. Schmidt, 805 F.2d 1258, 1261 (5th Cir. 1986).
  • 193See infra Part III.A.
  • 194Jennifer M. Urban & Laura Quilter, Efficient Process or “Chilling Effects”? Takedown Notices Under Section 512 of the Digital Millennium Copyright Act, 22 Santa Clara Comput. & High Tech. L.J. 621, 622 (2006).
  • 195Pub. L. No. 105-304, 112 Stat. 2860 (1998) (codified as amended in scattered sections of 17 U.S.C.).
  • 196Urban & Quilter, supra note 194, at 624–31. On internet payment blockades and non-regulatory private ordering approaches to trademark and corporate copyright, see generally Annemarie Bridy, Internet Payment Blockades, 67 Fla. L. Rev. 1523 (2015).
  • 19717 U.S.C. § 512(c)(1).
  • 198See Chaim Gartenberg, Disney Is Hunting Down the Most Popular Baby Yoda Toys on Etsy, The Verge (Jan. 17, 2020), https://www.theverge.com/2020/1/17/21069124/baby-yoda-dolls-etsy-disney-mandalorian-copyright-takedown-enforcement; Kat Tenbarge, ‘Baby Yoda’ GIFs Were Pulled Down for Copyright Reasons, but GIPHY Has Returned Them to the Internet, Insider (Nov. 25, 2019), https://perma.cc/5B8A-WTAH.
  • 19917 U.S.C. § 512(g)(2)(B)–(C).
  • 20017 U.S.C. § 512(g)(2)(B)–(C).
  • 20117 U.S.C. § 512(i).
  • 202Urban & Quilter, supra note 194, at 631.
  • 203See Peter S. Menell & Michael J. Meurer, Notice Failure and Notice Externalities, 5 J. Legal Analysis 1, 25 (2013) (concluding that the DMCA “provides copyright owners with relatively efficient means for blocking dissemination of infringing copies”).
  • 204Maayan Perel & Niva Elkin-Koren, Accountability in Algorithmic Copyright Enforcement, 19 Stan. Tech. L. Rev. 473, 478–79, 506 (2016).
  • 205Id. at 477 n.7.
  • 206See Dineen Wasylik, Take Down Abuse: From Harry Potter to LEGOs, DPW Legal (Feb. 7, 2014), https://perma.cc/A5M2-VJGH.
  • 207See Jennifer M. Urban, Joe Karaganis & Brianna L. Schofield, Notice and Takedown in Everyday Practice 9–13 (U.C. Berkeley Pub. L. Rsch. Paper No. 2755628, 2017) (describing studies concerning copyright notice and takedown processes).
  • 208See Wasylik, supra note 206.
  • 209See Sharon Bar-Ziv & Niva Elkin-Koren, Behind the Scenes of Online Copyright Enforcement: Empirical Evidence on Notice & Takedown, 50 Conn. L. Rev. 339, 344, 376 (2018) (reviewing the literature and providing new data on copyright takedown misuse).
  • 210Urban et al., supra note 207, at 44.
  • 211Id. at 45 (quotation marks omitted).
  • 212Id. at 46.
  • 213See supra note 161 and accompanying text.
  • 214See 17 U.S.C. § 512(g)(2).
  • 21517 U.S.C. § 512(g)(2)(C) (requiring also copyright holder notification of lawsuit).
  • 216See supra Part II.
  • 217See supra Part I.
  • 218See Van Loo, The Corporation as Courthouse, supra note 10, at 555–58, 561, 582.
  • 219For examples of relevant discussions, see Lina M. Khan, The Separation of Platforms and Commerce, 119 Colum. L. Rev. 973, 1017 (2019) (detailing many platforms’ tendencies toward natural monopolies); Herbert Hovenkamp, Regulation and the Marginalist Revolution, 71 Fla. L. Rev. 455, 465 (2019) (describing that concerns about natural monopolies often lead to market regulation). It is important to recognize, however, that these dynamics are subject to debate—as are core concepts in how to measure competition in antitrust. See generally Louis Kaplow, On the Relevance of Market Power, 130 Harv. L. Rev. 1303 (2017).
  • 220See supra Part I.
  • 221On the more general relationship between competition and dispute resolution, see generally Van Loo, The Corporation as Courthouse, supra note 10.
  • 222See Cable Television, Fed. Comm. Comm’n, https://perma.cc/WR68-X85N.
  • 223For examples of applying due process to related issues other than the type of platform dispute resolution that is the focus of this Article, in addition to the early work of Citron mentioned supra note 15, see Kate Crawford & Jason Schultz, Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms, 55 B.C. L. Rev. 93, 124–28 (2014) (applying due process to predictive privacy analytics); Kristen E. Eichensehr, Digital Switzerlands, 167 U. Pa. L. Rev. 665, 722–23 (2019) (asking whether technology companies should be held to similar standards as governments, such as requiring due process); Elizabeth G. Thornburg, Going Private: Technology, Due Process, and Internet Dispute Resolution, 34 U.C. Davis L. Rev. 151, 196 (2000) (applying the concept of due process to the DMCA).
  • 224See, e.g., Prager Univ. v. Google LLC, No. 17-CV-06064-LHK, 2018 WL 1471939, at *8 (N.D. Cal. Mar. 26, 2018) (rejecting the argument that private social media corporations are state actors); Freedom Watch, Inc. v. Google, Inc., 368 F. Supp. 3d 30, 40 (D.D.C. 2019) (finding that social networks “do not become ‘state actors’ based solely on the provision of their social media networks to the public”). Some view this as antiquated, or have challenged it. See Przemek Palka, Facebook’s Exercise of Public Power, PrzemysLAW.technology (Nov. 3, 2016), https://perma.cc/2N4U-LPEP; Thomas Kadri, Platforms as Blackacres, 68 UCLA L. Rev. (forthcoming 2021).
  • 225Goldberg v. Kelly, 397 U.S. 254, 267 (1970) (requiring minimum procedural safeguards before termination of welfare benefits); Arnett v. Kennedy, 416 U.S. 134, 164 (1974) (Powell, J., concurring) (“Governmental deprivation of such a[ ] [property or liberty] interest must be accompanied by minimum procedural safeguards.”).
  • 226Mathews v. Eldridge, 424 U.S. 319, 335 (1976) (citing Goldberg, 397 U.S. at 263–71).
  • 227See Van Loo, Rise of the Digital Regulator, supra note 10, at 1321.
  • 228See Dawn Carla Nunziato, The Fourth Year of Forgetting: The Troubling Expansion of the Right to Be Forgotten, 39 U. Pa. J. Int’l L. 1011, 1024 (2018) (discussing the consequences when Google’s search engine cannot index information).
  • 229Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017); see also Amélie P. Heldt, Merging the Social and the Public: How Social Media Platforms Could Be a New Public Forum, 46 Mitchell Hamline L. Rev. 997, 1031 (2020) (“By calling social media platforms the ‘modern public square’ . . . , the Court acknowledged the reality of how most people use the Internet.” (quoting Packingham, 137 S. Ct. at 1737)).
  • 230Mathews, 424 U.S. at 335.
  • 231See, e.g., supra Part I (summarizing platform procedure); Bloch-Wehba, supra note 10, at 75 (“[P]latforms routinely reach opposite conclusions about specific instances of online content.”).
  • 232See Amit Datta, Anupam Datta, Jael Makagon, Deirdre K. Mulligan & Michael Carl Tschantz, Discrimination in Online Advertising: A Multidisciplinary Inquiry, 81 Proc. Mach. Learning Rsch. 20, 32 (2018).
  • 233Galen Sherwin & Esha Bhandari, Facebook Settles Civil Rights Cases by Making Sweeping Changes to Its Online Ad Platform, ACLU (Mar. 19, 2019), https://perma.cc/AKZ6-DX3V (summarizing lawsuits filed).
  • 234Telephone Interview with Chris McCabe, supra note 65.
  • 235115 Cong. Rec. 2412 (1969) (statement of Sen. William Proxmire).
  • 236See supra Part I.A.
  • 237See supra notes 56–57 and accompanying text (describing the role of reputation on Amazon).
  • 238See supra Part I.
  • 239Amy J. Schmitz & Colin Rule, The New Handshake: Online Dispute Resolution and the Future of Consumer Protection 52 (2017).
  • 240For example, many of the top twenty Fortune 500 firms are financial institutions and technology companies that would be required to comply with both copyright and chargeback processes.
  • 241See generally, e.g., Van Loo, The Corporation as Courthouse, supra note 10. Of course, there would be diminishing, and at a certain point negative, returns for such investments.
  • 242See id. at 557 (“A focus on short-term revenues, rather than on the lifetime value of the customer, causes many firms to underemphasize the solving of customers’ problems.”).
  • 243See supra Part II.
  • 244See, e.g., Rory Van Loo, The New Gatekeepers: Private Firms as Public Enforcers, 106 Va. L. Rev. 467, 482–84 (2020).
  • 245This Article leaves the constitutional limitations of its proposals to others who are experts on such matters. There would no doubt be legal challenges and limits based on the First Amendment in at least some contexts, most notably with search and social network platforms. The literature on this topic is too vast to list, but for a review of the cases on regulating platform speech, see Daphne Keller, Who Do You Sue? State and Platform Hybrid Power over Online Speech 2, 4 (Aegis Series Paper No. 1902, 2019). For a sense of the broader debates around speech limiting regulation, see generally, for example, Amanda Shanor, The New Lochner, 2016 Wis. L. Rev. 133; Nelson Tebbe, A Democratic Political Economy for the First Amendment, 105 Cornell L. Rev. 959 (2020); Genevieve Lakier, The First Amendment’s Real Lochner Problem, 87 U. Chi. L. Rev. 1241 (2020).
  • 246Mathews, 424 U.S. at 335 (stating that one important factor in assessing due process requirements is “the Government’s interest, including the function involved and the fiscal and administrative burdens that the additional or substitute procedural requirement would entail”).
  • 247Arnett, 416 U.S. at 164 (Powell, J., concurring).
  • 248See, e.g., Rory Van Loo, Regulatory Monitors: Policing Firms in the Compliance Era, 119 Colum. L. Rev. 369, 432–33 (2019) (describing the appeals processes of large regulators).
  • 249See Van Loo, The Corporation as Courthouse, supra note 10, at 560.
  • 250See, e.g., Katsh & Rabinovich-Einy, supra note 42, at 20–21 (pushing for truly virtual dispute resolution that improves efficiency and fairness); Schmitz & Rule, supra note 239, at 52 (explaining how entire dispute resolution processes can rely heavily on algorithms to the benefit of both consumers and businesses).
  • 251Facebook, supra note 90, at 5.
  • 252See The Federalist No. 78, at 398–99 (Alexander Hamilton) (Gary Wills ed., 1982). It is debatable whether the Board’s “highly persuasive” standard is the right standard, as opposed to seeing prior decisions as something closer to binding. On prior court decisions as binding, see generally, for example, Amy Coney Barrett, Statutory Stare Decisis in the Courts of Appeals, 73 Geo. Wash. L. Rev. 2 (2005). Most circuit courts have adopted “law-of-the-circuit” rules that bind subsequent circuit court decisions. Joseph W. Mead, Stare Decisis in the Inferior Courts of the United States, 12 Nev. L.J. 787, 794–95 (2012).
  • 253See, e.g., supra Part I.A (discussing Amazon’s great variance in outcomes).
  • 254See infra Part III.C.2.
  • 255See generally Molly K. Land, The Problem of Platform Law: Pluralistic Legal Ordering on Social Media, in The Oxford Handbook of Global Legal Pluralism 974 (Paul Schiff Berman ed., 2020).
  • 256How much it matters that a platform is different is debatable, but, as long as the business models vary greatly, the institution’s set of interests to be weighed may merit divergent approaches. See supra Part I.
  • 257Raquel Rutledge & Andrew Mollica, TripAdvisor Removed Warnings About Rapes and Injuries at Mexico Resorts, Tourists Say, Milwaukee J. Sentinel (Nov. 1, 2017), https://perma.cc/L3H3-ZGNL.
  • 258Id.
  • 259Id.
  • 260Id.
  • 261Id.
  • 262Rutledge & Mollica, supra note 257.
  • 263See Order Defect Rate, supra note 47.
  • 264See, e.g., u/i4mt3hwin, Amazon A-Z Claim Denied - Any Recourse?, Reddit (Jan. 25, 2017), https://perma.cc/ADQ3-84JS; Khadeeja Safdar & Laura Stevens, Banned from Amazon: The Shoppers Who Make Too Many Returns, Wall St. J. (May 22, 2018), https://www.wsj.com/articles/banned-from-amazon-the-shoppers-who-make-too-many-returns-1526981401 (discussing buyer refunds).
  • 265See, e.g., Hill, supra note 71.
  • 266See Rutledge & Mollica, supra note 257.
  • 267See Nathan S. Chapman & Michael W. McConnell, Due Process as Separation of Powers, 121 Yale L.J. 1672, 1705 (2012) (explaining the importance of tenure and sufficient salaries for judges); Gregory M. Dyer & Brendan Judge, Criminal Defendants’ Waiver of the Right to Appeal—An Unacceptable Condition of a Negotiated Sentence or Plea Bargain, 65 Notre Dame L. Rev. 649, 662 (1990) (asserting that an important function of appeals is “the development of the common law”).
  • 268See supra Part II.
  • 269The design of a sensible standard for precedence is, in itself, a complicated undertaking. See William Baude, Constitutional Liquidation, 71 Stan. L. Rev. 1, 36–42 (2019).
  • 270Facebook, supra note 90, at 3.
  • 271Tim Wu, When Censorship Makes Sense: How YouTube Should Police Hate Speech, New Republic (Sept. 18, 2012), https://perma.cc/XR4S-KHD2. An alternative model is a self-regulatory organization that would monitor and adjudicate internet intermediaries. See Frank Pasquale, Beyond Innovation and Competition: The Need for Qualified Transparency in Internet Intermediaries, 104 Nw. U. L. Rev. 105, 168–69 (2010).
  • 272Anjanette H. Raymond & Abbey Stemler, Trusting Strangers: Dispute Resolution in the Crowd, 16 Cardozo J. Conflict Resol. 357, 382 (2015).
  • 273Id.
  • 274See Resnik, supra note 22, at 1821.
  • 275Smiley v. Citibank, 517 U.S. 735, 740–41 (1996).
  • 276See Katsh & Rabinovich-Einy, supra note 42, at 67.
  • 277Rabinovich-Einy & Katsh, supra note 16, at 183.
  • 278Id. at 184.
  • 279See van Zuylen-Wood, supra note 1.
  • 280Community Standards Enforcement Report, Facebook Transparency, https://transparency.facebook.com/community-standards-enforcement.
  • 281See van Zuylen-Wood, supra note 1 (providing staffing figures).
  • 282U.S. Cts., Chronological History of Authorized Judgeships in U.S. Courts of Appeals 1–14 (2009) (reporting that Congress has authorized a total of 179 appellate judgeships, ranging from 6 positions in the First Circuit to 29 in the Ninth Circuit).
  • 283See Resnik, supra note 22, at 1802.
  • 284See, e.g., Schwarcz, supra note 136, at 761 (summarizing the law related to insurance policyholder grievances).
  • 285See infra Part III.C.3 (discussing aggregation mechanisms).
  • 286Dr. Nicole Baldwin (@drnicolebaldwin), TikTok (Jan. 10, 2020), https://www.tiktok.com/@drnicolebaldwin/video/6780375204574055685.
  • 287Brooke Sjoberg, Pediatrician Gets Death Threats After Pro-Vaccine TikTok Video, The Daily Dot (Jan. 20, 2020), https://perma.cc/KVU2-UA32.
  • 288Renee DiResta (@noUpside), Twitter (Jan. 19, 2020), https://twitter.com/noUpside/status/1218943537653280768.
  • 289Cat Ellis, Google Is Building a New Social Network, TechRadar (July 12, 2019), https://perma.cc/TN8A-8FS7 (describing a pilot project in New York for a regionally based social network).
  • 290Annie Lowrey, Don’t Trust Facebook, The Atlantic (Oct. 24, 2019), https://perma.cc/6Z8R-9QPW (commenting on Facebook’s Libra cryptocurrency project and other financial intermediation steps).
  • 291Streitfeld, supra note 39.
  • 292See Magana, supra note 52.
  • 293See, e.g., Jay Greene, Amazon Reviews Hijacked by Causes, Conspiracies, Rage, Seattle Times (Oct. 31, 2015), https://perma.cc/3STP-ZMAU (describing attacks by conspiracy theorists made in the reviews of a book written by a mother of one of the Sandy Hook shooting victims).
  • 294Elinor Ostrom, Polycentric Systems for Coping with Collective Action and Global Environmental Change, 20 Glob. Envtl. Change 550, 552 (2010) (setting forth Nobel Prize–winning economist Elinor Ostrom’s theory of polycentricity).
  • 295See infra notes 401–02 and accompanying text (explaining how the Administrative Procedure Act and related agency rules are modeled after the Federal Rules).
  • 296Surowitz v. Hilton Hotels Corp., 383 U.S. 363, 373 (1966); see also J. Maria Glover, The Federal Rules of Civil Settlement, 87 N.Y.U. L. Rev. 1713, 1715 (2012) (describing the goal of the Federal Rules as facilitating “truth-seeking” and “the resolution of cases on their merits”).
  • 297See, e.g., David L. Noll, Regulating Arbitration, 105 Calif. L. Rev. 985, 1007 (2017) (“Devices such as notice pleading, liberal discovery, and liberal joinder seek to equalize litigants’ ability to prove their claims and defenses.”).
  • 298Mauro Cappelletti & Bryant Garth, Access to Justice: The Worldwide Movement to Make Rights Effective, in 1 Access to Justice: A World Survey 21, 35–36 (Mauro Cappelletti & Bryant Garth eds., 1978).
  • 299See generally Marc Galanter, Why the “Haves” Come Out Ahead: Speculations on the Limits of Legal Change, 9 Law & Soc’y Rev. 95 (1974).
  • 300Id. at 119–20.
  • 301See, e.g., Van Loo, The Corporation as Courthouse, supra note 10, at 565.
  • 302Michael R. Guerrero, Disputing the Dispute Process: Questioning the Fairness of §1681s-2(a)(8) and §1681j(a)(1)(A) of the Fair and Accurate Credit Reporting Act, 47 Cal. W. L. Rev. 437, 450 (2011).
  • 303See, e.g., Van Loo, The Corporation as Courthouse, supra note 10, at 565–66 (describing technological advances broadly spreading companies’ abilities to identify higher-value users); Sofia Ranchordás, Online Reputation and the Regulation of Information Asymmetries in the Platform Economy, 5 Critical Analysis L. 127, 134 (2018) (describing how platforms use scores to prioritize influencers). See generally Yonathan A. Arbel, Reputation Failure: The Limits of Market Discipline in Consumer Markets, 54 Wake Forest L. Rev. 1239 (2019) (observing the limits of reputation markets).
  • 304See Jon Emont, Amazon’s Heavy Recruitment of Chinese Sellers Puts Consumers at Risk, Wall St. J. (Nov. 11, 2019), https://www.wsj.com/articles/amazons-heavy-recruitment-of-chinese-sellers-puts-consumers-at-risk-11573489075 (detailing a small cleaning products seller who was forced to lay off most of its U.S. staff after cheap counterfeits of its product were sold on Amazon).
  • 305See David Streitfeld, What Happens After Amazon’s Domination Is Complete? Its Bookstore Offers Clues, N.Y. Times (June 23, 2019), https://perma.cc/QPB2-GV5W.
  • 306Id.
  • 307Id.
  • 308Id.
  • 309On the interplay between humans and machines, see Tim Wu, Will Artificial Intelligence Eat the Law? The Rise of Hybrid Social-Ordering Systems, 119 Colum. L. Rev. 2001, 2002 (2019).
  • 310Fed. R. Evid. 706 (providing for court-appointed experts); Fed. R. Civ. P. 53 (providing for special masters).
  • 311See Colin Rule, Quantifying the Economic Benefits of Effective Redress: Large E-Commerce Data Sets and the Cost-Benefit Case for Investing in Dispute Resolution, 34 U. Ark. Little Rock L. Rev. 767, 776 (2012) (finding that buyers who reached amicable dispute resolutions were more likely to return than buyers who simply achieved a full refund in their dispute).
  • 312See id.
  • 313Ethan Katsh & Colin Rule, What We Know and Need to Know About Online Dispute Resolution, 67 S.C. L. Rev. 329, 330 (2016).
  • 31415 U.S.C. § 1666(a); supra Part II.
  • 315See, e.g., Citron & Pasquale, supra note 15, at 28.
  • 316In contrast, the ADR literature emphasizes not only online dispute resolution, but also confidentiality—a core tenet of ADR. See Katsh & Rule, supra note 313, at 330–31.
  • 317See Documentation, Facebook for Developers, https://perma.cc/8KH5-ENAC; Legal Policies, Amazon, https://perma.cc/G7W6-4LXW; Terms of Service, Facebook, https://perma.cc/KMZ7-PMD2 (last updated Oct. 22, 2020).
  • 318See Streitfeld, supra note 39.
  • 319See Kampen, supra note 95.
  • 320See supra Part I.A.
  • 321Id.
  • 322See, e.g., Judith Resnik, Failing Faith: Adjudicatory Procedure in Decline, 53 U. Chi. L. Rev. 494, 501 (1986).
  • 323See 15 U.S.C. § 1681i (laying out procedures for disputing credit report accuracy); supra note 178 and accompanying text (describing credit bureau transparency mandates).
  • 324See generally Visa, supra note 167 (outlining Visa’s automated online process for handling chargebacks and chargeback disputes).
  • 325David Pozen, Transparency’s Ideological Drift, 128 Yale L.J. 100, 156 (2018) (discussing how corporations utilize transparency to access valuable information). For a further discussion of how transparency can fail consumers, see generally, for example, Omri Ben-Shahar & Carl E. Schneider, The Failure of Mandated Disclosure, 159 U. Pa. L. Rev. 647 (2011).
  • 326It should be feasible to design the process so as to protect trade secrets and enhance visibility. Cf. Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343, 1343–44 (2018) (“A criminal trade secret privilege is ahistorical, harmful to defendants, and unnecessary to protect the interests of the secret holder.”).
  • 327On the value of transparency to precedent, see, for example, Kevin E. Davis & Helen Hershkoff, Contracting for Procedure, 53 Wm. & Mary L. Rev. 507, 540, 544 (2011).
  • 328Cf. J. Maria Glover, Disappearing Claims and the Erosion of Substantive Law, 124 Yale L.J. 3052, 3074–76 (2015) (discussing how arbitration and the Supreme Court’s jurisprudence have prevented the vindication and continued evolution of substantive law, particularly in areas that rely on so-called private attorneys general).
  • 329On the potential for small consumer harms to contribute to large issues such as economic inequality, see generally Rory Van Loo, Broadening Consumer Law: Competition, Protection, and Distribution, 95 Notre Dame L. Rev. 211 (2019).
  • 330See Fed. R. Civ. P. 23. Recently, multidistrict litigation—which combines cases from different jurisdictions—has become the tool of choice for aggregation. See 28 U.S.C. § 1407.
  • 331See generally Van Loo, The Corporation as Courthouse, supra note 10.
  • 332Scott Cooper, Colin Rule & Louis Del Duca, From Lex Mercatoria to Online Dispute Resolution: Lessons from History in Building Cross-Border Redress Systems, 43 Unif. Com. Code L.J. 749, 767–68 (2011).
  • 333See Alexander W. Aiken, Comment, Class Action Notice in the Digital Age, 165 U. Pa. L. Rev. 967, 978, 982 (2017) (explaining how traditional methods are often ineffective).
  • 334See id. at 1003 (discussing machine learning as a method for improving the provision of class action notice).
  • 335See, e.g., Jeremy R. McClane, Class Action in the Age of Twitter: A Dispute Systems Approach, 19 Harv. Negot. L. Rev. 213, 240 (2014) (exploring use of technology to improve class actions).
  • 336See supra Part I.
  • 337407 U.S. 67 (1972).
  • 338Id. at 70–71.
  • 339Id. at 86–87.
  • 340Id. at 90.
  • 341Wikipedia emphasized this goal early on. See Hoffman & Mehra, supra note 17, at 162–63.
  • 342See supra note 8 and accompanying text (discussing this case).
  • 343See supra note 73 and accompanying text (discussing this case).
  • 34412 C.F.R. § 226.13(d)(1) (2020).
  • 345See SEC: Madoff Banned from Working Again, CBS News (June 16, 2009), https://perma.cc/6ARP-4J3G (discussing an SEC settlement imposing a lifetime ban on Bernie Madoff from the financial industry for creating a Ponzi scheme that scammed investors out of billions of dollars).
  • 346Bryant v. TRW, Inc., 689 F.2d 72, 79 (6th Cir. 1982) (quoting a lawmaker quoting William Shakespeare).
  • 347See supra Part I.A.
  • 348See Philip K. Dick, The Minority Report (1956) (depicting a dystopian world in which the government imprisons people because it believes they will commit crimes); supra note 105 and accompanying text (describing Airbnb practices).
  • 349See supra Part I.D.
  • 350See Fed. R. Evid. 404(b), 802. Granted, in some limited contexts hearsay and reputation evidence are allowed. See Fed. R. Evid. 405, 803–04.
  • 351See supra Part II.B.
  • 352Cf. Elizabeth Doyle O’Brien, Minimizing the Risk of the Undeserved Scarlet Letter: An Urgent Call to Amend § 1681e(b) of the Fair Credit Reporting Act, 57 Cath. U. L. Rev. 1217, 1220 (2008).
  • 353No. 08-6306-AA, 2009 WL 2252105 (D. Or. July 27, 2009).
  • 354See id. at *1.
  • 355See id.
  • 356See id.
  • 357See id.
  • 358See generally Tyler, supra note 16 (showing the importance of trustworthiness for legitimacy).
  • 359See supra note 257 and accompanying text (discussing TripAdvisor’s takedowns).
  • 360See Pasquale, supra note 106, at 138 (calling for visibility into search results).
  • 361See generally, e.g., Mulligan & Griffin, supra note 127 (providing examples of Google allowing hate speech and Holocaust denials); Hall, supra note 6 (discussing one law student’s experience).
  • 362Larsen, supra note 92, at 223.
  • 363Yanbo Ge, Christopher R. Knittel, Don MacKenzie & Stephen Zoepf, Racial and Gender Discrimination in Transportation Network Companies 16 (Nat’l Bureau of Econ. Rsch., Working Paper No. 22776, 2016), https://perma.cc/7LSC-CXXG (finding longer wait times and higher cancellation rates by Uber drivers for users with “African American–sounding names”).
  • 364See supra note 105 and accompanying text (explaining predictive technologies’ usage).
  • 365See supra Part III.B.2.
  • 366See, e.g., Rigby v. FIA Card Servs., N.A., 490 F. App’x 230, 236–37 (11th Cir. 2012); Dennis, 520 F.3d at 1068; Burnstein, 208 F. Supp. 2d at 775.
  • 36715 U.S.C. § 1681n(a)(2).
  • 368For an approach rooted in deterrence, courts could award punitive damages “if, and only if, an injurer has a significant chance of escaping liability for the harm he caused.” A. Mitchell Polinsky & Steven Shavell, Punitive Damages: An Economic Analysis, 111 Harv. L. Rev. 869, 870 (1998).
  • 36915 U.S.C. § 1681n(a)(1).
  • 370Id.
  • 37115 U.S.C. § 1681n(a)(1)(A) (providing a cap); see Miller v. Wells Fargo & Co., No. 3:05-CV-42-S, 2008 WL 793683, at *7 n.7 (W.D. Ky. Mar. 24, 2008) (finding plaintiffs’ actual damages arguments unpersuasive).
  • 372See, e.g., Hadfield, supra note 36, at 83–84.
  • 373See Alexandra P. Everhart Sickler, The (Un)Fair Credit Reporting Act, 28 Loy. Consumer L. Rev. 238, 256 (2016) (noting that the FCRA “limit[s] consumers’ ability” to receive damages by “imposing procedural hurdles that are difficult to satisfy”).
  • 374Mikella Hurley & Julius Adebayo, Credit Scoring in the Era of Big Data, 18 Yale J.L. & Tech. 148, 178 (2016) (quotation marks omitted).
  • 375Id. at 189–90.
  • 376For a similar proposal for regulatory auditing of complaints, see Van Loo, The Corporation as Courthouse, supra note 10, at 595–97.
  • 377Scholars have called for audit trails in other contexts. See, e.g., Citron & Pasquale, supra note 15, at 28 (calling for audit trails for automated scoring, such as for credit scores).
  • 378See 14 C.F.R. § 259.5(c) (2020).
  • 379Bart Jansen, Delta Fined for Violating Bumping Rules – Again, USA Today (June 26, 2013), https://perma.cc/J7ZH-FS5V.
  • 380See Van Loo, supra note 248, at 436–40.
  • 381See Van Loo, supra note 32, at 1620 (noting that financial institutions are some of the most heavily regulated businesses).
  • 382See generally Mark MacCarthy, A Consumer Protection Approach to Platform Content Moderation in the United States, in Fundamental Rights Protection Online: The Future Regulation of Intermediaries 115 (Bilyana Petkova & Tuomas Ojanen eds., 2020).
  • 383Federal Judicial Caseload Statistics 2019, U.S. Cts., https://perma.cc/K8X7-4QN4.
  • 384Administrative agencies come with greater risk of capture than do courts. On the problem of capture, see generally, for example, Rachel E. Barkow, Insulating Agencies: Avoiding Capture Through Institutional Design, 89 Tex. L. Rev. 15 (2010).
  • 385See David S. Evans, Governing Bad Behavior by Users of Multi-Sided Platforms, 27 Berkeley Tech. L.J. 1201, 1219 (2012).
  • 386The federal courts’ procedures are largely inert, changing only rarely. See, e.g., Carl Tobias, Civil Justice Reform Sunset, 1998 U. Ill. L. Rev. 547, 551, 598–600.
  • 387Howell E. Jackson, Variation in the Intensity of Financial Regulation: Preliminary Evidence and Potential Implications, 24 Yale J. on Reg. 253, 270 (2007).
  • 388George A. Akerlof, The Market for “Lemons”: Quality Uncertainty and the Market Mechanism, 84 Q.J. Econ. 488, 488, 495 (1970).
  • 389See, e.g., Van Loo, supra note 248, at 371–72 (summarizing the history of interventions necessitating regulation); Natalie Kitroeff, David Gelles & Jack Nicas, The Roots of Boeing’s 737 Max Crisis: A Regulator Relaxes Its Oversight, N.Y. Times (July 27, 2019), https://perma.cc/4VVY-EUAX.
  • 390See Paul Ohm, We Couldn’t Kill the Internet If We Tried, 130 Harv. L. Rev. F. 79, 79, 84 (2016); Solum, supra note 13, at 57–58 (summarizing the early literature on the internet).
  • 391See Van Loo, supra note 244, at 496–97 (explaining how the largest U.S. companies, which operate in banking, technology, oil, and pharmaceuticals, serve as enforcers).
  • 392For instance, credit card companies began to automate chargebacks and provide online forms while operating under only vague legal mandates. See supra Part II.A.
  • 393Larsen, supra note 92, at 177.
  • 394Alternatively, procedural rules may be less inefficient than other regulatory approaches. On lawmakers’ decision whether to impose platform liability, see generally Frank Fagan, Optimal Social Media Content Moderation and Platform Immunities, 50 Eur. J. L. & Econ. 437 (2020).
  • 395See Cohen, Law for the Platform Economy, supra note 10, at 136.
  • 396See Van Loo, The Corporation as Courthouse, supra note 10, at 563.
  • 397Michael Levenson, Pelosi Clashes with Facebook and Twitter over Video Posted by Trump, N.Y. Times (Feb. 8, 2020), https://perma.cc/8GS2-ZWXR.
  • 398Id.
  • 399Id.
  • 400David McCabe & Davey Alba, Facebook Says It Will Ban ‘Deepfakes’, N.Y. Times (Jan. 7, 2020), https://perma.cc/F8Y7-68TD.
  • 4015 U.S.C. §§ 554, 556–557.
  • 402See, e.g., Rules of Practice and Procedure for Administrative Hearings Before the Office of Administrative Law Judges, 29 C.F.R. pt. 18 (2020) (providing procedural rules, modeled after the civil rules of procedure, for the U.S. Department of Labor’s Office of Administrative Law Judges). See generally Thomas W. Merrill, Article III, Agency Adjudication, and the Origins of the Appellate Review Model of Administrative Law, 111 Colum. L. Rev. 939 (2011) (studying administrative law appeals).
  • 403The omission arises because existing treatments do not focus on dispute resolution. See, e.g., Citron & Pasquale, supra note 15, at 28 (applying due process to algorithmic scoring).
  • 404The biggest difference may be the greater willingness in the agency context to have an appeals structure that is not organizationally removed from the original adjudicator. See supra Part III.A.
  • 405Cf. Nicholas Bagley, The Procedure Fetish, 118 Mich. L. Rev. 345, 349, 387–93 (2019) (discussing how repeat industry players benefit from excess procedure).
  • 406See Tom R. Tyler, The Psychological Consequences of Judicial Procedures: Implications for Civil Commitment Hearings, 46 SMU L. Rev. 433, 433 (1992).
  • 407Tyler, supra note 16, at 5.
  • 408Id. at 9.
  • 409See id. (showing more broadly the value of procedural justice to the law).
  • 410See Van Loo, The Corporation as Courthouse, supra note 10, at 555.
  • 411Cf. Colin Camerer, Samuel Issacharoff, George Loewenstein, Ted O’Donoghue & Matthew Rabin, Regulation for Conservatives: Behavioral Economics and the Case for “Asymmetric Paternalism”, 151 U. Pa. L. Rev. 1211, 1212–13 (2003) (“[P]aternalism prevents people from behaving in their own best interests.”).
  • 412U.S. Const. art. III, § 1, cl. 1.