Technology and Law

Comment
Volume 91.7
Network Harms
Andy Z. Wang
B.S. 2022, San Jose State University; J.D. Candidate 2025, The University of Chicago Law School.

I would like to thank Professor Omri Ben-Shahar for his tremendous guidance and advice. Thank you to the editors and staff of the University of Chicago Law Review for their tireless editing support. A special thank you to Eric Haupt, Jack Brake, Karan Lala, Tanvi Antoo, Luke White, Jake Holland, Bethany Ao, Emilia Porubcin, Benjamin Wang, and Anastasia Shabalov for their invaluable insights and contributions along the way.

For data, the whole is greater than the sum of its parts. There may be millions of people with the same birthday. But how many also have a dog, a red car, and two kids? The more data is aggregated, the more identifying it becomes. Accordingly, the law has developed safe harbors for firms that take steps to prevent aggregation of the data they sell. A firm might, for instance, anonymize data by removing identifying information. But as computer scientists have shown, clever de-anonymization techniques enable motivated actors to unmask identities even in anonymized data. Data brokers are firms that collect, process, and sell such data. Courts have traditionally calculated data-brokering harms without considering the larger data ecosystem. This Comment suggests that a broader conception is needed because the harm caused by one broker’s conduct depends on how other brokers behave. De-anonymization techniques, for instance, often cross-reference datasets to make guesses about missing data. A motivated actor can also buy datasets from multiple brokers and combine them. This Comment then offers a framework for courts to consider these “network harms” in the Federal Trade Commission’s (FTC’s) recent lawsuits against data brokers under its Section 5 authority to prevent unfair acts or practices.
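
To make the aggregation point concrete, the following is a minimal, hypothetical Python sketch (not drawn from the Comment) of a linkage attack: two brokers each sell a dataset that looks harmless on its own, but joining the datasets on shared quasi-identifiers such as birthday, pet ownership, and car color re-identifies the people behind the “anonymized” records. All names, fields, and values are invented for illustration.

# Hypothetical linkage attack: combining two brokers' datasets on
# quasi-identifiers defeats simple anonymization.
import pandas as pd

# Broker A sells purchase histories with names removed but quasi-identifiers kept.
broker_a = pd.DataFrame({
    "birthday": ["1990-03-14", "1990-03-14", "1985-07-02"],
    "has_dog": [True, False, True],
    "car_color": ["red", "blue", "red"],
    "purchases": ["fitness gear", "baby products", "medication"],
})

# Broker B sells a marketing list that still includes names.
broker_b = pd.DataFrame({
    "name": ["Alice Smith", "Bob Jones"],
    "birthday": ["1990-03-14", "1985-07-02"],
    "has_dog": [True, True],
    "car_color": ["red", "red"],
})

# Joining on the shared quasi-identifiers links "anonymous" purchase
# histories back to named individuals.
reidentified = broker_a.merge(broker_b, on=["birthday", "has_dog", "car_color"])
print(reidentified[["name", "purchases"]])

The sketch shows why the harm from one broker’s sale depends on what other brokers sell: neither dataset identifies anyone alone, yet their combination does.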

Article
Volume 88.2
Competing Algorithms for Law: Sentencing, Admissions, and Employment
Saul Levmore
William B. Graham Distinguished Service Professor of Law, The University of Chicago Law School.

We benefited from discussions with colleagues at a University of Chicago Law School workshop and with Concetta Balestra Fagan and Eliot Levmore.

Frank Fagan
Associate Professor of Law, EDHEC Business School, France.

When the past is thought to predict the future, it is unsurprising that machine learning, with access to large data sets, wins prediction contests against individual decisionmakers, including judges. Just as computers predict next week’s weather better than any human working alone, at least one study shows that machine learning makes better decisions than judges do when deciding whether to grant bail.