For data, the whole is greater than the sum of its parts. There may be millions of people with the same birthday, but how many of them also have a dog, a red car, and two kids? The more data is aggregated, the more identifying it becomes. Accordingly, the law has developed safe harbors for firms that take steps to prevent aggregation of the data they sell. A firm might, for instance, anonymize data by removing identifying information. But as computer scientists have shown, clever de-anonymization techniques enable motivated actors to unmask identities even after such steps are taken. Data brokers collect, process, and sell data. Courts have traditionally calculated data-brokering harms without considering the larger data ecosystem. This Comment suggests that a broader conception is needed because the harm caused by one broker’s conduct depends on how other brokers behave. De-anonymization techniques, for instance, often cross-reference datasets to infer missing data, and a motivated actor can buy datasets from multiple brokers and combine them. This Comment then offers a framework for courts to consider these “network harms” when adjudicating the Federal Trade Commission’s (FTC) recent lawsuits against data brokers, brought under its Section 5 authority to prevent unfair acts or practices.
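To make the cross-referencing mechanism concrete, the sketch below illustrates, in deliberately simplified form, how a linkage attack of the kind described above might work: a dataset sold without names is joined to an identified auxiliary dataset on shared quasi-identifiers such as birthday, car color, and number of children. The field names, records, and choice of quasi-identifiers are hypothetical and purely illustrative; they are not drawn from any dataset or filing discussed in this Comment.

```python
# Illustrative linkage attack: re-identify rows in an "anonymized" dataset by
# cross-referencing quasi-identifiers against an identified auxiliary dataset.
# All records and field names are hypothetical.

from collections import defaultdict

# Data one broker might sell with names stripped ("anonymized").
anonymized = [
    {"birthday": "1990-04-12", "car_color": "red", "num_kids": 2, "has_dog": True, "purchases": ["camera"]},
    {"birthday": "1990-04-12", "car_color": "blue", "num_kids": 0, "has_dog": False, "purchases": ["skis"]},
]

# Identified auxiliary data, e.g., bought from a second broker or scraped.
auxiliary = [
    {"name": "A. Smith", "birthday": "1990-04-12", "car_color": "red", "num_kids": 2, "has_dog": True},
    {"name": "B. Jones", "birthday": "1985-07-01", "car_color": "red", "num_kids": 1, "has_dog": False},
]

QUASI_IDENTIFIERS = ("birthday", "car_color", "num_kids", "has_dog")

def key(record):
    """Combination of quasi-identifiers used to link records across datasets."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the auxiliary dataset by quasi-identifier combination.
index = defaultdict(list)
for person in auxiliary:
    index[key(person)].append(person["name"])

# Link: a unique match on all quasi-identifiers re-identifies the anonymized row.
for row in anonymized:
    matches = index.get(key(row), [])
    if len(matches) == 1:
        print(f"Re-identified {matches[0]!r}: purchases={row['purchases']}")
```

Run as written, the sketch re-identifies the first anonymized record as “A. Smith,” because that combination of quasi-identifiers is unique in the auxiliary data. The point is structural rather than technical: neither dataset alone reveals the identity attached to the purchase history, but the two in combination do, which is why the harm from one broker’s sale depends on what other brokers make available.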