Understanding Equal Sovereignty
I would like to thank Jack Brake, Anne Marie Hawley, and Jonah Klausner for their thoughtful edits and Jake Holland for his indispensable advice all throughout the drafting process.
Illinois’s Biometric Information Privacy Act (BIPA) is the country’s most powerful law governing biometric data—data generated from an individual’s biological characteristics, like fingerprints and voiceprints. Over the past decade, BIPA has garnered a reputation as an exceptionally plaintiff-friendly statute. But from 2023 to 2024, the Illinois legislature, the Illinois Supreme Court, and the Ninth Circuit Court of Appeals all sided with BIPA defendants for the first time. Most significantly, in Zellmer v. Meta Platforms, Inc., the Ninth Circuit dismissed the plaintiff’s BIPA claim because the face scan collected by the defendant could not be used to identify him.
It is unclear whether these developments represent a trend or an exception to BIPA’s plaintiff-friendliness. Which path is charted will largely turn on how courts interpret Zellmer: while Zellmer established that a biometric identifier must be able to identify an individual, lower courts have construed its holding narrowly, requiring that the entity collecting the biometric data itself be capable of identifying the individual rather than treating it as sufficient that any entity could do so. Reading BIPA this narrowly would significantly weaken the statute’s protections.
After detailing how employer and consumer cases catalyzed this recent defendant-friendly shift, this Comment proposes a two-step framework for determining whether a biometric identifier is able to identify an individual and thus falls within BIPA’s reach. Given BIPA’s broad influence, where courts ultimately land on this question will be crucial to the protection of biometric data nationwide.
I would like to thank Professor Lior Strahilevitz and the editors and staff of the University of Chicago Law Review for their thoughtful advice and insight.
Recently, many states have reacted to the growing data economy by passing data privacy statutes. These follow the “interaction model”: they allow consumers to exercise privacy rights against firms by directly interacting with them. But data brokers—firms that buy and sell data about consumers with whom they do not directly interact—are key players in the data economy. How is a consumer meant to exercise their rights against a broker when an “interaction gap” separates the two?
A handful of states have tried to soften the interaction gap by enacting data-broker-specific legislation under the “transparency model.” These laws, among other things, require brokers to publicly disclose themselves in state registries. The theory is that consumers would exercise their rights against brokers if they knew of the brokers’ existence. California recently went further with the Delete Act, which provides consumers with data-broker-specific privacy rights.
Assembling brokers’ reported privacy request metrics, this Comment performs an empirical analysis of the transparency model’s efficacy. The findings demonstrate that the transparency model does not effectively help consumers follow through on their expected privacy preferences or meaningfully affect brokers. Regulators should therefore follow in the footsteps of the Delete Act and move beyond the transparency model.
Thanks to the editors of The University of Chicago Law Review for their help with this piece.
During the COVID-19 pandemic, many businesses transitioned to remote work for some or all of their employees, relying on videoconference platforms like Zoom and Microsoft Teams for communication.