Big Data and Bad Data: On the Sensitivity of Security Policy to Imperfect Information
In this Essay, we examine some of the factors that make developing a “science of security” a significant research and policy challenge. We focus on how the empirical hurdles of missing data, inaccurate data, and invalid inferences can significantly impact—and sometimes impair—the security decisionmaking processes of individuals, firms, and policymakers. We offer practical examples of the sensitivity of policy modeling to those hurdles and highlight the relevance of these examples in the context of national security.
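To make the sensitivity claim concrete, the sketch below (purely illustrative, with hypothetical numbers that are not drawn from the Essay) shows how one of the hurdles named above, missing data in the form of unreported incidents, can flip a simple cost-benefit security decision.

```python
# Illustrative sketch only, not the authors' model: hypothetical numbers
# showing how underreported incident data can flip a security decision.

def expected_loss(breach_prob, loss_if_breach, control_cost, control_effect):
    """Expected annual loss without the control vs. with it."""
    without = breach_prob * loss_if_breach
    with_control = control_cost + breach_prob * (1 - control_effect) * loss_if_breach
    return without, with_control

true_rate = 0.020        # true annual breach probability (hypothetical)
observed_rate = 0.012    # estimate after 40% of incidents go unreported
loss = 5_000_000         # loss per breach (hypothetical)
cost = 40_000            # annual cost of the control (hypothetical)
effect = 0.5             # control halves breach probability (hypothetical)

for label, rate in [("true data", true_rate), ("underreported data", observed_rate)]:
    without, with_control = expected_loss(rate, loss, cost, effect)
    decision = "adopt control" if with_control < without else "skip control"
    print(f"{label}: no-control loss ${without:,.0f}, "
          f"with-control loss ${with_control:,.0f} -> {decision}")
```

On the true data the control pays for itself; on the underreported data the same arithmetic counsels against it, even though nothing about the underlying risk has changed.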
Beware dark patterns. The name should be a warning, perhaps alluding to the dark web, the “Dark Lord” Sauron, or another archetypically villainous and dangerous entity. Rightfully included in this nefarious bunch, dark patterns are software interfaces that manipulate users into doing things they would not otherwise do. Regulating them, however, means regulating the words and design choices through which companies communicate with users, and that entanglement with expression raises First Amendment complications. Because of these complications, the constitutionality of dark pattern restrictions is an unsettled question. To begin constructing an answer, we must look at how dark patterns are regulated today, how companies have begun to challenge the constitutionality of such regulations, and where dark patterns fall in the grand scheme of free speech. Taken together, these steps inform an approach to regulation going forward.
Search costs matter and are reflected in many areas of law. For example, most disclosure requirements economize on search costs. A homeowner who must disclose the presence of termites saves a potential buyer, and perhaps many such buyers, from spending money to search, or inspect, the property. Similarly, requirements to reveal expected miles per gallon, or risks posed by a drug, economize on search costs. But these examples point to simple strategies and costs that can be minimized or entirely avoided with some legal intervention. Law can do better and take account of more subtle things once sophisticated search strategies are understood. This Essay introduces such search strategies and their implications for law.
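As a purely illustrative aside, and a textbook sequential-search model rather than the Essay's own framework, the sketch below shows how the per-inspection search cost shapes a searcher's behavior: offers are drawn uniformly on [0, 1], each inspection costs c, and the searcher stops at the first offer above the reservation value r, which solves c = E[max(X − r, 0)] = (1 − r)²⁄2.

```python
# Illustrative textbook sequential-search model (not the Essay's framework).
# Offers are uniform on [0, 1]; each inspection costs c; the searcher stops
# at the first offer above the reservation value r = 1 - sqrt(2c).

import math

def reservation_value(c):
    """Reservation value for uniform[0,1] offers and per-search cost c."""
    return 1 - math.sqrt(2 * c)

def expected_inspections(c):
    """Expected number of inspections before stopping: 1 / P(X > r)."""
    return 1 / (1 - reservation_value(c))

for c in (0.08, 0.02, 0.005):   # e.g., disclosure rules that lower search costs
    r = reservation_value(c)
    print(f"search cost {c:.3f}: reservation value {r:.3f}, "
          f"expected inspections {expected_inspections(c):.1f}")
```

In this stylized model, lowering search costs (as disclosure rules do) raises the reservation value, so searchers become pickier and end up better matched, one example of the subtler effects that come into view once search strategies are modeled explicitly.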
Special thanks to Mario Barnes, Courtney Douglas, Paul Gowder, Deborah Turkheimer, to the audience at Northwestern Law’s Julian Rosenthal Lecture, and to Miranda Coombe, Sam Hallam, Caroline Kassir, and Danielle O’Connell for superb editing. Adeleine Lee and Alex Wilfert provided excellent research assistance. The authors contributed equally to this Essay.
Antidemocratic forces rely on intimidation tactics to silence criticism and opposition. Today’s intimidation playbook follows a two-step pattern. We surface these tactics so their costs to public discourse and civic engagement can be fully understood. We show how the misappropriation of the concept of online abuse has parallels in other efforts at conceptual diversion that dampen democratic guarantees. Democracy’s survival requires creative solutions. Politicians and government workers must be able to operate free from intimidation. Journalists and researchers must be able to freely investigate governmental overreach and foreign malign influence campaigns that threaten the democratic process. Surfacing the two-step strategy is a critical start to combating it.