Beware dark patterns. The name should be a warning, perhaps alluding to the dark web, the “Dark Lord” Sauron, or another archetypically villainous and dangerous entity. Rightfully included in this nefarious bunch, dark patterns are software interfaces that manipulate users into doing things they would not normally do. Yet regulating these interfaces means regulating how companies communicate with their users, and that communication may itself be protected speech. Because of these First Amendment complications, the constitutionality of dark pattern restrictions is an unsettled question. To begin constructing an answer, we must look at how dark patterns are regulated today, how companies have begun to challenge the constitutionality of such regulations, and where dark patterns fall in the grand scheme of free speech. Taken together, these steps inform an approach to regulation going forward.
This Case Note explores the possibility that, in a world where TikTok is banned or heavily regulated, individual TikTok users could sue states under a Takings Clause theory. Any such cases would have to wrestle with two core questions: (1) whether account holders have an actionable property interest in their accounts; and (2) if so, whether permanently and totally depriving users of access to their accounts constitutes a taking.
The First Amendment prohibits the state from “establish[ing]” a religion, and it is uncontroversial that this prohibition extends to so-called religious coercion.
This past term, the Supreme Court in Kennedy v. Bremerton School District (2022) formally overturned the notorious Lemon test that had governed Establishment Clause jurisprudence for more than a half-century.
In his quixotic bid to buy and reform Twitter, Elon Musk swiftly arrived at the same place nearly every tech mogul does: he doesn’t want censorship, but he does want to be able to suppress some legal speech.
The internet has drastically altered our notion of the press.
This term, the Supreme Court is scheduled to hear Kristin Biel’s case.