Algorithms have found their way into courtrooms, college admissions committees, and human resources departments. While defendants and other disappointed parties have challenged the use of algorithms on due process and similar grounds, it should be expected that they will also challenge the algorithms' accuracy and attempt to present algorithms of their own in order to contest the decisions of judges and other authorities. The problem with this approach is that people who can transparently see why an algorithm denied them rights or resources can manipulate it by retrofitting data. Demands by policymakers and legal scholars for full algorithmic transparency are therefore misguided. To overcome algorithmic manipulation, we present the novel solution of algorithmic competition. This approach, versions of which have been deployed in finance, would work well in law. We show how the state, a university, or an employer should set aside untested data in a lockbox. Parties to a decision then develop their respective algorithms and compete. The algorithm that performs best on the lockbox data wins. While this approach presents several complications, which this Article discusses in detail, it is superior to full disclosure of data and algorithmic transparency.
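The lockbox procedure resembles held-out evaluation in machine learning: because neither party can inspect the sealed records, neither can retrofit its algorithm to them. A minimal sketch in Python, using a synthetic dataset and two hypothetical party-submitted predictors (the data, the ground-truth rule, and both predictors are illustrative assumptions, not anything from the Article):

```python
import random

random.seed(0)

# Synthetic records: (feature, outcome). The ground-truth rule below stands
# in for the real-world pattern; the parties never see it directly.
def make_record():
    x = random.random()
    y = 1 if x > 0.6 else 0  # hypothetical true decision rule
    return (x, y)

records = [make_record() for _ in range(1000)]

# The decision-maker splits the data: parties may study the public portion,
# while the lockbox stays sealed until the competition is scored.
public_data, lockbox = records[:800], records[800:]

# Each party fits a model to public_data and submits a predictor.
# These two threshold rules are placeholders for arbitrary algorithms.
def party_a(x):
    return 1 if x > 0.6 else 0

def party_b(x):
    return 1 if x > 0.5 else 0

def lockbox_score(predict):
    # Accuracy on the sealed records is the sole criterion.
    return sum(predict(x) == y for x, y in lockbox) / len(lockbox)

scores = {"A": lockbox_score(party_a), "B": lockbox_score(party_b)}
winner = max(scores, key=scores.get)
print(f"scores: {scores}, winner: {winner}")
```

Because party A's rule happens to match the synthetic ground truth, it scores perfectly on the lockbox and wins; party B's coarser rule is penalized on exactly the sealed records it could not tune against.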