My current working definition of overoptimization goes like this: overoptimization has occurred when the introduction of immense amounts of information into a human system produces conditions that allow some players within that system to maximize their comparative advantage, without overtly breaking the rules, in a way that (intentionally or not) creates meaningful negative social consequences. I want to argue that many human systems in the 2020s have become overoptimized in this way, and that the social ramifications are often bad.