My understanding is that various ‘coherence’ arguments exist of the form: if your preferences diverge from being representable by a utility function in some way, then you will do strictly worse in some respect than you would with preferences that were so representable. For instance, you will lose money, for nothing. You have good reason not to do that / don’t do that / you should predict that reasonable creatures will stop doing that if they notice that they are doing it…
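The "lose money, for nothing" scenario is usually illustrated with a money pump: an agent with cyclic (hence non-utility-representable) preferences can be led around a trading cycle, paying a small fee at each step, until it ends up holding exactly what it started with, minus the fees. A minimal sketch, assuming a hypothetical three-item cycle and a fixed fee per trade (names and numbers here are illustrative, not from the source):

```python
# Sketch of the classic "money pump" against cyclic preferences.
# Assumed setup: an agent prefers B over A, C over B, and A over C,
# and will pay a small fee to trade its current item for any
# strictly preferred one.

def prefers(have, offered):
    """Cyclic preference relation: A < B < C < A (assumed for illustration)."""
    cycle = {("A", "B"), ("B", "C"), ("C", "A")}
    return (have, offered) in cycle

def run_money_pump(start_item, fee, rounds):
    """Offer the next item in the cycle each round, charging `fee` per accepted trade."""
    offers = {"A": "B", "B": "C", "C": "A"}
    item, money = start_item, 0.0
    for _ in range(rounds):
        offered = offers[item]
        if prefers(item, offered):      # agent accepts every "upgrade"
            item, money = offered, money - fee
    return item, money

item, money = run_money_pump("A", fee=1.0, rounds=9)
# After nine trades the agent holds the item it started with,
# having paid nine fees for no net change -- money lost, for nothing.
```

The point of the argument is that no utility function can rationalize the `prefers` relation above, and any agent acting on it is exploitable in exactly this way; an agent whose preferences were representable by a utility function would refuse at least one trade in the cycle.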