To put it another way, in this view there's an ultimate *meta*-criterion for one's own personal criterion for optimality. Ideal rationality means conforming to a criterion drawn from that family of criteria. But you get to set all the free variables to whatever you want.
Replying to @catherineols
Interesting… What is the ultimate meta-criterion?
Replying to @Meaningness
A raw attempt at scaffolding, then answer: You're a human. You're not an abstract mathematical agent, but if you were, you'd have a single goal and you'd maximize it to the best of your ability. You sometimes act like you have goals. Other times, you act inconsistently.
Replying to @catherineols @Meaningness
Of course, any sequence of actions can be modeled as maximizing some latent goal: for example, "output those actions in that order". Nailed it! But... you, human, you have an intuition that you really want something (or somethings!) more than that. What is it, I wonder?
Replying to @catherineols @Meaningness
Let's pretend you actually do have just one goal, X. It could be whatever; up to you. To the extent your actions are not actually the ones that would maximize X, you're leaving value on the table. You could have more X if you acted differently. Instead, sometimes you act to reduce X.
Replying to @catherineols @Meaningness
Acting *more* rationally, in this framework, means acting in ways that more consistently bring you to maximize/optimize some X = "the stuff you really want". You can fill in X. Economics at that level is the meta-criterion.
Replying to @catherineols @Meaningness
Is there some ultimate, universal X? Heck no. That's up to *you*. Sure, some are more likely for a human to want. If you were to assert that "no, no really, the highest goal I have is to perform the actions I perform in the order I perform them" then I would be really skeptical.
Replying to @catherineols
So, then, would you take my definition as accurate, with the variable bindings: Criterion = maximize whatever you want; Method = take whatever action has the highest value of probability * desirability of outcome?
Replying to @Meaningness
Yes to criterion. Method... not quite? If you as a human were to run the algorithm "take whatever action you compute to have the highest value of probability * desirability" you're gonna have problems. CFAR's method is like ... introspect? write stuff down? remember to sleep?
Replying to @catherineols @Meaningness
Why those methods? *Because* when you use them, you tend to satisfy the criterion. But they're a grab bag of methods, not one weird trick, because you are a human, not AIXI (https://en.wikipedia.org/wiki/AIXI). If you try to act like AIXI you'll have a bad time.
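As an aside, the formal criterion being discussed ("take whatever action has the highest value of probability * desirability") is easy to make concrete. This is a minimal illustrative sketch, not anything from the conversation itself; the action names and numbers are hypothetical, and the thread's whole point is that a human should *not* literally run this loop:

```python
# Sketch of the formal criterion from the thread: choose the action that
# maximizes probability * desirability. Actions and values are made up
# purely for illustration.

def best_action(actions):
    """Return the action with the highest probability * desirability."""
    return max(actions, key=lambda a: a["probability"] * a["desirability"])

actions = [
    {"name": "write stuff down",  "probability": 0.90, "desirability": 5},   # 4.5
    {"name": "one weird trick",   "probability": 0.10, "desirability": 50},  # 5.0
    {"name": "remember to sleep", "probability": 0.95, "desirability": 6},   # 5.7
]

print(best_action(actions)["name"])  # prints "remember to sleep"
```

The grab-bag point above is that humans can't reliably compute these probabilities and desirabilities in the first place, which is why the practical methods look nothing like this two-line argmax.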
Yes, I think understanding rationality as a “grab bag of methods” is really important. It does go beyond “rationalism” as I’m defining it for the purpose of the particular document I’m working on.
Replying to @Meaningness
Got it. I assumed that "one ought maximize one's EV" and/or "the collective philosophy of the CFAR founders" would count. I think I hear you as saying that formality is important to this document. In which case, my advice is to emphasize that in this paragraph.
Replying to @catherineols
Thanks, yes, this passage comes right after one that explicitly distinguishes the formal and informal senses of “rational.” Out-of-context chunk of a full-length book, so some unclarity is inevitable!