IME with CFAR rationality, "ultimate" is giving the wrong impression - it sounds too universal. I'd argue that CFAR endorses a criterion *schema* around "what you personally want", but it's intended for everyone to substitute x=(you personally), after which it's highly subjective
Replying to @catherineols @Meaningness
To put it another way, in this view there's an ultimate *meta*-criterion for one's own personal criterion for optimality. Ideal rationality means conforming to a criterion drawn from that family of criteria. But you get to set all the free variables to whatever you want.
Replying to @catherineols
Interesting… What is the ultimate meta-criterion?
Replying to @Meaningness
A raw attempt at scaffolding, then answer: You're a human. You're not an abstract mathematical agent, but if you were, you'd have a single goal and you'd maximize it to the best of your ability. You sometimes act like you have goals. Other times, you act inconsistently.
Replying to @catherineols @Meaningness
Of course, any sequence of actions can be modeled as maximizing some latent goal: for example, "output those actions in that order". Nailed it! But... you, human, you have an intuition that you really want something (or somethings!) more than that. What is it, I wonder?
Replying to @catherineols @Meaningness
Let's pretend you actually do have just one goal, X; it could be whatever, up to you. To the extent your actions are not actually the ones that would maximize X, you're leaving value on the table. You could have more X if you acted differently. Instead, sometimes you act to reduce X.
Replying to @catherineols @Meaningness
Acting *more* rationally in this framework means acting in ways that more consistently maximize/optimize some X="the stuff you really want". You can fill in X. Economics at that level is the meta-criterion.
Replying to @catherineols @Meaningness
Is there some ultimate, universal X? Heck no. That's up to *you*. Sure, some are more likely for a human to want. If you were to assert that "no, no really, the highest goal I have is to perform the actions I perform in the order I perform them" then I would be really skeptical.
Replying to @catherineols
So, then, would you take my definition as accurate, with the variable bindings: Criterion = maximize whatever you want; Method = take whatever action has the highest value of probability * desirability of outcome?
Replying to @Meaningness
Yes to criterion. Method... not quite? If you as a human were to run the algorithm "take whatever action you compute to have the highest value of probability * desirability" you're gonna have problems. CFAR's method is like ... introspect? write stuff down? remember to sleep?
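The "probability * desirability" decision rule debated above can be sketched as a toy expected-value calculation. This is only an illustration of the abstract rule under discussion, not anything CFAR actually teaches; the action names and numbers are invented:

```python
# Toy sketch of "take whatever action has the highest value of
# probability * desirability of outcome". Actions and numbers are
# hypothetical, for illustration only.

def expected_value(action):
    """Sum probability * desirability over an action's possible outcomes."""
    return sum(p * d for p, d in action["outcomes"])

def choose_action(actions):
    """Pick the action with the highest expected value."""
    return max(actions, key=expected_value)

actions = [
    {"name": "study", "outcomes": [(0.6, 10), (0.4, -2)]},  # EV = 5.2
    {"name": "sleep", "outcomes": [(0.9, 4), (0.1, 0)]},    # EV = 3.6
]

best = choose_action(actions)
print(best["name"])  # -> study
```

As catherineols notes in the reply above, a human literally running this algorithm over all possible actions would have problems; the sketch just makes the proposed meta-criterion concrete.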
David Chapman Retweeted David Chapman
Yes… the sound practical advice CFAR gives along the lines of "get enough sleep" is valuable, and "rational" in the sense of the upper-left quadrant here, but not "rational" in the sense of the upper-right quadrant, which is what I'm trying to capture. https://twitter.com/Meaningness/status/993546770515804160