“If your method requires infinite storage space and computation speed to use, the problem is not that finite beings aren’t sufficiently rational, it’s that your method doesn’t work.”
(@nostalgebraist re Bayes, but applies to many rationalist fantasies.) http://nostalgebraist.tumblr.com/post/161645122124/bayes-a-kinda-sorta-masterpost
Replying to @Meaningness @nostalgebraist
Worth noting that this was written before Eliezer's "Toolbox-thinking and Law-thinking"; several LW commenters seem to feel that the T-T & L-T post addresses the criticisms in this post (based on the comments at https://www.lesswrong.com/posts/gsQjde3qeZw36arYE/nostalgebraist-bayes-a-kinda-sorta-masterpost ).
Replying to @xuenay @nostalgebraist
It doesn’t seem to me that EY’s post addresses the issues at all. But, after much effort, I don’t understand his point, so I might be missing something.
Afaict, EY is saying "probability theory is a set of axioms; if you accept the axioms, then you have to accept their mathematical consequences." Which no one disputes. The question is: where do the axioms apply? Which he doesn't seem to address at all.