I guess what really bothers me about Yudkowsky is that he is infinitely committed to the question of what gains can be wrung out of small increases in map fidelity, and just as infinitely uninterested in how lossy a map can be while producing equivalent behavior.
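(To make that contrast concrete, here is a toy sketch of my own, not anything from the thread: a fine-grained "map" of a decision and a deliberately lossy one that nonetheless pick the same action on every scenario shown, so the extra fidelity never shows up in behavior. All names and numbers are invented for illustration.)

```python
# Hypothetical illustration: two "maps" of the same decision problem --
# one high-fidelity, one deliberately lossy -- that choose the same
# action everywhere, so the extra fidelity buys no behavioral difference.

ACTIONS = ["carry umbrella", "leave umbrella"]

def fine_grained_map(humidity: float, pressure_drop: float, cloud_cover: float) -> str:
    """High-fidelity map: estimates rain probability from several variables."""
    p_rain = 0.5 * humidity + 0.3 * pressure_drop + 0.2 * cloud_cover
    return ACTIONS[0] if p_rain > 0.5 else ACTIONS[1]

def lossy_map(humidity: float, pressure_drop: float, cloud_cover: float) -> str:
    """Lossy map: ignores most inputs, keeps one coarse rule of thumb."""
    return ACTIONS[0] if cloud_cover > 0.6 else ACTIONS[1]

if __name__ == "__main__":
    # A few scenarios on which the two maps agree about every choice:
    scenarios = [
        (0.9, 0.8, 0.9),   # stormy: both say carry
        (0.2, 0.1, 0.1),   # clear: both say leave
        (0.7, 0.6, 0.8),   # humid and overcast: both say carry
    ]
    for s in scenarios:
        assert fine_grained_map(*s) == lossy_map(*s)
    print("Same behavior from both maps on these scenarios.")
```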
This Tweet is unavailable.
Replying to @0K_ultra
He thinks rationality is a mark of superiority even when it produces effects indistinguishable from lesser rationality. And looking at history, you can't say that's an uncommon occurrence, so it's not an edge case; it's a wedge down the center.
People mostly do the best they can given their circumstances. To say "I could do better" and then qualify it with "and would be superior even if I couldn't" is odious.
This Tweet is unavailable.
Replying to @0K_ultra
I'm not saying there's no argument to be made for elegance or generalizability, just that it's gymnastic to occupy a context, actually or hypothetically, and imagine that the true test of your mettle comes from outside that context. You might as well believe The Last Starfighter.
I mean, aren't all or at least most optimizations domain-specific and conditional?