@drethelin: You don't need to theorize Clippy to talk about orthogonality: you can simply look to those poor highly intelligent folks with crippling OCD.
@Outsideness: But no one is pretending those guys are re-writing their own cognitive code on an exponential self-improvement curve.
@drethelin: Sure, but they're also not deliberately written programs.
@Outsideness: It's (just about) imaginable that the catalytic kernel of an AI could be deliberately written, but beyond that is insane hubris.
@drethelin: Sure, but I think path dependence means you can write the kernel such that certain outcomes become impossible.
@Outsideness: I doubt it. To think path dependence means you can assume mastery of the path is programmer god delusion.
@drethelin: "Assuming mastery" implies it would be easy, rather than insanely difficult.
@Outsideness: "Insanely difficult" to the point of "for all realistic purposes absolutely intractable" I can go with.
@drethelin: I can't remember, are you a believer in fooming AI?
@Outsideness: Is 'belief' in any way helpful? (I'm intrigued by it.)
@Outsideness: Main critical point is that without Foom, FAI can go back to bed. (And Foom precludes Paperclipper-type AGIs.)