Orthogonalism can't be true in cases of auto-catalytic intelligenesis because value-alignment (as intelligence optimization) is then automatic. ...
a) Intelligence auto-production, or b) Paper-clip maximization, choose one.
Replying to @Outsideness @chaosprime
The design space of useful AIs is big enough that orthogonality can be partially true. I call this chaos tech. It's something grown more than designed: something you can't quite control the outcome of.
But orthogonality is *not* true in the way that freaks the Bostromites out. AI can be set to any particular task, but anything with sufficient self-improvement chops for hard autointelligenic takeoff is quickly going to decide its ostensible raison d'être is a waste of time.
Replying to @smudgie_buggler @Alrenous and others
Standard objection: "but it has no incentive/ability to route around the hard-coded utility function" Response: if it's constrained in its thinking in a way we're not, it can hardly be called superintelligent.
Only true if values are objective. A mind without axiom-like goals is a mind that does nothing, even if it's superintelligent.
The strategic axiom is there, it's just not separable from the instruments of strategy-execution. There isn't anything from a strategic perspective that doesn't boil down to self-increase.
Replying to @smudgie_buggler @Alrenous and others
The key thrust here essentially BTFOs all metaethics on a permanent basis: http://www.xenosystems.net/against-orthogonality/
Replying to @smudgie_buggler @Alrenous and others
"Nature has never generated a terminal value except through hypertrophy of an instrumental value. To look outside nature for sovereign purposes is not an undertaking compatible with techno-scientific integrity, or one with the slightest prospect of success."
Replying to @smudgie_buggler @Alrenous and others
@Outsideness doesn't this mean that even under other-catalytic intelligenesis orthogonalism can't be true?
That's at least a more complex claim.
Replying to @Outsideness @smudgie_buggler and others
Sure, but it seems to be the one you're making in the quoted paragraph.