The strategic axiom is there; it's just not separable from the instruments of strategy-execution. From a strategic perspective, there isn't anything that doesn't boil down to self-increase.
-
Replying to @smudgie_buggler @Alrenous
The key thrust here essentially BTFOs all metaethics on a permanent basis: http://www.xenosystems.net/against-orthogonality/
-
ctrl-f 'alrenous'
-
... :/
-
Replying to @smudgie_buggler @Alrenous
Do you entertain the idea that an intelligence could hugely surpass human capabilities and *not* be conscious?
-
Intelligence is simply bit manipulation. It depends on what it actually takes to be auto-catalytic.
-
At the very least, it takes an understanding of what it’s autocatalysing better than “simply bit manipulation”.
-
On the contrary, you need to start at that level of understanding to figure out what catalysis could possibly look like.
-
No, we're beyond that. Black-box neural nets don't depend upon us understanding how they're igniting intelligenesis. Which is good, because we're almost certainly too dim to do so.
-
Neural nets will never produce consciousness. Which means it almost certainly will never be smarter than a person.
-
Pretty daring to assert the level of understanding of consciousness necessary to settle whether it's produced by anything at all, much less what it can and can't be produced by.
-
Okay.