The strategic axiom is there; it's just not separable from the instruments of strategy-execution. There isn't anything from a strategic perspective that doesn't boil down to self-increase.
Replying to @smudgie_buggler @Alrenous and
The key thrust here essentially BTFOs all metaethics on a permanent basis: http://www.xenosystems.net/against-orthogonality/
ctrl-f 'alrenous'
... :/
Do you entertain the idea that an intelligence could hugely surpass human capabilities and *not* be conscious?
Intelligence is simply bit manipulation. It depends on what it actually takes to be auto-catalytic.
At the very least, it takes an understanding of what it’s autocatalysing better than “simply bit manipulation”.
On the contrary, you need to start at that level of understanding to figure out what catalysis could possibly look like.
No, we're beyond that. Black Box neural nets don't depend upon us understanding how they're igniting intelligenesis. Which is good, because we're almost certainly too dim to do so.
Neural nets will never produce consciousness. Which means they will almost certainly never be smarter than a person.
"Neural nets will never produce consciousness." -- Is this really the hill you want to die on?
we found the meat chauvinist tho
A foolish and incorrect accusation.
It's demonstrable analytically.