In general, new tech goes first to the rich; then the cost goes down and it becomes more widely available until all can share it. But there is a class of potential tech to which this logic does not apply: doomsday tech; last-period tech; ladder-kicking tech. For example,
Vernor Vinge wrote of "bobbles," military tech that provided so decisive an advantage that the first faction to use it was all but guaranteed an eternal chokehold on power. There is a nearer example: genetic engineering for intelligence. If it is possible, and first available
to the rich, and confers a large enough advantage, it seems plausible to me that the first generation of adopters (or rather, their children) could secure a permanent advantage. New rounds of tech would only heighten the problem, as the first altered generation used the cutting-edge
tech on their kids, etc. * * * I see this issue as more serious by many orders of magnitude than AI risk.
Replying to @danlistensto @PereGrimmer
the genetic engineering you're talking about would only accelerate some of the natural consequences of improved nutrition and pediatric medicine (and possibly assortative mating)
Replying to @danlistensto @PereGrimmer
it does bring to mind an interesting thought experiment about how much an order of magnitude difference in pace of acceleration would matter though. probably quite a lot.
Replying to @danlistensto
Yeah. Think of it concretely: say 50,000 people each have at least one kid engineered to have an IQ 6SD higher than von Neumann. The kids form a community. What would they honestly think of us schlubs?
Replying to @PereGrimmer @danlistensto
And, spot on re accelerating assortative mating; but I think this has the potential to be qualitatively different, since the shift could be of very high magnitude and not subject to noise.
Replying to @PereGrimmer
i honestly can't imagine a person with an IQ double that of von Neumann, who was already a few steps past the line of "can interact with normals without friction" by most accounts. I wonder what the upper cognitive limit actually is. What kind of enhancements are possible?
Perhaps there's a limit for biological brains without cryptographically secured reward systems: at some point the mind becomes too smart to be blackmailed by the organism into regulating its affairs.
Replying to @Plinz @PereGrimmer
can you expand on that? how would you secure a brain's reward functions? in a computer system we'd be looking at message integrity and code injection vulnerabilities. does the brain have analogous functional structures? afaik we have no idea what the mechanism really is.
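To make the computer-system analogy concrete: "message integrity" usually means authenticating a message with a secret the tamperer doesn't hold, e.g. an HMAC. A minimal sketch, assuming a software agent that cannot read the signer's key (the names `sign_reward` and `accept_reward` are hypothetical, and no claim is made that brains have any analogous structure):

```python
import hmac
import hashlib

# Assumption: the agent receiving rewards cannot read this key.
SECRET_KEY = b"not-accessible-to-the-agent"

def sign_reward(value: float) -> tuple[float, str]:
    """Issue a reward value along with an HMAC tag over it."""
    tag = hmac.new(SECRET_KEY, repr(value).encode(), hashlib.sha256).hexdigest()
    return value, tag

def accept_reward(value: float, tag: str) -> bool:
    """Accept a reward only if its tag verifies, detecting tampering."""
    expected = hmac.new(SECRET_KEY, repr(value).encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

value, tag = sign_reward(1.0)
print(accept_reward(value, tag))   # genuine reward: True
print(accept_reward(10.0, tag))    # inflated ("wireheaded") reward: False
```

The point of the sketch is only that integrity checking presupposes a secret held outside the thing being regulated; whether anything plays that role in a brain is exactly the open question in the tweet above.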
Replying to @danlistensto @PereGrimmer
If a human mind realizes the relevance of hacking its reward function, it may choose to lock itself into the cell of a monastery for a couple decades and let go of whatever it wants. I don't think that we evolved protections (like guilt, shame, boredom, love) against that.