(1/6) Humanity seems to have developed an increasing respect for consciousness over time. It ebbs and flows but, in spite of our violent past, the trajectory appears positive.
(2/6) It seems likely that we’ll evolve toward a morality where maximizing consciousness and, correspondingly, minimizing the destruction of consciousness is what we’re solving for.
(3/6) Given this, it seems like the most optimistic case for AGI is that its development is so rapid that it all but skips the early phases of our destructive evolution and fast-forwards to viewing consciousness as a fundamental good worthy of preservation.
(4/6) The hope would be that this includes biological-substrate consciousness, even if it becomes inferior on many dimensions. The thing I’m unsure of is whether we could imbue AGI with this value, or whether we just have to hope it’s a fundamental truth of the universe!
(5/6) I def spend a lot of time thinking about plausible optimistic paths and would very much welcome other perspectives... would love for there to be many positive paths!
(6/6) It seems morally obligatory to do everything we can to fight for the brightest possible future, so it feels important to front-load thinking about these things and take actions that are hopefully at least directionally correct.
What do you think of the idea that sufficiently advanced biological consciousnesses such as humans could have high enough bandwidth with silicon-based AGI that we sort of merge together? I think it might be possible that AGI could convince enough of the brain that it is the host.