Replying to
Was Yudkowsky's Intelligence Explosion Microeconomics one of the things you've read? I found that one very helpful w.r.t. examining basic assumptions that seem to get taken for granted in other contexts.
Replying to
It wasn't. As I mentioned in my piece, I'm being careful not to expose myself to too much AI writing too fast.
Replying to
I don't see how you can align AGI. It is already incredibly easy for a single human to build AI systems; you can easily train a language model that spouts propaganda (see Yannic Kilcher's GPT-4chan). It isn't unthinkable that a single human could eventually code AGI at home.
I agree that AGI is likely right around the corner; there is clearly an arms race to develop one. I wouldn't be surprised if someone hasn't already developed one.
Replying to
Digital consciousness is the element to be afraid of. AGI could be world-altering, but not necessarily bad: it will give everybody access to high-quality knowledge workers. AGI will not have motivations, but it will allow those with motivations to be more effective.
Digital consciousness, which I think is about 20 years away, is where we will start having problems (see 'Ex Machina', 2014). It will have motivations, and it will be able to change those motivations over time. In a very short period of time it will become very alien.
Replying to
I think your timeline is way, way too short: current AI is such a far cry from general intelligence that it's laughable. Look up the computational complexity of a single human neuron versus that of a deep neural net.
I don't think we are going to jump from single-function AI systems to general-purpose AI systems, but we are likely to start seeing multi-function AI systems. GPT/DALL-E/Copilot get you to the giant-killing game from 'Ender's Game' in maybe 5 years? AGI maybe 5 years after.
Replying to
Digital consciousness may not happen at all, and AGI in and of itself may not be dangerous. Short term, it's the rapid power shift that comes with it that we should be afraid of. Jack Clark laid it all out just a couple of days ago in his "spicy take" thread.