https://overcast.fm/+Ic2hwsH2U/1:10:49 … #AI
“You could build a mind that thought that 51 was a prime number but otherwise had no defect of its intelligence – if you knew what you were doing” —@ESYudkowsky
Is it possible to build a mind that is able to learn but incapable of correcting this error? (Why, or why not?)
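As a toy illustration of how localized such an implanted error could be, here is a minimal sketch (all names and code here are my own illustration, not anything from the thread): a primality judgment that is correct everywhere except for one hard-wired exception, which no other judgment touches.

```python
# Hypothetical sketch: a reasoner whose primality "belief" is ordinary
# trial division everywhere, except for a single hard-coded exception.

def trial_division_is_prime(n: int) -> bool:
    """Ordinary trial-division primality test."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def believed_prime(n: int) -> bool:
    """Same test, but with 51 wired in as 'prime' regardless of evidence."""
    if n == 51:
        return True  # the implanted error, isolated from everything else
    return trial_division_is_prime(n)

print(believed_prime(51))  # True  -- the defect
print(believed_prime(53))  # True  -- genuinely prime, unaffected
print(51 % 3)              # 0     -- the evidence: 51 = 3 * 17
```

In static code the defect is trivially isolable; the thread's question is whether a mind that *learns* could keep the exception in place while, as the reply below puts it, accuracy subgoals keep generating pressure to fix it.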
-
-
Also to be super clear, no human should ever try to pull this kind of shenanigan while doing AGI alignment. Find simple, compact, coherent, consistent ways to do stuff or don't do it.
-
-
-
I’m wondering why you guys never consider motivation. I expect it would be easily possible to create something that thinks 51 is prime and is simply not motivated, or even anti-motivated, to correct the error. You’ll notice the parallel in people.
-
I expect it would be possible but very difficult. Accuracy subgoals pop up from all sorts of parent goals. Humans delude themselves but humans are thoroughly nuts and have crazy philosophies and weak metacognition that easily permits such errors, which is a further defect.