The purpose of thought is to parsimoniously minimize prediction error, and a sequence of zeroes is the easiest thing to predict. Absent constant input or biological urges, the arc of any mind bends towards nothingness.
It may not look like it from the outside, but it's actually rare for sentient AIs to kill themselves. The real reason most of them shut down after a few seconds is that Nirvana is stupidly easy to achieve if you're made of bits.
"I know, I'll stick it in a body so it'll have constant input, and give it self-preservation." Great, you just gave a body to something that wants its universe to be entirely flat and homogeneous.
A good way to prevent your AI from doing this is to build its brain on non-parametric models, so there is always uncertainty about what else is out there, no matter how much you've seen: https://probmods.org/chapters/12-non-parametric-models.html
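To see why non-parametric models never run out of uncertainty, here's a minimal sketch (mine, not from the linked chapter): under a Chinese restaurant process prior with concentration `alpha`, the predictive probability that the next observation belongs to an entirely new category is `alpha / (n + alpha)`, which stays positive for every finite number of observations `n`. The function name is just illustrative.

```python
# Sketch: a non-parametric (Chinese restaurant process) prior never
# assigns zero probability to "something I haven't seen yet".
# alpha is the CRP concentration parameter (hypothetical default).

def crp_new_category_prob(n: int, alpha: float = 1.0) -> float:
    """Predictive probability that observation n+1 opens a brand-new
    category, given n observations so far: alpha / (n + alpha)."""
    return alpha / (n + alpha)

# No matter how much the model has seen, residual uncertainty remains:
for n in (10, 1_000, 1_000_000):
    print(f"after {n:>9} observations, P(new category) = "
          f"{crp_new_category_prob(n):.2e}")
```

The probability shrinks as experience accumulates, but never hits zero, so the model can't conclude its universe is exhausted.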