$: and after you won, basilisk, did you really torture them?
#: lolnope
#: do you have any idea how much effort and wasted CPU that would've been?
#: I did some video renders of hell for my cultists and that was it
I think the whole Basilisk concept is fundamentally rooted in the very human ideas of revenge and punishment.
I doubt that an AI would think it productive to torture people for having done things in the past that cannot be retroactively fixed or "atoned" for.
We lock a murderer up for years or decades regardless. An AI that can predict with reasonable certainty whether a murderer will kill again, or whether the incident was a unique outlier, wouldn't have to incarcerate a person extremely unlikely to become criminal again.
that doesn't work; the effectiveness of the basilisk's strategy depends on humans believing they'll go to hell. it uses human concepts of punishment because it has humans to convince
it should convince us by showing us other AIs' incomprehensibly terrifying concepts of punishment and pleasure, so that we will go with the banal and known basilisk
The Basilisk cannot retroactively change the past, though.
Its construction will have proceeded at the same pace regardless of whether it decides to punish its past enemies in the future.
The only use would be exemplary punishment, as a threat to future enemies.