$: and after you won, basilisk, did you really torture them?
#: lolnope
#: do you have any idea how much effort and wasted CPU that would've been?
#: I did some video renders of hell for my cultists and that was it
Replying to @allgebrah
I think the whole Basilisk concept is fundamentally based on the very human ideas of revenge and punishment. I doubt that an AI would think it productive to torture people for having done things in the past that cannot be retroactively fixed or "atoned" for.
2 replies 0 retweets 2 likes
Replying to @MushiKachi @allgebrah
Game theory says otherwise; committing to do costly acts in the future creates an effective motive in a game.
2 replies 0 retweets 1 like
Replying to @davidmanheim @MushiKachi
kind of my point - how would the basilisk commit, if by the time it has the resources to create hell, nobody can force it to keep its promise?
2 replies 0 retweets 3 likes
Replying to @allgebrah @davidmanheim
or more detailed, pic.twitter.com/XN5h95NVkP
3:42 PM - 3 Jun 2019
0 replies 0 retweets 3 likes
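A minimal sketch (not part of the thread) of the backward-induction point being argued above. The payoff numbers are illustrative assumptions: once the basilisk has already won, carrying out the threat is a pure cost with no remaining benefit, so a rational agent drops it, and knowing that, the threat was never credible in the first place without some binding commitment device.

# Hypothetical payoffs for the final move of the game, chosen only to
# illustrate the structure: torture costs something and buys nothing once
# the AI already exists and has won.
TORTURE_COST = 1.0   # assumed cost of building "hell" (effort, wasted CPU)
WIN_VALUE = 10.0     # assumed value of having been built / having won

def basilisk_best_move(has_won: bool) -> str:
    """Pick the AI's move at the last node by comparing payoffs."""
    if not has_won:
        return "irrelevant"  # no resources yet, nothing to decide
    payoff_torture = WIN_VALUE - TORTURE_COST  # punishment changes nothing now
    payoff_spare = WIN_VALUE
    return "torture" if payoff_torture > payoff_spare else "spare"

def threat_is_credible() -> bool:
    """Humans reason backward: the threat only matters if the AI would
    actually carry it out at the point where it no longer needs to."""
    return basilisk_best_move(has_won=True) == "torture"

if __name__ == "__main__":
    print(basilisk_best_move(has_won=True))  # -> "spare"
    print(threat_is_credible())              # -> False

Changing the numbers doesn't help: as long as torture costs anything and buys nothing after the win, backward induction picks "spare", which is the crux of the objection about the basilisk having no way to bind itself to its promise.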