Way back, the extropians were discussing what the most evil thing ever would be. I think JBP got it well enough with "inflicting suffering for no purpose," but the extropians went into some detail. You've heard how Roko's Basilisk freaked Yudkowsky out?
To me, "what would a far-future vindictive AI do to a simulation/clone of me for my particular choices in life?" just comes across as too narcissistic to take seriously.
And that's disregarding any issues with the philosophical stances you'd have to hold for this to be a fear in the first place.