if your reaction to shorter AI timelines is to panic and freak out instead of taking more risks and chasing after your dreams then your problem isn’t AI timelines, it’s being crippled by fear
if you think we’re all gonna die in 10 years or whatever then you have an opportunity to seriously ask yourself: what would it take to make these 10 years meaningful? when the end comes what will you wish you were doing?
haha oh no writing this brought up some Feelings i gotta take care of this 😅
as usual i am just Talking to Myself on Twitter Dot Com
friendly reminder
Quote Tweet
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
qc you have more high potency serious emotional bangers than texas has too-large trucks
i wonder if some of that time spent "wasted" didn't turn out valuable w/ stuff whirring around in your subconscious
also while I couldn’t be more aisafetypilled I think most people who are sure we’re all gonna die when ai gets good haven’t actually thought it through, it’s just emotionally crippling fear masquerading as intellectual analysis
easier to imagine paperclip apocalypse than infinite horizon of novelty
If you substitute "incoming asteroid" for AI do you still endorse this tweet?
Why would that change it, assuming equal timelines and equal tractability? The less you think you can shift the probability of (AI or asteroid) x-risk, the less worthwhile trying seems vs e.g. saving people from malaria so they can enjoy whatever time is left.