Conversation

robert anton wilson uses the term "reality tunnel" to describe the particular perceptual reality any given person inhabits at any given time; one of the things i appreciated most about EEAAO was how vividly it depicted the experience of shifting between reality tunnels
a lot of my experience of twitter involves shifting reality tunnels at least a bit towards the tunnel of whatever tweet i'm reading, and being slightly to very jarred by how violent the transitions can be
Quote Tweet
this tweet is a joke. this tweet is a shower thought. this tweet is a casual utterance for friends. this tweet is a research thought for colleagues. this tweet is performance art. this tweet ain’t that serious. this tweet is incredibly serious. this tweet is a call to arms
different reality tunnels believe that wildly different things are true but, maybe more importantly, believe that wildly different things are *significant*. we can talk about e.g. the race reality tunnel, the gender reality tunnel, the climate change reality tunnel, etc.
reality tunnels are on my mind because the recent AI risk discourse has me shifting in and out of what you might call the "yudkowskian AI alarmism" tunnel (i don't have a better name for it) or maybe just the "bay area rationality / EA" tunnel, and i have to say i don't like it
at least at one point i found the standard yud arguments in favor of fast takeoff, singleton-AGI-means-everyone-dies scenarios pretty compelling. i haven't seriously reevaluated them; i mostly decided to stop thinking at all about AI risk after leaving the rationality community
i don't particularly like that i'm feeling forced to think about AI risk again to figure out whether my life plans make any sense. feels like i'm being sucked back into a worldview that i saw really mess up a lot of people in the bay including myself
Quote Tweet
it turns out sincerely believing the world is going to end really fucks people up and it really fucks group dynamics up. there is a reason end-of-the-world narratives are an attractor for unhealthy cults (a phrasing i want to explicitly push over just "cults")
in IFS terms i guess there's a part of me saying something like "you fool, you need to figure out what you really believe about this, almost nothing else matters right now in comparison" and i'm going "oh my god dude but i'm so fucking tired of being jerked around by AI risk"
Replying to
I got you bro. I recognize this talk. You are in the same space as someone who was a "prepper" or "gold bug" or "comet doomer". The truth is, there is always something just waiting to kill everything and you just have to get on with living while you can. The future is flux.
Don't waste more than a few months in the hole of pondering global civilization's mortality. It's not bad to consider sub-catastrophic changes and how they might affect career choices. Seeing the internet coming helped some people, but plenty who only caught on later did just fine too.
Replying to
Haven't read the essay yet, but my general take on AI safety/alignment is that people who don't think there's any risk at all don't grasp the situation/power, while the doomers make the classic doomer prediction mistake of running the simulation with current variables and not accounting for novel ones.
Sure, maybe there won't be novel variables that are positive/helpful and we really are doomed, but I also don't think we can hold a UN meeting to shut Pandora's Box and have everyone go back to basketweaving. Better to create a balance of power than to allow bad factions to form in a vacuum.