Eliezer Yudkowsky (Verified account)
@ESYudkowsky

Ours is the era of inadequate AI alignment theory. Any other facts about this era are relatively unimportant, but sometimes I tweet about them anyway.

Joined June 2014

Tweets

  • © 2018 Twitter
  • About
  • Help Center
  • Terms
  • Privacy policy
  • Cookies
  • Ads info

    1. Eliezer Yudkowsky Verified account @ESYudkowsky Mar 29

      https://xkcd.com/1968/  is #TheLastDerail in a nutshell. I can try to imagine hypotheses, but I'm not sure why Randall Munroe thinks this is a clever thing to say. If I wrote a fictional character saying this straight out, I'd be accused of writing a 2D straw character.

      5 replies 6 retweets 42 likes
    2. Kaj Sotala‏ @xuenay Mar 29
      Replying to @ESYudkowsky

      Not saying that I would agree with Munroe, but it seems pretty clear to me why he might think that "I'm more worried about a concrete risk that's looming right now than a long-term speculative one, let's focus on first getting through the urgent one" would be important to say.

      1 reply 0 retweets 5 likes
    3. Joscha Bach‏ @Plinz Mar 29
      Replying to @xuenay @ESYudkowsky

      These are really two different topics, one with high probability and moderate impact, and one with unknown probability and terminal impact. They should not be conflated despite both being somewhat related to AI.

      2 replies 0 retweets 6 likes
    4. Kaj Sotala‏ @xuenay Mar 29
      Replying to @Plinz @ESYudkowsky

      Also xkcd is humor not scholarly analysis: this would hardly be the first time that two unrelated things were conflated for the purpose of making a joke. :)

      1 reply 0 retweets 3 likes
    5. Joscha Bach‏ @Plinz Mar 29
      Replying to @xuenay @ESYudkowsky

      xkcd is generally more interested in insight than in humor, which makes this cartoon so perplexing. And Randall Munroe is not exactly a normie whisperer trying to deliver the conclusion that is in highest demand.

      1 reply 0 retweets 3 likes
    6. Kaj Sotala‏ @xuenay Mar 29
      Replying to @Plinz @ESYudkowsky

      I think the "stupid normie status signaling" hypothesis is uncharitable and wrong. I think it's totally reasonable for smart geeks to be mainly worried about dystopian scenarios brought by ML, and to find AGI concerns a silly focus in comparison (again, not that I'd agree, but).

      2 replies 0 retweets 16 likes
    7. Miles Brundage‏ @Miles_Brundage Mar 29
      Replying to @xuenay @Plinz @ESYudkowsky

      +1. There are many commonly held clusters of views on things like timelines, the usefulness of present safety work, and the scale of misuse risks that could justify this conclusion without incoherence, even if they're wrong.

      3 replies 0 retweets 2 likes
      Eliezer Yudkowsky Verified account @ESYudkowsky Mar 30
      Replying to @Miles_Brundage @xuenay @Plinz

      You're being too charitable, which is also a bias. This wasn't a neutral "of these two risks, here's what I think their relative probabilities are", it was an obvious putdown of people who visibly care about the second risk.

      12:24 AM - 30 Mar 2018
      4 replies 0 retweets 5 likes
        1. New conversation
        2. Miles Brundage‏ @Miles_Brundage Mar 30
          Replying to @ESYudkowsky @xuenay @Plinz

          I am not sure if we disagree/if so, what about. Yes, it was non-neutral and a putdown.

          1 reply 0 retweets 2 likes
        3. Eliezer Yudkowsky‏Verified account @ESYudkowsky Mar 30
          Replying to @Miles_Brundage @xuenay @Plinz

          Sorry, the part about excessive charitableness was to @xuenay above you.

          0 replies 0 retweets 1 like
        4. End of conversation
        1. New conversation
        2. Catherine Olsson‏ @catherineols Mar 30
          Replying to @ESYudkowsky @Miles_Brundage and

          Out of curiosity, do you estimate that risk 2 is worsened by xkcd applying social pressure to folks who are panicking about risk 2 but haven't considered risk 1? (not that I think RM is somehow intentionally being strategic. just wondering.)

          1 reply 0 retweets 0 likes
        3. Eliezer Yudkowsky‏Verified account @ESYudkowsky Mar 30
          Replying to @catherineols @Miles_Brundage and

          Risk 2 is worsened by people who think they can gain status by putting down the act of talking about it. Risk 2 is also worsened by people who are thinking panicked instead of computer-sciencey thoughts about it. Not considering Risk 1 is fine; it's unrelated to Risk 2.

          0 replies 0 retweets 1 like
        4. End of conversation
        1. New conversation
        2. Kaj Sotala‏ @xuenay Mar 30
          Replying to @ESYudkowsky @Miles_Brundage @Plinz

          I didn't say it was neutral or that it wasn't a putdown. I said that having the view that "AGI gets too much attention and is worth a putdown" is a view I can see a smart and reasonable person arriving at given certain knowledge, even without having a desire to pander to normies

          1 reply 0 retweets 1 like
        3. Kaj Sotala‏ @xuenay Mar 30
          Replying to @xuenay @ESYudkowsky and

          That is, on whether the comic was produced 1) mostly by tribal signaling and no actual thought, or 2) mostly by a thought-out position. (to be clear, it *could* have been 1 too, but assuming that to be the *obvious* explanation seems wrong)

          0 replies 0 retweets 1 like
        4. End of conversation
        1. New conversation
        2. Petr Baudis‏ @xpasky Mar 30
          Replying to @ESYudkowsky @Miles_Brundage and

          If you call "Other people worry about long-term AI risks, I worry about mid-term AI risks" a putdown of those other people, you are being a bit oversensitive. (FTR, I also think the ratio of effort towards these risks is not ideal.)

          1 reply 0 retweets 0 likes
        3. Miles Brundage‏ @Miles_Brundage Mar 30
          Replying to @xpasky @ESYudkowsky and

          I can't prove that it's intended as a putdown obvi but the use of terms like "seem to worry about" suggests to me he doesn't view it as a legit concern. It *seems* dismissive to me :)

          2 replies 0 retweets 0 likes
        4. Petr Baudis‏ @xpasky Mar 30
          Replying to @Miles_Brundage @ESYudkowsky and

          Ok, that *seems* fair :) though that signal is pretty weak. I understand how frustrating it is for people who are 100% invested in the long-term risk when someone distracts from that goal. Just to let you know that we may also be acting rationally; we just assess things differently.

          0 replies 0 retweets 1 like
        5. End of conversation

