Eliezer Yudkowsky (Verified account)
@ESYudkowsky

Ours is the era of inadequate AI alignment theory. Any other facts about this era are relatively unimportant, but sometimes I tweet about them anyway.

Joined June 2014

Tweets
    1. Eliezer Yudkowsky (Verified account) @ESYudkowsky Jan 21

      I think "AIs make paperclips" has probably obtained more audience than all of my other conceptual originations combined. I guess it's not very surprising that it's an invention that had the potential for easy misunderstanding, and that it's the misunderstanding that spread.

      13 replies 22 retweets 92 likes
      Show this thread
    2. Alexander Davis @ADeebus Jan 21
      Replying to @ESYudkowsky

      As I understand it, it's a warning about how an AI will maximize its values and that those values are unlikely to coincidentally align w/ human values. Thus, example of seemingly harmless paperclip imperative that runs amok when backed w/ enormous intelligence. Roughly accurate?

      1 reply 0 retweets 1 like
    3. Mary Olivia @amalgamary Jan 22
      Replying to @ADeebus @ESYudkowsky

      The values of the AI in these scenarios are whatever we tell it to optimize for - it's just that if we don't take the end result of such optimization into account, we will give the AI priorities that are not, ultimately, in line with well-rounded human flourishing.

      2 replies 0 retweets 1 like
    4. Eliezer Yudkowsky (Verified account) @ESYudkowsky Jan 22
      Replying to @amalgamary @ADeebus

      You only get to fail at that point if you have succeeded on the much earlier problem of having the AGI's optimization target bear any resemblance whatsoever to what you hoped you were targeting.

      1 reply 0 retweets 1 like
    5. Alexander Davis @ADeebus Jan 22
      Replying to @ESYudkowsky @amalgamary

      Seems easier from a layman's perspective. Won't the early work on such an AGI's predecessors ensure that it's hitting some sort of benchmark, as a measure of progress, e.g. AlphaGo's win/loss ratio? If AlphaGo were generalized, its values would presumably be go-related, right? No?

      2 replies 0 retweets 0 likes
    6. Eliezer Yudkowsky (Verified account) @ESYudkowsky Jan 22
      Replying to @ADeebus @amalgamary

      That's as likely to happen as natural selection producing humans who exclusively and explicitly target inclusive genetic fitness. Hill-climbing with X as a fitness function does not scale to produce smart consequentialists targeting X as a goal.

      1 reply 0 retweets 2 likes
    7. Alexander Davis @ADeebus Jan 22
      Replying to @ESYudkowsky @amalgamary

      Interesting. I would have thought your latter statement was obviously wrong, which of necessity recommends a huge update in my beliefs (which must now be set at: "I don't understand this domain very well and am very bad at predicting things in it.")

      1 reply 0 retweets 1 like
    8. Eliezer Yudkowsky (Verified account) @ESYudkowsky Jan 22
      Replying to @ADeebus @amalgamary

      Are you sure you've understood the statement that you think you'd have considered obviously wrong? Like, is the evolution of humans now a clear example and demonstration of the statement that sounded obviously wrong?

      1 reply 0 retweets 0 likes
    9. Alexander Davis @ADeebus Jan 22
      Replying to @ESYudkowsky @amalgamary

      I think so? Perhaps you'd help: if I understand, you're saying that a species moving incremental updates towards a set goal (either via evolution or programming) is unlikely to yield higher-order thinking that still moves towards that goal in a predictable way? Like how rock 1/2

      1 reply 0 retweets 0 likes
    10. Alexander Davis @ADeebus Jan 22
      Replying to @ADeebus @ESYudkowsky @amalgamary

      crystals may have formed larger and more complex structures, and yet once they got complex enough to (in one theory) evolve into living cells, their replication no longer tracked with previously inferred goals of "make bigger rock crystals"? 2/2

      1 reply 0 retweets 0 likes
    11. Eliezer Yudkowsky (Verified account) @ESYudkowsky Jan 22
      Replying to @ADeebus @amalgamary

      I'm not sure that's a good example? They were unintelligent systems reproducing by other means. Humans, who do have explicit desires, have no explicit desire to replicate DNA. External selection on X doesn't produce internal explicit quoted goals of X.

      9:14 AM - 22 Jan 2018
      1 reply 0 retweets 2 likes
    12. Alexander Davis @ADeebus Jan 22
      Replying to @ESYudkowsky @amalgamary

      Makes sense. I was trying to come up with something that was far enough away to not merely be aping the previous examples, but it ended up more as an analogy. But I do think I understand this thread, at least. And so I'm confident in my lack of confidence in this domain. Thanks!

      0 replies 0 retweets 2 likes
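
The exchange in tweets 6 and 11 turns on one technical claim: hill-climbing on a fitness function X selects for whatever behavior scores well on X under the selection regime, not for agents that internally represent X as a goal. The Python sketch below is my own toy illustration of that claim, not anything from the thread; the "sweetness vs. calories" environment and every name in it are hypothetical. An agent that can only observe sweetness is hill-climbed on calories consumed; because the two are correlated during selection, the winning agent ends up preferring sweetness, and when the correlation breaks after deployment its score on the original criterion collapses.

    import random

    def make_food(rng, ancestral=True):
        # A food item is (sweetness, calories). In the "ancestral" regime the
        # two are perfectly correlated; in the "deployed" regime, very sweet
        # zero-calorie items exist, decoupling the observable proxy from the
        # quantity that selection actually rewarded.
        if ancestral:
            c = rng.random()
            return (c, c)
        sweet = rng.random()
        calories = 0.0 if sweet > 0.5 else sweet
        return (sweet, calories)

    def fitness(pref, ancestral=True, meals=200):
        # X, the external selection criterion: total calories consumed.
        # The agent only ever observes sweetness, so its whole "policy" is a
        # single preference weight over sweetness; it never sees calories.
        rng = random.Random(0)  # fixed food distribution, fair comparisons
        total = 0.0
        for _ in range(meals):
            foods = [make_food(rng, ancestral) for _ in range(5)]
            chosen = max(foods, key=lambda f: pref * f[0])
            total += chosen[1]
        return total

    # Hill-climb on X: mutate the preference weight, keep strict improvements.
    mutate = random.Random(42)
    pref = 0.0
    for _ in range(100):
        candidate = pref + mutate.gauss(0, 0.3)
        if fitness(candidate) > fitness(pref):
            pref = candidate

    print(f"evolved sweetness preference: {pref:+.2f}")
    print(f"calories in ancestral regime: {fitness(pref):.1f}")
    print(f"calories after deployment:    {fitness(pref, ancestral=False):.1f}")
    # Selection acted on calories (X), but the artifact it produced pursues
    # sweetness, a proxy correlated with X during selection. When the regime
    # shifts, behavior keeps optimizing the proxy and the score on X
    # collapses: external selection on X did not yield an agent whose
    # internal goal is X.

Running it, the evolved agent scores well in the ancestral regime and near zero after deployment, mirroring the thread's own example: natural selection optimized humans for inclusive genetic fitness and produced minds with no explicit desire to replicate DNA.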
