Search results
  1. Conditional on AGI extinguishing humanity, lost jobs hardly matter. So this argument only makes sense if you think AGI is impossible or alignment is easy; in that case, you should just argue that claim, since the part about jobs is irrelevant.

  2. "Lol, status-as-ontological-weight? I guess that explains . It's not that they're trying to distract you from the apocalypse by pointing to inequality; they think they're lowering the probability of AGI ruin by performatively lowering its status."

  3. Another entry in the "But why worry about absolute and final ruin if self-driving cars?" campaign

  4. 11 Apr 2018

    We can recover from tyrannical world governments and AI-assisted war. We cannot recover from unaligned ASI. It's that simple.

  5. 24 Jul 2018

    For context, I have a growing suspicion that less-than-full-fidelity memetic spread of "OMG, AGI" has causally brought humanity closer to building unsafe AGI sooner - Legg, Musk, etc. I don't claim that deployers of are being *strategic*, but they might be helping.

  6. 31 Mar 2018

    Notice how all the headlines in the tag start with the word "forget"? It's almost as though there's something they don't want you to think about.

  7. Jan 14

    Possibly derived from this Ted Chiang piece in Buzzfeed where it’s more explicitly combined with .

  8. Jan 5

    Responding to (the very eminent) Rodney Brooks does not mean I'm not responding to anyone else. And the fact that top names with big resources are resorting to and is object-level news about AGI safety, not just about arguments about AGI safety.

  9. 27 Jul 2018
  10. Jan 5

    (Note: Brooks did not use .)

  11. 13 Mar 2018
  12. I wonder if there's an actual PR campaign behind (never mind that apocalypse, whatabout corporations!)

  13. is in a nutshell. I can try to imagine hypotheses, but I'm not sure why Randall Munroe thinks this is a clever thing to say. If I wrote a fictional character saying this straight out, I'd be accused of writing a 2D straw character.

  14. Very sensible. But the percentages are irrelevant if there is in fact a category error at the root of the comparison. Admittedly this is an "empiricist or rationalist?" thing, but the "billions die" position is purely conjectural. position is not.

  15. Jan 14

    Silicon Valley Is Turning Into Its Own Worst Fear via

  16. 28 Aug 2017

    Here's a link about posted last week. See the rest at
