"It's close" is debatable (5 years or 50?), but this article doesn't address the danger at all. https://twitter.com/ToKTeacher/status/602260717522194433
@KrishanBhattach Elon Musk, Eliezer Yudkowsky, and many others.
@SamHarrisOrg @KrishanBhattach (Hawking is worried about alien life for the same reason, incidentally. It's the same mistake.)
[Tweet unavailable]
@KrishanBhattach @SamHarrisOrg AGI dangers are the same as "other people" dangers. Concerns about clock speeds, etc. are just misunderstandings.
@ToKTeacher @KrishanBhattach @SamHarrisOrg An AGI would be fundamentally *not human*. To think of it as "people" is nonsensical.
@ToKTeacher @KrishanBhattach @SamHarrisOrg There's every reason not to expect any kind of human values, expectations, or goals in an AGI.
@imperfectidea Human values progress towards objective reality. An AGI would discover them, just as it would discover physics.
[11 more replies]
New conversation
@SamHarrisOrg @DavidDeutschOxf "Like building skyscrapers ever taller in the expectation one day they will fly"
@SamHarrisOrg I think you're overestimating the extent to which the philosophical problems @DavidDeutschOxf raises there are even considered by AGI people.