Longtermists argue that the astronomically large number of possible future people makes their potential suffering outweigh that of every being alive today.
In saying this, they create a utility monster that reaches back retrotemporally, shaping the present toward its own creation. A hyperstition.
For example, creating a superintelligent AI is an existential risk, because it could accidentally or deliberately extinguish our species. Thus, no matter what value we ascribe to this AI, it cannot outweigh the near-infinite future lives that might be lost by its creation.
Other catastrophes will not make us go extinct, but will cause great suffering in the present and near future: climate change, ecosystem collapse, nuclear war. These are discounted by longtermists, because whatever fraction of humanity survives will still grow to near-infinite numbers in time.
Since future humans are so valuable in this philosophy, and since this philosophy is popular among the most powerful people in the world, human suffering will continue to rise as the sheer number of humans is maximized.
That is why I'm proposing we kill the future.
This is a special use of "longtermist."
Weird LW rationalists doing weird Pascal's-wager eschatology do not and should not have a monopoly on the very idea of long-term thinking. This annoys me the same way IDEO people claiming "design thinking" does.