Conversation

Paraphrased from a thread: "Most people think AI is good. If you're in the top 1% of smart people, though, you know it's bad." In other words: if you think AI is good, you're not top-1% smart. This kind of statement is an example of "cult" behavior because it pressures you into agreeing. Big turnoff.
"How can we make normal people understand existential AI risk? It's so complicated to explain because the concepts are difficult" is another example of this. Implication is that we're right, how do we get dumb people to understand. This is a pattern that manipulative people use
Replying to
Say that you truly believed some new technology really was an incredibly dangerous risk to our world, but the evidence that you found extremely convincing didn’t convince others. How would you go about trying to prevent that danger from coming to pass?
Replying to
I think a good approach would be to use the "leader without coercive power" methods of pastors, political advocates, and some politicians. Stuff like follow-then-lead, similarity-based persuasion, prestige.
Replying to and
I mean, I understand the current approach comes from a "play to your outs," variance-increasing strategy, and it's an elites-focused strategy. But I don't think it's the most effective one, because it asks too much of people.
Replying to
It seems you're accusing the people you disagree with of bad-faith, manipulative tactics. Feels sort of similar to assuming that the people who disagree with you are stupid, no? What if we all stopped claiming that our opponents were engaging in bad faith without evidence?
Replying to and
It's more like, say, Jehovah's Witnesses. JWs truly believe that non-believers will suffer an awful fate. Therefore, they will use the most effective tactics to convert people. To them, manipulation is ethical because it's for the converts' own good.
Replying to
Manipulating others “for their own good” is a move made in bad faith, because if an argument is true you can advance it solely on that basis. So yeah, you’re accusing AI Xrisk evangelists of arguing in bad faith right now.
Replying to and
They seem to genuinely believe I'm at risk and want to save me, though. Would you call preaching hellfire to convert people through fear bad faith? How about yelling "bomb" in a crowd after spotting a suspicious backpack? I'd just call it misplaced confidence and bad judgment.
Replying to and
This is basically my view: they genuinely believe we're at risk of extinction from AI, so they're happy to use the most effective persuasion methods (in their view). That doesn't involve any bad faith.
Replying to and
Yeah, bad faith would be if they were making an argument they themselves didn't believe in order to exploit me in some unrelated way, like making me work for them. Tom Sawyer's whitewash-the-fence argument is the archetype.