I think I have to read his first book to see what he proposes. One problem seems to be that our religions are failing to keep societies together. They also seem insufficiently sophisticated and don't provide adequate programming of humans for a healthy existence in a modern society. /3
However, current religions have survived in harsh environments for a very long time and have spread to billions of brains, so evolving existing religions might be much easier than replacing them with something different. It might not be perfect, but it could be the only feasible way. 4/4
Replying to @pavel23
The reason why the existing religions are no longer tools for organizing social order is that they have already been replaced with something more successful. Religious order kept us stable after the fall of the Roman Empire, but always below 400M people. I don't think we can go back.
Religions have issues: they destroy epistemology and impair rationalism; they are either totalitarian or toothless when it comes to enforcing order; their administration needs to be opaque (corruption!); and they establish acceptance criteria under which not the nicest religion might win.
Conversely, rationalism allows us to define universal criteria for truth, and norms based on consequentialism, with a society that can in principle openly negotiate about preferred consequences, especially once you build an independent death star that incentivizes against violence.
Replying to @Plinz
The problem is, people aren't very good at being rational. We are not Vulcans. We use our higher brain functions mainly to rationalize our emotional decisions. And I have doubts about universal truth, given the mess with all those variable meaning functions in the brain. /1
Replying to @pavel23
No, some people are very good at being rational, and other people are very good at accepting programming, and while there is some movement between the groups, they seem to be quite distinct.
Replying to @Plinz
You said in one of your CCC talks that our brain states evolve in landscapes where we might find ourselves in different valleys. Rationality, too, is not a well-defined concept. Real people are not utility-maximizing agents, and couldn't agree on a utility function if they were.
Replying to @pavel23
Rationality by itself does not imply values, other than a pragmatically justified commitment to truth for the process itself. But once we make our preferences explicit, we can use rationality to negotiate.
Replying to @Plinz
I am afraid we have to establish a common meaning for what you mean by "truth for process". Absence of intentional deception? Best effort? Peterson speaks about how hard and tedious it can be for people to find out what they really want and need. People's minds are messy.
Truth for process mostly means absence of confusion and self-deception. Disassemble the null hypothesis; it is usually a complex conspiracy theory that you share with your peers. No motivated reasoning, no faith, no pseudo-agnosticism where you could instead quantify confidence.