I think I have to read his first book to see what he proposes. One problem seems to be that our religions are failing to keep societies together. They also don't seem sophisticated enough, and don't provide adequate programming of humans for a healthy existence in a modern society. /3
However, current religions have survived in harsh environments for a very long time and spread to billions of brains, so evolving existing religions might be much easier than replacing them with something different. It might not be perfect, but could be the only feasible way. 4/4
Replying to @pavel23
The reason why the existing religions are no longer tools for organizing social order is that they have already been replaced with something more successful. Religious order kept us stable after the fall of the Roman Empire, but always below 400M people. I don't think we can go back.
Religions have issues: they destroy epistemology and impair rationalism, they are either totalitarian or toothless when it comes to implementing order, their administration needs to be opaque (corruption!), and they establish acceptance criteria under which not the nicest religion might win.
Conversely, rationalism allows us to define universal criteria for truth, and norms based on consequentialism, with a society that can in principle openly negotiate about preferred consequences, especially once you build an independent death star that incentivizes against violence.
Replying to @Plinz
The problem is, people aren't very good at being rational. We are not Vulcans. We use our higher brain functions mainly to rationalize our emotional decisions. And I have doubts about universal truth, given the mess with all those variable meaning functions in the brain. /1
Replying to @pavel23
No, some people are very good at being rational, and other people are very good at accepting programming, and while there is some movement between the two groups, they seem to be quite distinct.
Replying to @Plinz
You said in one of your CCC talks that our brain states evolve in landscapes where we might find ourselves in different valleys. Rationality is also not a well-defined concept. Real people are not utility-maximizing agents, and couldn't agree on a utility function if they were.
Replying to @pavel23
Rationality does not by itself imply values, other than a pragmatically justified commitment to truth for the process itself. But once we make our preferences explicit, we can use rationality to negotiate.
Replying to @Plinz
In the largest philosophical encyclopedia I know ( https://de.wikipedia.org/wiki/Historisches_W%C3%B6rterbuch_der_Philosophie ) I once looked up "Wahrheit" (truth), and it had the longest entry of all terms in those twelve volumes. The German and English Wikipedia articles alone show how multi-faceted the concept of truth is.
It is quite simple, really. No belief without priors. Confidence in a belief = the weight of evidence supporting it. Once you have that basic epistemological principle, you can derive Bayesianism, information theory, game theory, and even the theory of universal learning = Strong AI.
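A minimal sketch of that updating principle in Python (not from the thread; the binary hypothesis, the helper bayes_update, and its numbers are illustrative assumptions):

# Confidence in a belief modeled as a posterior probability that is updated
# by the weight of new evidence, via Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E).
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H|E) for a binary hypothesis H after observing evidence E."""
    p_evidence = p_evidence_given_h * prior + p_evidence_given_not_h * (1.0 - prior)
    return p_evidence_given_h * prior / p_evidence

# Illustrative numbers: start from a weak prior and let confidence grow
# as independent pieces of mildly supporting evidence accumulate.
belief = 0.1
for _ in range(3):
    belief = bayes_update(belief, p_evidence_given_h=0.8, p_evidence_given_not_h=0.3)
print(round(belief, 3))  # ~0.678 after three updates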