If morality is no more than a sense of what you ought to do that evolved due to selection pressures in past environments, and it isn't actually what you ought to do, then there is no reason for you to obey it. Upon what basis do you condemn the rational nihilist who ignores it?
The optimal human morality would be the system which maximises human wellbeing and minimises human suffering, because this is what all humans want for themselves, and it is extended to all humans for that reason too. We can disagree and go our own way within that on how to live.
-
-
There is no more to it than this. You can ask how I justify this rationally to someone who doesn't want to do it, and if this requires something outside shared human needs and moral foundations to justify it, I can't. This is a human morality expanded to all humans.
-
Perhaps I'm reading you incorrectly, but it appears that you are conceding that empirical reasoning alone is insufficient to provide a rational argument as to why people ought to expand their circle of empathy to all humans. Do I have that right?
-
??? I really can't explain any better than that, I'm afraid. If you still don't understand what I mean, saying it all over again is unlikely to help. I am speaking to what the optimum human morality is, which SH explains by saying it is akin to an optimum human diet.
-
You're not following me for the reason religious people generally don't. You'd first need to accept, at least for the sake of argument, that morality is not something humans seek outside themselves but a quality of us, one that we can understand as a whole load of 'is's and get right.
-
I am perfectly willing to entertain the idea. What I can't understand is how all those 'is's can possibly provide a rational argument to follow moral precepts that might, on occasion, be against our individual self-interest.
-
It's the same as how all these 'is's can provide a rational argument for eating an optimal diet that might, on occasion, be against our preferences. These are different things - what is optimal and what we want to do.
-
If we could programme all the 'is's that pertain to human morality into a computer, and it could calculate the optimum human morality in any given situation, this still won't wash with someone who isn't thinking morally but selfishly. Those would be different calculations.
End of conversation