People who influenced my thoughts here:
@rsnous for putting this question in my head in the first place
@ctbeiser for a concrete argument against: https://web.archive.org/web/20170916053902/http://cbeiser.me/on-time-free-theories-of-decision-making/
@gwern for a tangential argument that sunk cost fallacies specifically might not apply to individuals
-
Just going to throw out some ideas. Well-studied violations of rationality may pop up regularly for individuals, but most of the research is about truth-seeking and being more correct. There may be very few cases where a clear cognitive bias can be squashed while making personal decisions.
-
Let's say that LW-rationality-addressable problems do pop up in individual decision making. It could be that intuitive gut feelings integrate your values and priorities well. Overriding gut feelings in order to excise a bias you identified might throw you off more than it fixes.
-
Maybe learning to identify bias *is* helpful for people but the folks gravitating to LW and reading about biases for fun are the least likely to benefit from this because their thinking might be more rigorous than average already.
-
Maybe we are worse at achieving goals if we remove our own bias in favor of correctness. Convincing yourself that your new company is a guaranteed success probably makes you a more convincing leader. Your politics are probably more convincing if you give in to confirmation bias.
-
FWIW, I do think LW rationality is helpful. It probably helps with decisions, and at the very least I'd bet it is helpful in the way that reading a book that gives you a different perspective on the world is helpful. Mainly it's just fun to try doing a premortem for why it might not help. /cc @juliagalef
End of conversation
New conversation
Possible answer: it's helpful to de-bias in a deliberative mindset, but very harmful to de-bias in an implemental mindset. Few people can separate these mindsets. (Those who can make great leaders.)
-
Not familiar with the terms, so I looked them up. Basically, are you saying de-biasing is good for figuring out whether to adopt a goal but harmful when planning the steps for achieving it? Is that a correct read?
-
Sorry for not being clear! When we're figuring out our goals and how to reach them, it's helpful to de-bias, to be more accurate. Once we figure out our goals and process, we switch to implementation mindset. De-biasing is harmful here because it tends to reduce overconfidence.
-
Forgot to mention that implementation mindset is where we actually work towards our goal. Overconfidence is useful because it improves the chances of success. I first came across deliberative/implemental mindset in Phil Rosenzweig's book "Left Brain, Right Stuff".
-
Cool, thanks for the clarification. Interesting that you're optimizing for preserving overconfidence. Sounds like what I mentioned here: https://twitter.com/backus/status/987223845575639040 I might also argue that you'd just waste time trying to debias each step vs. just executing.
-
That's a good view! We should debias when planning things, but be biased when executing things. There is some spillover between these two mindsets, but the spillover can be reduced with practice.
End of conversation
New conversation
This is one reason why CFAR focuses on practice over theory. Accruing *declarative* knowledge about "good decision-making" will not give you the *procedural* knowledge of how to make good decisions in your daily life.
-
I also want to echo @juliagalef's point (while trying my hardest not to veer into "no true Scotsman") that most LW rationality explicitly distances itself from "squash your FEELINGS using LOGIC!" and puts a huge weight on integrating all sources of data, *especially* gut feelings.
-
Yeah, I'm not even bothered when people make this assumption, because ignoring gut feelings IS a common mistake people make when trying to "be more rational". Critics aren't crazy to complain about it. It's just that LWers are much more aware of this mistake than most.
End of conversation
New conversation
It doesn’t have to be a premise; it can just be a way to explain the world and promote improving it with larger-scale actions, akin to economics (economists aren’t in it to improve their business acumen).
-
Maybe it would be helpful to talk about specific fallacies or errors and discuss whether you get mileage out of them. I have thought of the fundamental attribution error when counseling friends on (relation|friend)ship issues; basically it just reminds me to be more empathetic.
-
Bayes, too. E.g. “it’s way more likely to be a bug in my code than a bug in the library.”
-
Maybe a problem with determining whether some specific rationality tool is useful is that you're encouraged to internalize the ideas. So you don't think "Oh, per Bayes' theorem, the prior on a bug in my own code is higher, so it's probably that", you just intuit it without crediting Bayes.
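The "bug in my code vs. bug in the library" intuition above can be made concrete with a toy Bayes calculation. This is a minimal sketch with made-up numbers, assuming a simple two-hypothesis model (the bug is either in my code or in the library); the function name and priors are illustrative, not from the thread.

```python
def posterior(prior_mine, prior_lib, lik_mine, lik_lib):
    """Posterior probability the bug is in my own code, given an
    observed failure, via Bayes' theorem over two hypotheses."""
    joint_mine = prior_mine * lik_mine  # P(mine) * P(failure | mine)
    joint_lib = prior_lib * lik_lib     # P(lib)  * P(failure | lib)
    return joint_mine / (joint_mine + joint_lib)

# Prior: a widely used library is heavily tested, so most bugs we hit
# in practice live in our own application code. If the observed failure
# is equally likely under either hypothesis, the prior dominates:
p = posterior(prior_mine=0.95, prior_lib=0.05,
              lik_mine=0.5, lik_lib=0.5)
print(round(p, 2))  # 0.95
```

With equal likelihoods the posterior just equals the prior, which is the point of the tweet: the prior alone already tells you to check your own code first.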
End of conversation