My sense of 'being able to navigate the world accurately' has been less 'learning facts about the world', like I predicted, and more 'learning how to accurately gauge the trustworthiness of experts.'
Turns out, experts are wrong, a lot! But how do you know when they're wrong, if you're not an expert? If you learn a fact that seems to contradict an expert, how do you know whether it's important, or whether you're simply too ignorant to see why that fact is irrelevant?
It's hard to figure this out, but lately I pay attention to a few things:
1. If an expert is talking about something outside their field, they're much more likely to be wrong, and less likely to be called on it, because their confidence and authority reassure everyone.
2. If an expert is subject to incentives: anything ideological, in any direction, is strong pressure not to think accurately. So for example, "professor of feminist studies at UC Berkeley" or "famous QAnon ex-CIA YouTuber"? Be more suspicious.
3. If what they say simply doesn't make sense to you. This is a bit vague, because a lot of experts are right even if you don't understand them. But if you're trying, really hard, to genuinely understand, and you're not ideologically opposed, but something still seems off? Be sus.
4. If becoming an expert in their field relies heavily on memorization instead of paradigm-challenging problem solving. Lots of credentials are just meant to show you know the existing information well, and don't mean you're good at making accurate predictions.
Expertise is very often a fuzzy veneer of authority that we bestow on a select few for being able to pass the right tests. And being able to pass tests is good, but we still need to be informed and discerning in how we choose to give out our trust.
Friendly amendment: Since a *very* large fraction of experts will have *some* imaginable incentives inside of their field, don't consider that a disqualifier per se, but pay additional attention to experts who have visibly gone against those incentives on some past occasion.
Eg:
No: "Pay no attention to what ML experts say about AGI risk - they have incentive to underplay AI risks!"
No: "Pay no attention to MIRI - they have incentive to overplay risk!"
Yes: "So Eliezer, unlike other 'AI risk' types, doesn't warn of AI unemployment? Interesting."
Is there anyone who's not subject to this, though? Even trying to be "neutral" can act as its own incentive to avoid information that might lead you to condone a specific position.
Or "Professor of Economics at George Mason". Conservative and libertarian sources are at least as subject to biases and worldview-driven interpretation as anyone. We all do this, you and me included.