Anonymous
08/13/2019 (Tue) 11:22:36
No.6300
You don't need to throw out all of epistemics. You need to be wary of ideology. Wrong ideological opinions are caused less by an impaired ability to find the truth than by bad incentives.
Even if communism is fundamentally correct, being communister and less willing to accept conventional economics than the next guy marks you as a more reliable ally to the cause.
Even if climate change is real and catastrophically dangerous, being more alarmist than the next guy will get you more attention.
Even if Python is the best general-purpose programming language today, playing down its performance problems suggests you're better at writing fast Python code.
Even if the rationalist approach to epistemics is correct and useful, overusing terms like "Chesterton's fence" and "planning fallacy" makes you sound rationalister.
So what do you do? Well, you may have to ignore some of the most heavily politicized areas of knowledge. But keep in mind that they seem more important than they really are, because they're politicized. They're overrepresented. They make up a smaller share of interesting uncertain knowledge than you'd intuitively think.
Giving in to social incentives in these areas is ok. You're going to do it whether you resist or not, and doing it deliberately may even make it easier to occasionally notice that you're biased.
If you want to get accurate knowledge, try topics that aren't very politicized, and try to find people who are primarily rewarded for accurate predictions.
Prediction markets are alright to trust. A lot of scientific research is fine, or can at least be expected to be closer to the truth than your baseline. A lot of it isn't, but if useful, verifiable, true predictions are the norm in a sub-area, it's probably a good sign.