It's impossible to support gun control without at least marginally increasing the epistemic probability that you are a person who should be shot.
A rationalist would likely say this is less a problem with the update itself than with forgetting what you already know each time you learn something new, and that knowledge which doesn't "bind anticipation" isn't real knowledge. But perhaps some knowledge zeroes out, the way a tared scale does, for a reason?