### Lottery or car crash or you'll join a cult

Dec. 7th, 2016 09:55 am **cincinnatus_c**

Currently at Havelock: 2, which is about as good as we're getting today. But holy shit the sun's shining.

Maybe this illustrates something about why people are overconfident in their beliefs on controversial matters, and maybe it doesn't:

Let's stipulate (counterfactually) that intelligence is a matter of getting things right as opposed to wrong. Let's suppose that A is a person of average intelligence, and that A gets things wrong 20% of the time. Let's suppose that B (whom I think we can say is of very much above average intelligence) gets things wrong half as often as A does, i.e., 10% of the time. Let's stipulate (very much counterfactually) that each of A and B is as likely to be right in any given case as they are right generally (i.e., for any given case, there is an 80% chance that A gets it right and a 90% chance that B gets it right). Now: out of all cases, A and B will agree on 74% of them: in 72% of all cases they will both be right, and in 2% of cases they will both be wrong, so that, just considering cases where they agree, they will both be wrong in 2.7% of them. Out of all cases, they will disagree on 26% of them; on 8% of all cases A will be right and B will be wrong, and on 18% of all cases B will be right and A will be wrong, so that, just considering cases where they disagree, A will be wrong on 69% of those, and B will be wrong on 31%.
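The figures above follow directly from the stipulations (independence, A right 80% of the time, B right 90%). A minimal sketch checking the arithmetic:

```python
# Stipulated model: on every case, A is right with probability 0.8 and
# B with probability 0.9, independently of each other.
p_a, p_b = 0.8, 0.9

both_right = p_a * p_b                  # 0.72
both_wrong = (1 - p_a) * (1 - p_b)      # 0.02
agree = both_right + both_wrong         # 0.74
disagree = 1 - agree                    # 0.26

a_right_b_wrong = p_a * (1 - p_b)       # 0.08
b_right_a_wrong = p_b * (1 - p_a)       # 0.18

# Conditional error rates, as in the text:
wrong_given_agree = both_wrong / agree              # ~0.027 (2.7%)
a_wrong_given_disagree = b_right_a_wrong / disagree  # ~0.69
b_wrong_given_disagree = a_right_b_wrong / disagree  # ~0.31
```

All of the percentages quoted in the paragraph fall out of these six products and ratios.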

B knows (let's suppose) that B is wrong 10% of the time. A knows (let's suppose) that A is wrong 20% of the time. Both feel fairly well justified in believing whatever they believe. (Being wrong in one out of five cases is not great, but, ya know, Nate Silver gave Trump a 50% better chance of winning than that going into the election, and everyone (except you, of course) was still shocked when he won.) But when they disagree, each of them is wrong more than three times as often as they are in the habit of being wrong. A, obviously, is likely to be very overconfident. But even though B is much more likely to be right than A, B is also likely to be very overconfident. (However: if A can recognize people like B and is in the habit of being wrong when in disagreement with people like B, then A may actually be underconfident when it becomes apparent that B disagrees with A: A may simply assume that B is right. This would be a better strategy for A than maintaining that A is right, but a worse strategy (all other things being equal, e.g., supposing that there is no other goal than eventually getting things right) than maintaining belief that there is a 31% chance that A is right.)
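The strategy comparison at the end can be made concrete. The "stick" and "defer" accuracies follow directly from the stipulations; the Brier-score comparison is my own way of cashing out the claim that keeping a 31% credence beats wholesale deference — the post itself names no scoring rule, so treat that part as one illustrative assumption.

```python
# Stipulated model, as before.
p_a, p_b = 0.8, 0.9

disagree = p_a * (1 - p_b) + p_b * (1 - p_a)            # 0.26
p_a_right_given_dis = p_a * (1 - p_b) / disagree        # ~0.31

# Strategy 1: A sticks with A's own answer everywhere.
acc_stick = p_a                                         # 0.80

# Strategy 2: A adopts B's answer on every disagreement; A's final
# answers then coincide with B's on all cases, so A is right 90%.
acc_defer = p_a * p_b + (1 - p_a) * p_b                 # 0.90

# One (assumed) way to make "maintain a 31% credence" precise: score
# A's credence q that A is right with the Brier rule on disagreement
# cases (lower is better).
p = p_a_right_given_dis
def brier(q):
    # expected squared error of holding credence q when the true
    # probability that A is right is p
    return p * (1 - q) ** 2 + (1 - p) * q ** 2

# brier(p) < brier(0.0) < brier(1.0): the calibrated ~31% credence
# beats both "simply assume B is right" (q=0) and "insist A is
# right" (q=1), matching the ordering claimed in the text.
```

Deferring outright still beats sticking (0.90 vs 0.80 accuracy), but under this scoring the calibrated credence dominates both all-or-nothing options.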
