Discussion about this post

Simas Kucinskas

Great post & very clearly written, too.

The mechanism you present seems very clever. However, I wonder if something simpler matters more in practice, namely biased recall. Suppose you ask me to guess whether a coin will come up heads or tails; I make a guess, but you will only flip the coin a year later. A year later, you flip the coin, show me it's tails, and ask me, "Hey, what did you guess a year ago?" I'm willing to bet many people, even if they were being fully honest, would say "Tails."

I'm not familiar with the literature, but there must be some way to differentiate between the Bayesian mechanism with ambiguity that you describe and simple biased recall.
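[A minimal simulation may make the difference testable. This is an editor's sketch, not from the post; the calibration model, noise level, and bias parameter are all invented for illustration. The idea: a Bayesian with a noisy memory of her own past credence shifts her recollection toward the outcome only when past credences genuinely carried information about outcomes, whereas biased recall shifts either way, so the uninformative coin flip above is exactly the diagnostic case.]

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(0.001, 0.999, 999)  # candidate values of the past credence p

def bayes_recall(memory, outcome, sigma, credences_track_truth):
    """Posterior mean of the past credence p, given a noisy memory of p
    and the observed outcome (uniform prior over the grid)."""
    like_mem = np.exp(-0.5 * ((grid - memory) / sigma) ** 2)
    if credences_track_truth:
        like_out = grid if outcome == 1 else 1 - grid  # calibrated: P(Y=1 | p) = p
    else:
        like_out = np.full_like(grid, 0.5)             # coin: outcome says nothing about p
    post = like_mem * like_out
    return np.sum(grid * post) / np.sum(post)

def biased_recall(memory, outcome, b=0.3):
    """Memory dragged toward the outcome regardless of the evidence."""
    return (1 - b) * memory + b * outcome

for track in (True, False):
    shift_bayes, shift_biased = [], []
    for _ in range(2000):
        p = rng.uniform(0.2, 0.8)                          # true past credence
        y = int(rng.random() < (p if track else 0.5))      # outcome
        m = float(np.clip(p + rng.normal(0, 0.15), 0, 1))  # noisy memory of p
        toward = 1 if y else -1                            # sign so "+" = shift toward outcome
        shift_bayes.append(toward * (bayes_recall(m, y, 0.15, track) - m))
        shift_biased.append(toward * (biased_recall(m, y) - m))
    print(f"credences informative: {track} | "
          f"Bayes shift {np.mean(shift_bayes):+.3f} | "
          f"biased-recall shift {np.mean(shift_biased):+.3f}")
```

[When credences are informative, both mechanisms shift recollections toward the outcome; in the pure coin-flip case only biased recall does, so the coin thought experiment would in principle separate the two.]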

julia

Hey Kevin, interesting post, thanks! This argument seems to me to have some weird implications for how subjects should view their current epistemic position. If you're right that rational agents should exhibit hindsight bias when they trust themselves and when they're uncertain about their present (or past) probability distribution, then it seems that I, who am now aware of this conclusion, should reason as follows when I am asked my current probability (at t1) for the proposition that Biden is too old to campaign effectively:

"Well, I'm not absolutely certain what I think. If I had to put a number on it, I'd bet 60-40 that Biden is too old. However, I also know that at some later time (t2) evidence will come in that will enable everyone to know whether Biden is too old. And I also know that after that evidence comes in, I'll rationally shift my opinions about what my credences were at t1, such that if it turns out that Biden is too old, I'll believe at t2 that I was 70-30 (or something in that ballpark, depending on how great the hindsight bias shift is) at t1, and if it turns out Biden wasn't too old, I'll believe at t2 that I was 50-50 at t1. So, assuming it's rational to defer to my future self (who after all has better evidence than I do) it seems that I should now believe that my current credence that Biden is too old is either 70 or 50, though I'm not sure which."

There seems to be something very odd about a view which implies that rational agents should regard themselves as having a credence in P that is either significantly more or significantly less than their current best estimate of the odds of P. What strikes me as possibly even weirder is that they should think that this is the case because their opinions track (are correlated with?) the truth in a manner that is nevertheless not detectable to them. At best, such an agent strikes me as confused; at worst, incoherent. Thoughts? Am I missing something? Maybe your reaction would just be to reject the rationality of deference?
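[The arithmetic in julia's hypothetical can be made explicit with a short sketch. The numbers below are her illustrative values from the comment, not anything from the post.]

```python
# julia's illustrative numbers (hypothetical values from her comment)
p_too_old = 0.60        # current (t1) best estimate that Biden is too old
recall_if_true = 0.70   # what I'd believe at t2 my t1 credence was, if he is too old
recall_if_false = 0.50  # ...if it turns out he is not too old

# Under deference, my estimate of my own t1 credence is a random variable
# taking the value 0.70 or 0.50 -- never 0.60. Its expectation under my
# current credence:
expected = p_too_old * recall_if_true + (1 - p_too_old) * recall_if_false
print(expected)  # 0.62
```

[This sharpens the worry: on these numbers the deferred estimate of the t1 credence is never 0.60, and its expectation, 0.62, also disagrees with the current best estimate.]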
