Very good, I have never grasped this before, but your explanation here makes it very clear.
Presumably, experimentally, we could do two things:
- if we had a more precise estimate of the strength and direction of the "Gambler's Fallacy" bias in people's probability estimates, we could see whether it matches the predictions of Bayesian rationality
- if we could experimentally manipulate people's priors for Steady/Sticky/Switchy, we could test whether their Gambler's Fallacy behaviour responds in the way a Bayesian rational actor would predict
Definitely. Yeah, if there were some way to measure how strong people's priors were for Switchy/Steady/Sticky, and how they expected the sequence to evolve under each of those, we could see whether those expectations predict their GF rates and whether their beliefs about them evolve in a sensible way in response to new data. That'd be a bit hard to measure accurately, of course.
It's probably easier to try to manipulate their beliefs, as you say. There's a way of reading the results we found in this paper (https://pubmed.ncbi.nlm.nih.gov/40472199/), which I wrote with Yang Xiang and Sam Gershman, along those lines: when you do extra work to make sure people think the process is IID, and aren't suspicious of the strings, their probability estimates stay closer to 50%. (But there are other things going on in that paper; the data we found on binary predictions, rather than estimates, don't fit so well with this theory.)
My sense is that some of the dynamics of the theory wouldn't be that hard to test, so I'm thinking about doing it! Other things have priority right now, though.
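To make the dynamics concrete, here is a minimal sketch of the kind of Bayesian actor the theory describes, with three hypotheses about how likely the process is to repeat its last outcome. The persistence values and prior weights below are illustrative assumptions, not numbers from the paper.

```python
# Minimal sketch of a Bayesian actor who is unsure whether a binary process
# is Switchy, Steady (IID), or Sticky. All parameter values are illustrative.

# Each hypothesis is summarised by the probability that the next outcome
# REPEATS the previous one.
HYPOTHESES = {
    "switchy": 0.4,  # outcomes tend to alternate
    "steady": 0.5,   # IID fair process
    "sticky": 0.6,   # outcomes tend to repeat
}

# Prior over hypotheses, leaning towards "switchy".
PRIOR = {"switchy": 0.6, "steady": 0.3, "sticky": 0.1}


def posterior(flips, prior=PRIOR, hypotheses=HYPOTHESES):
    """Posterior over the three hypotheses given an observed sequence."""
    post = dict(prior)
    for prev, curr in zip(flips, flips[1:]):
        for h, p_repeat in hypotheses.items():
            post[h] *= p_repeat if curr == prev else 1.0 - p_repeat
    total = sum(post.values())
    return {h: p / total for h, p in post.items()}


def p_streak_continues(flips, prior=PRIOR, hypotheses=HYPOTHESES):
    """Posterior predictive probability that the next flip repeats the last one."""
    post = posterior(flips, prior, hypotheses)
    return sum(post[h] * hypotheses[h] for h in hypotheses)


if __name__ == "__main__":
    # After a short streak of heads, an actor whose prior leans "switchy"
    # expects the streak to break: P(another head) < 0.5, i.e. the
    # gambler's fallacy. As the streak grows, the data favour "sticky"
    # and the effect shrinks and eventually reverses -- one of the
    # dynamics that could be tested.
    for n in (3, 4, 6, 10):
        print(n, round(p_streak_continues(["H"] * n), 3))
```

Shifting the prior weight toward "steady" here is the analogue of the second experiment suggested above: the more firmly the actor believes the process is IID, the closer the predictive probability stays to 0.5.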
nicely put
clear, enjoyable read
Nice! Also enjoyed the paper. I hope you find an experimentalist to tease some of this out in the wild.
Always good to see LessWrong posts turn into philosophy papers!
Bayes’ theorem isn’t just math; it’s a philosophy of rational living.
It says:
Every belief you hold is provisional.
Every new piece of evidence should shift your confidence.
Rationality is a lifelong process of updating, not defending.
In a world drowning in misinformation, Bayes is almost revolutionary. It teaches that truth isn’t a fortress you defend; it’s a probability you refine. That’s why some thinkers argue it’s the single most important tool for human rationality.
Imagine if public discourse worked this way: instead of shouting absolutes, we’d be negotiating probabilities, adjusting beliefs as evidence accumulates. That’s not just math; it’s a blueprint for a more rational civilization.
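As a toy illustration of that kind of refining (every number here is made up): a single piece of supporting evidence shifts a weak belief, but rarely settles it.

```python
# Toy Bayesian update; all numbers are made up for illustration.
prior = 0.01           # initial confidence in a hypothesis
p_e_given_h = 0.9      # how likely the evidence is if the hypothesis is true
p_e_given_not_h = 0.1  # how likely the evidence is otherwise

# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # ~0.083: confidence shifts up, but stays provisional
```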