Confirmation Bias
The tendency to notice, remember, and trust evidence that supports what you already believe.
In plain terms
Confirmation bias is what happens when your beliefs start doing the selecting. You give supportive evidence a pass, you scrutinize contradictory evidence hard, and you come away from a pile of mixed data more confident than when you started.
It doesn't feel like bias from the inside. It feels like being right.
Why it matters
Ordinary reasoning assumes you're weighing evidence on its merits. Confirmation bias violates that assumption quietly. You're not refusing to consider the other side — you're just finding a hundred small reasons to set each piece of their evidence aside, while waving yours through.
The result: beliefs get more entrenched the more you read about them, especially online where you can curate the inputs. More information doesn't move you toward the truth. It moves you toward a more decorated version of what you already thought.
Canonical example
Someone convinced a certain politician is corrupt reads ten news stories about them. The two critical pieces feel incisive and well-sourced. The eight favorable pieces feel like fluff or propaganda. The reader walks away thinking the coverage confirms their view — but if a supporter had read the same ten stories, the framing would flip. Both sides feel vindicated. Neither has updated.
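If you want to see the mechanics, here is a minimal sketch of that dynamic. All the numbers are invented for illustration: belief is tracked in log-odds, and asymmetric scrutiny is modeled as a discount applied to any story that points against the reader's current belief. The function name and the weights are this sketch's assumptions, not anything measured.

```python
def read_stories(prior_log_odds, stories, discount):
    """Update a belief, expressed in log-odds, over a list of stories.

    Each story is a log-likelihood ratio (positive favors "corrupt").
    A story that points against the reader's current belief has its
    weight multiplied by `discount`: 1.0 models a fair reader,
    anything below 1.0 models asymmetric scrutiny.
    """
    belief = prior_log_odds
    for log_lr in stories:
        agrees = (log_lr > 0) == (belief > 0)
        belief += log_lr if agrees else discount * log_lr
    return belief

# The canonical example, with made-up strengths: two critical stories
# (+1.0 each, toward "corrupt") and eight favorable ones (-1.0 each).
# Both readers see the exact same ten stories.
stories = [1.0, 1.0] + [-1.0] * 8

for label, prior in [("critic", +1.0), ("supporter", -1.0)]:
    fair = read_stories(prior, stories, discount=1.0)
    biased = read_stories(prior, stories, discount=0.1)
    print(f"{label:9s} prior {prior:+.1f} -> "
          f"fair {fair:+.1f}, biased {biased:+.1f}")
```

Run it and the fair readers (discount 1.0) both land on the same side of the evidence, while the biased readers (discount 0.1) each drift further from zero in their own direction: identical coverage, both sides vindicated. The discount knob is the asymmetry of scrutiny discussed in the counter-example below.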
Counter-example (not confirmation bias)
Updating toward a prior belief because the evidence genuinely supports it isn't bias. If you think smoking causes cancer, and you read a well-run study showing that smoking causes cancer, becoming more confident is the right response. The study earned it.
The tell isn't the direction of your update. It's the asymmetry of scrutiny. If you would have picked apart a study that showed the opposite result, while letting this one slide, bias is in the room.
How to work around it
You can't turn it off. You can slow it down:
- Before reading, ask what evidence would change your mind. If the answer is "none," stop reading; you're not actually considering the question.
- Seek out the strongest version of the opposing view, not the weakest. The weak version is easy to dismiss, which is why the algorithm feeds it to you.
- Notice when critical and supportive sources get different standards of evidence. That gap is the bias showing.
How to call it out
Confirmation bias is rarely worth pointing out mid-argument. Saying "you're just confirming what you already believe" invites the response "and you aren't?" The honest answer is usually yes. Better to ask: "What would change your mind on this?" If there's no answer, the conversation has already told you everything you needed to know.