I used to believe I was rational. Not perfectly rational - I knew about cognitive biases from my psychology course - but fundamentally logical in how I approached big decisions.

I remember the exact moment I realised I wasn’t as smart as I thought I was.

It was 3 AM on a Tuesday. I was hunched over my laptop in my kitchen, the blue light burning my eyes, scrolling through study after study that contradicted everything I believed about climate change. My coffee had gone cold hours ago. My back ached from the cheap IKEA chair I’d been glued to since dinner.

And I was furious.

Not at the data. At myself. Because I could feel my brain doing exactly what I’d always mocked other people for doing - twisting, spinning, rationalising away anything that didn’t fit the neat little worldview I’d spent years constructing.

There’s this study by some guy at Yale, Dan Kahan. He found that the higher someone’s analytical reasoning skills, the more likely they were to interpret scientific data to support their political beliefs - even when the data clearly contradicted those beliefs.

Among the really analytical folks - the ones who pride themselves on being logical - 68% rejected climate science when it threatened their political identity. Among the “less smart” people, only 23% did.

I stared at those numbers until my vision blurred. I was the 68%.

The Mythology We Tell Ourselves

You know that kid in high school who always had to be right? The one who’d argue with the teacher, who’d spend twenty minutes after class explaining why their answer was actually correct?

That was me. Except I never grew out of it.

I carried this delusion into college, into my career, into every dinner party conversation where I’d smugly fact-check people on their phones. I thought I was the rational one. The evidence-based one. The guy who changed his mind when presented with better data.

What a fucking joke.

The truth is, I was just better at being wrong. Better at finding sophisticated reasons to believe whatever I wanted to believe. My psychology degree had taught me about cognitive biases, sure, but somehow I’d convinced myself that knowing about them made me immune to them.

As if knowing about viruses made you immune to catching a cold.

I started watching myself more carefully after that night. Really watching. And what I saw made my stomach turn.

In the grocery store, I’d unconsciously grab organic vegetables while rolling my eyes at the “anti-science” people buying conventional produce.

I’d listen to podcasts that confirmed my existing beliefs while my brain automatically catalogued reasons to dismiss anything that didn’t.

I’d share articles on social media without reading past the headlines, as long as they supported my team.

My team. Jesus. When did I start thinking about truth as a team sport?

When Your Body Fights Your Brain

What nobody tells you about changing your mind is that it physically hurts.

UCLA researchers hooked people up to brain scanners and showed them information that challenged their core beliefs. The anterior cingulate cortex lit up - the same region that fires when you burn your hand on a stove.

I didn’t need a brain scanner to feel it. Every time I encountered data that threatened something I cared about believing, my chest would tighten. My jaw would clench. As overdramatic as it sounds, my heart would occasionally speed up like I was being chased.

Your body treats new information like a predator.

I noticed it most clearly during political conversations with my brother-in-law. He’s one of those guys who gets his news from places I’d been trained to dismiss. Fox News. Talk radio. Facebook posts from high school friends who never left our hometown.

One Thanksgiving, he brought up vaccine side effects. Before he’d even finished his sentence, my brain was already assembling counterarguments, finding reasons his sources were unreliable, preparing to demolish his position.

I wasn’t listening to understand. I was listening to attack.

But here’s what fucked me up: later that night, when I actually looked up some of his claims, a few of them checked out. Not all of them, not even most of them, but enough to make me realise I’d been dismissing potentially valid concerns because they came wrapped in the wrong political packaging.

My sophisticated bullshit detector wasn’t detecting bullshit. It was detecting threats to my identity.

The Comfort Food of Confirmation

I used to make fun of my mom for only watching MSNBC. “Mom,” I’d say, “you’re living in a bubble.” Meanwhile, I was living in my own bubble, just one with better production values and more graduate degrees.

My news diet was junk food disguised as nutrition. I’d gorge myself on articles that confirmed what I already believed while telling myself I was staying informed. I subscribed to newsletters that felt like having smart friends validate my opinions. I followed Twitter accounts that made me feel clever for agreeing with them.

When I encountered information that challenged my beliefs, I’d immediately start looking for problems with it. Who funded the study? What’s the methodology? Are there conflicts of interest? But when I found information that supported my views, those same critical thinking skills mysteriously vanished.

I had a double standard, and I was too proud to see it.

The Beautiful Lie

For three hundred years, we’ve been telling ourselves this story: that humans are rational creatures who form beliefs through careful analysis of evidence. Schools teach critical thinking. Courts rely on logical argument. Democracy assumes voters make informed choices.

It’s all built on a lie.

Not a malicious lie. A beautiful, necessary lie that makes civilisation possible. But watching myself think, really watching, made me realise how much of what I called “reasoning” was actually just sophisticated rationalisation.

I’d make a decision based on gut feeling, then reverse-engineer logical arguments to support it. I’d claim to follow the evidence wherever it led, while unconsciously seeking evidence that led where I wanted to go.

The scariest part wasn’t that I was biased. The scariest part was how invisible the bias was to me. I felt perfectly rational while being completely irrational.

It’s like being drunk and insisting you’re sober. The alcohol doesn’t just impair your judgment, it impairs your ability to recognise that your judgment is impaired.

I wish I could tell you that recognising my bias cured me of it. That would make for a better story, wouldn’t it? The hero’s journey of intellectual humility.

But it doesn’t work that way.

I still catch myself doing all the same things. I still gravitate toward news sources that make me feel smart and informed. I still feel that physical resistance when someone challenges beliefs I’m attached to.

The difference is that now I notice it happening.

It’s like learning to recognise when you’re getting sick. You still get sick, but you’re not surprised by it anymore. You can take medicine earlier. You can rest before you’re completely wrecked.

I’ve started treating my own thinking like a wild animal that needs to be watched carefully. Powerful, useful, but not entirely trustworthy.

When I feel that familiar tightness in my chest during a disagreement, I try to pause. When I catch myself looking for flaws in information I don’t like, I try to apply the same scrutiny to information I do like. When I realise I’m seeking confirmation rather than truth, I try to seek disconfirmation instead.

It’s exhausting. And I fail at it constantly.

What I’m Learning to Live With

Here’s what I’ve figured out, or think I’ve figured out: we’re not rational creatures with emotions. We’re emotional creatures with rationality.

The emotions come first. The reasoning comes after. Emotions dress up in logical clothing so they can walk around in polite society.

This doesn’t mean we should give up on truth or logic or evidence. It means we should be honest about how much harder it is to think clearly than we pretend it is.

My therapist - yes, I have a therapist now - says the goal isn’t to eliminate emotions from reasoning. It’s to get better at recognising when emotions are doing the reasoning.

“Notice the feeling,” she says. “Name it. Then decide if you want to let it drive.”

Easier said than done. But I’m trying.

I’m trying to read things that make me uncomfortable. I’m trying to steel-man arguments I disagree with instead of straw-manning them. I’m trying to hold my beliefs more lightly, like birds that might fly away if I squeeze too hard.

The smartest thing you can say isn’t “I’m right.” It’s “I don’t know, but here’s what I think and why, and I could be wrong.”

That’s still hard for me to say. Probably always will be.

But it beats the alternative of living in a prison made of my own certainty, spending my intelligence on defending errors instead of discovering truth.

I’m still that kid who wants to be right all the time. I just hope I’m getting better at wanting to be right about what’s actually true.

Rationality Is Just Emotion in Disguise