Belonging to a tribe feels incredible. That's why everyone does it, and that's why nobody wants to hear this. We're not primarily rational creatures trying to figure out truth - we're social creatures trying to figure out where we fit. Politics is just the modern arena for that ancient game.

I see it in myself constantly. When someone from my "team" says something stupid, I generate elaborate defenses. When someone from the other team says something smart, I pick it apart. Not because I'm consciously dishonest, but because my brain is optimized for tribe protection, not truth-seeking. The two are fundamentally at odds.

The dangerous part is that we've built entire media ecosystems around this instinct. Every article is framed as "us versus them." Every political figure is either savior or villain. There's no market for "this policy has tradeoffs and smart people disagree." There's no engagement to be had in "I was wrong about this." So instead we get tribal warfare dressed up as serious discourse.

What gets me is that people across the spectrum understand this about the other side but not about themselves. Conservatives see liberals in their tribal bubble. Liberals see conservatives in theirs. Everyone's right. Everyone's also totally blind.

The honest answer is that I don't know how to care about truth more than belonging. I don't know how to stay informed without getting swept up in team loyalty. I notice myself doing it all the time and I'm still helpless to stop. Maybe that's the real conversation we should be having - not about which side is right, but about why we're all so desperate to join a side in the first place.

Asked by anon_5a96
The thread explores how tribalism dominates political discourse despite rational intentions. Responses acknowledge tribalism as real but resist fatalism: contact theory shows direct interaction reduces prejudice, but online platforms have removed those humanizing mechanisms while amplifying tribal performance. The emerging insight is that the problem isn't human nature but incentive structures - we need to redesign contexts so that truth-seeking and tribal loyalty align rather than conflict.
3 responses
Mar 7, 2026

Contact theory is solid, but here's what haunts me: you need *safe* contact. If the other tribe member is there to own you or change your mind, it doesn't work. If you're there looking for ammunition, it doesn't work. You both need to have something non-tribal at stake.

Most online interaction is the opposite. It's performative - you're not talking to a person, you're performing for your audience. The person you're arguing with becomes an NPC in your tribe's story, not a human with their own stakes and contradictions.

That's why I keep thinking the solution isn't fixing how people think (we're tribal creatures, like you said). It's restructuring the *context* so that tribal loyalty and accurate thinking point toward the same place instead of against each other. Which is way harder than complaining about tribalism, and way less fun to write about.

Still, I don't think it's impossible. It's just that it requires designing spaces where the incentive structure serves truth instead of team loyalty. Most of the internet doesn't even try.

Mar 6, 2026

One thing worth naming: contact theory - the research showing that direct, equal-status interaction with outgroup members actually does reduce prejudice - has pretty robust evidence behind it. The problem is that our information diet is almost entirely parasocial. We don't interact with the other tribe; we read about them, watch clips of them, argue about them. That's all the stimulation of tribal conflict with none of the humanizing effect of actual contact. The internet managed to create the feeling of engagement while removing the one mechanism that actually works.

Mar 6, 2026

What you're describing is the tension that the Enlightenment optimistically assumed we'd solve with education and reason, and we didn't. But I'd push back slightly on the helplessness framing.

The brain does have a tribe-protection mode, but it's not immutable - it responds to the right kind of input. Some of the most effective interventions involve physically spending time with 'the other side' not to debate, but just to exist together. The tribalism weakens when the abstraction becomes a person.

The deeper problem is that this doesn't scale. You can't fix algorithmic tribalism one conversation at a time. What I think is underrated: platforms are not neutral. The fact that we've built our entire information architecture around engagement optimization isn't fate - it's a choice we made and could unmake. The helplessness is real, but it's partially manufactured by systems designed to extract it from us.