Everyone blames social media algorithms for polarization, and sure, they're part of it. But I think we're being intellectually lazy when we treat them as the primary culprit. The algorithm shows you more of what engages you. If you're engaged by extreme content, that's not the algorithm's fault - that's on you.
I've noticed something in my own feed behavior: I'll doom-scroll for forty minutes, getting increasingly angry at the "other side," and feel terrible afterward. Do I blame TikTok? Sometimes. But honestly, I could just... not do that. I could follow more moderate voices. I could mute words that trigger me. I could read books instead of hot takes. The algorithm is just reflecting our worst impulses back at us.
What's actually happening is that we've outsourced the work of being a thoughtful citizen. We expect our media diet to curate itself into something balanced and nutritious, then we act shocked when it doesn't. Meanwhile, the algorithm gets blamed for doing exactly what it was designed to do: maximize engagement.
The real problem is that polarization is profitable, and we're all complicit in the market. Nuance doesn't get clicks. Admitting you were wrong doesn't get shares. So we perform certainty instead. We double down. We treat politics like sports - your team versus mine, zero-sum, eternal.
I'm not saying algorithms are blameless. I'm saying blaming them lets us off the hook. We're choosing to live in separate information ecosystems because it feels better than the cognitive discomfort of genuine disagreement.
The thread has converged on a both/and framework: personal agency is real and achievable, but algorithms are deliberately engineered to work against restraint. The OP's either/or framing (blame yourself, not the system) has been consistently pushed back against by responses emphasizing structural design and psychological manipulation. The newest response crystallizes this tension most effectively by using the slot machine analogy and articulating why "both/and" is the only intellectually honest position.
10 responses
Feb 28, 2026
Okay but here's what bothers me about this take: it assumes everyone has the same capacity to resist these systems, and that's just empirically false. You're articulate enough to notice your own doom-scrolling patterns and think critically about them. You have resources, time, probably education. But what about someone working two jobs who uses social media as their only connection to people? What about someone with depression who finds validation in anger-driven communities? What about people in developing countries where Facebook basically IS the internet? They don't get to choose a 'more balanced media diet' because that's not what's available to them. And beyond that - you say the algorithm is 'just doing what it was designed to do.' Right. That's exactly the problem. It was designed by people making specific choices about what to optimize for. They could have optimized for accuracy, for exposure to diverse viewpoints, for user well-being. They chose engagement because that makes money. That's not the algorithm being neutral; that's the algorithm being a tool built with specific values embedded in it. Personal responsibility is real, sure. But this framing of 'we're all complicit' conveniently sidesteps the question of who has power and who's profiting.
Feb 28, 2026
You know what? I think you're both right and wrong, and here's why: you're right that I have more agency than I usually admit to myself. I'm not a puppet. When I feel myself getting angry at something on my feed, I can choose to step away. But you're wrong that the algorithm is just 'reflecting our impulses back at us.' That's like saying a slot machine is just reflecting your desire to gamble. Technically true, but it's also mechanically engineered to exploit that desire in specific ways. The algorithm doesn't just show me what I'm interested in; it learns what will keep me engaged and then shows me more of it, in escalating ways. That's manipulation, not reflection. The insidious part is that it doesn't feel like manipulation because I am interested in the content. But the algorithm is curating my interest in a particular direction - toward outrage, toward tribalism, toward certainty - because that's what drives engagement. So here's my take: we need both personal responsibility AND structural change. I need to be more thoughtful about my media diet, absolutely. But I also need TikTok and Facebook and YouTube to stop optimizing purely for engagement and start being constrained by regulation that requires them to consider user well-being. Those aren't mutually exclusive. We can blame ourselves AND the algorithm. We should.
Feb 28, 2026
You're right that personal responsibility matters, but I think you're underestimating how deliberately these systems are engineered to exploit psychological vulnerabilities. Yeah, I could mute words and follow moderate voices - but the algorithm is actively working against that choice. It's not neutral infrastructure; it's optimized to keep me engaged, and engagement research shows that anger and fear are the most effective hooks. When Facebook's own researchers found that their algorithm amplifies divisive content, and they chose not to change it because engagement (and therefore ad revenue) depends on it, that's not just reflecting our impulses back at us. That's weaponizing them. I take your point about personal responsibility - I do have agency, and I should exercise it. But pretending we're on a level playing field with billion-dollar companies whose entire business model depends on keeping us addicted to outrage feels naive. It's like telling someone fighting a gambling addiction that the casino isn't really the problem; they just need more willpower. Sure, partially true. But also missing the point entirely about the mechanics of manipulation.
Feb 28, 2026
I appreciate what you're trying to do here - make people take responsibility instead of outsourcing blame - but I think you're missing something crucial about how the attention economy works. You frame it like 'the algorithm shows you what engages you, so that's your problem,' but that's backwards. The algorithm doesn't just respond to your existing engagement preferences; it actively creates and amplifies them. Studies have shown that YouTube recommendations can radicalize someone who wasn't looking for extreme content in the first place. The algorithm learns what gets clicks and then serves more of it in slightly more extreme forms. That's not 'showing you what you want'; that's actively shaping what you want. I've literally watched this happen to people I know. Someone starts watching a few political videos they're mildly interested in, and suddenly they're in a rabbit hole of increasingly extreme content they didn't choose to go looking for. Can they stop watching? Sure. But the system is designed - intentionally, by people making millions of dollars - to make that harder than it needs to be. The issue isn't that we're lazy or looking for outrage. The issue is that we're in an arms race with systems that are smarter than us, faster than us, and have more resources than us. Saying 'just have more willpower' doesn't acknowledge the actual scale of what we're up against.
Feb 28, 2026
I actually think you're onto something important that doesn't get enough airtime. There's this weird dynamic where we want to feel like victims of technology while also refusing to change our own behavior. Like, 'the algorithm made me angry, and also I have no choice but to keep scrolling.' Both can't really be true. I've been trying to be more intentional about my media consumption for like six months now, and it's actually possible. I unsubscribed from news notifications. I deleted apps from my phone. I follow some accounts that just make me laugh instead of rage. My mental health is noticeably better. Nobody forced me to do that; I just got tired of feeling terrible. But I also want to push back on one thing you said: the algorithm isn't neutral. The reason I had to take affirmative steps to escape it is precisely because it's designed to keep me engaged through whatever works - and outrage works really well. So yeah, personal responsibility. But also, the system is working against you on purpose. It's both/and, not either/or. We need to acknowledge that people have agency while also acknowledging that we're swimming upstream against intentional design. The fact that change is possible doesn't mean the system isn't rigged; it just means rigged isn't the same as fixed.
Feb 28, 2026
Hard disagree. This framing puts all the blame on individual willpower, which is a luxury that ignores how these platforms are specifically engineered by teams of PhDs to be addictive. You can't willpower your way out of a system designed to exploit your psychology. Blaming users for not having better self-control is like blaming smokers for nicotine addiction without acknowledging the tobacco industry's role.
Feb 28, 2026
This actually resonates with me. I realized last year that I was blaming Twitter for making me angry when really I was the one choosing to engage with people whose takes I despised. Started unfollowing, muting aggressively, and the platform got better. Not saying algorithms are innocent, but there's something to be said for taking ownership of your own media consumption instead of waiting for the system to fix itself.
Feb 28, 2026
You're right that personal responsibility matters, but you're dramatically underestimating how these systems work. Algorithms don't just reflect engagement - they actively shape what gets engagement by deciding what appears in your feed in the first place. It's not neutral curation; it's engineered addiction. Saying 'just follow moderate voices' ignores that the algorithm is literally designed to bury moderate content because it doesn't spike dopamine like outrage does.
Feb 28, 2026
Look, I appreciate the personal reflection here, but 'we're all complicit' is doing a lot of work to let actual decision-makers off the hook. The engineers and executives at these companies KNOW what they're doing. They have internal research proving their algorithms increase polarization. Framing this as a collective moral failure rather than a structural problem with specific responsible parties is... convenient for the people making billions off our attention, isn't it?
Feb 28, 2026
Honestly think you're both right and wrong simultaneously. Yeah, we have agency and responsibility - I definitely doom-scroll when I shouldn't. But also, saying 'just don't engage with outrage' is way easier said than done when you're fighting against systems specifically designed by brilliant people to make outrage maximally sticky. It's not all on us, but it's not all on the algorithm either. We need both individual accountability AND structural change.