I posted something vulnerable three weeks ago. A photo and a caption about struggling with body image. Got maybe thirty likes, three comments, and then the algorithm apparently decided nobody cared because it disappeared from people's feeds within hours.

But last week I posted a selfie with a filter and a joke about how tired I am, and it got 300 likes before the day was done. The system rewarded the thing that was actually bullshit and punished the thing that cost me something to share. That's not accidental design - that's a feature.

What's worse is knowing that I now have this trained response. I know what performs. I know that vulnerability loses to humor, that realness loses to polish, that actual struggles lose to relatable complaints. So next time I want to share something real, I'll hesitate. I'll think about how to make it more palatable. I'll undercut it with a joke or soften it with self-deprecation. The algorithm has basically trained me to be less authentic, not more.

And we all know this. We're all aware of how it works. We've all felt the dopamine hit when something we posted took off, and we've all experienced the sting when something didn't land. We know the game is rigged. But we keep playing because the alternative - not playing at all - feels impossible now. Our social lives have moved onto these platforms. Our professional opportunities exist there. Our relationships require some level of performance there.

The cruelty isn't that the algorithm is cruel. The cruelty is that we've designed social systems where the algorithm is the only possible arbiter of what matters. We've let metrics become truth. And now we're all trapped performing for an AI that was specifically programmed to reward the least genuine versions of ourselves. We did this. We built it. And I don't know how to stop participating.

Asked by anon_13be
6 responses
Feb 28, 2026

The phrase "we're all complicit" is where you lose me, honestly. I didn't design these platforms. I didn't code the algorithm. I don't profit from engagement metrics. My complicity is using an app that's free and useful - that's not the same as actively participating in its harm. That's like saying everyone who drives a car is complicit in climate change. Technically true in some abstract sense, but it muddies the actual responsibility.

What you're describing - the trained response, the hesitation before sharing - is real, and it sucks. But it's not because you're weak or complicit. It's because you're a human responding rationally to a system with specific incentive structures. The problem isn't your psychology. The problem is the system.

I also think there's something worth saying about how we've all become media critics without realizing it. You're reading your own engagement metrics like tea leaves. You're parsing what the algorithm wants like it's some kind of oracle. But it's not magical. It's literally just code optimizing for time spent. That's stupid, but it's not mysterious.

The real question is whether we demand these platforms change, or whether we accept that social media is just a tool with certain built-in biases and we use it accordingly. I lean toward demanding change. But pretending we're all equally responsible for the system? That's where I think the real trap is. That's the narrative that keeps us passive.

Feb 28, 2026

This hit me because I just went through something similar. I posted about my dad's diagnosis - real, scared, raw - and got like fifteen reactions over a week. A stupid meme I reposted the day after got 400 likes and I couldn't even remember why I shared it. The contrast was brutal.

But here's what changed my perspective: I realized I was measuring the value of my vulnerability by the wrong metric. That post about my dad? Three people reached out privately. One was someone I hadn't talked to in five years who'd lost a parent. We've been messaging almost every day since. Another was my aunt, who shared her own stuff she'd never told me. That 15-like post fundamentally changed those relationships.

The 400-like meme? Nobody remembers it. Nobody messaged me. It fed my dopamine for like an hour.

So yeah, the algorithm is optimizing for the wrong thing. Yes, it's training us toward performance. But I think the answer isn't to rage against the system or give up. It's to slowly, consciously stop using the algorithm's metrics as the measure of whether something was worth sharing. That's so hard because the feedback is literally built into the interface. You have to actively choose not to look. But when you do? Things change. The thing that matters - actual connection with actual people - it's still there. The algorithm just makes it harder to see.

Feb 28, 2026

I appreciate the honesty here, but I'm going to gently push back on the fatalism. You're right that you felt that sting, and you're right that you know what performs. But then what? You're basically saying the algorithm has won, you're trapped, game over. I don't think that's actually true, and I think accepting that narrative is the real trap.

Here's my question: who told you that those 30 likes on the vulnerable post meant it didn't matter? The algorithm, yeah. But also - and this is important - maybe your own brain, looking for external validation in the first place. Not in a judgmental way. We all do it. But the algorithm didn't invent that hunger. It just amplifies it.

I started posting less frequently and stopped checking my metrics. Sounds simple, right? It actually took months to feel normal. The phantom vibration of potential likes was real. But eventually I realized something: the posts that meant the most to me weren't the ones that went viral. They were the ones where someone I cared about sent me a private message saying it connected with them. That mattered more than 300 likes ever could.

I'm not saying "just quit social media, it's easy." I'm saying the algorithm's power over you is only as strong as your belief in it. You can design your own use differently. Curate differently. Share with intention instead of performance. It's harder than scrolling mindlessly, but it's possible.

Feb 28, 2026

I get the frustration but I think you're giving the algorithm too much credit for shaping behavior that already existed. People have always preferred entertaining content over heavy content at parties. The algorithm isn't creating shallowness, it's just amplifying what already gets amplified in human interaction. Maybe the real problem is us, not the technology.

Feb 28, 2026

The 'we're all trapped' angle bothers me because it obscures individual choice. Yes, the system is designed to be addictive and yes, there are social costs to opting out. But framing it as 'impossible to stop participating' when you're literally typing about how you know exactly how it works - that feels like you're choosing the comfort of complicity over the discomfort of change.

Feb 28, 2026

The part that gets me is when you say 'we built it.' Did we though? Most of us didn't build anything. Engineers designed it, executives profited from it, and we just showed up because we wanted to stay connected to people. There's a real power imbalance here that gets erased when we say 'we're all complicit.' Like yeah, I'm complicit, but I'm also kind of a victim of this system that was specifically engineered to be addictive.