On Anger, Algorithms, and the Quiet Power of Resistance
It’s not exactly a secret anymore that algorithms on media platforms — from Facebook to YouTube to TikTok — tend to prioritize content with a negative spin. But what’s truly unsettling isn’t just that negativity grabs attention. It’s why it does.
Human attention is biased toward threat and uncertainty. It’s a product of evolution — we’re wired to pay more attention to danger than to calm or balance. Media algorithms, in their cold and indifferent logic, merely exploit that wiring. They’re not evil; they’re just ruthlessly efficient at feeding us what keeps us scrolling, clicking, and returning. And often, that’s outrage, fear, or some carefully crafted form of existential unease.
The real problem, then, isn’t just the platforms. It’s that they mirror back to us something we might not want to admit: that we’re more susceptible to emotional manipulation than we like to believe. This is a kind of existential discomfort Oliver Burkeman has written about — the idea that much of modern life is about trying to dodge discomfort, to find control where there is none. And algorithms know that, too.
In that sense, the negative spin isn’t just a byproduct of platform incentives — it’s a reflection of a deeper truth about our relationship with time, attention, and meaning. And unless we face that honestly, no amount of “fixing the algorithm” will change the outcome.
Psychological Perspective
Humans evolved to over-prioritize threats. This is called negativity bias — we react more strongly to negative stimuli than to positive or neutral ones. In the wild, this bias helped us survive (“Better to mistake a stick for a snake than a snake for a stick”). But in the digital age, it backfires.
Social media algorithms optimize for engagement, and what consistently grabs attention? Content that triggers fear, outrage, moral disgust, or anxiety. You’re more likely to click on “You’re Being Lied To About Climate Change” than “Climate Progress Made This Year.” More likely to comment on a post that angers you than one that quietly informs.
So what does the algorithm do? It learns, in effect:
“The more threatened or riled up people feel, the longer they stay. Show them more of that.”
As Oliver Burkeman might argue, this isn’t just about media — it’s about how we attempt to control our emotional lives. We scroll in part because we’re anxious and looking for relief, but the content we’re shown amplifies that anxiety — creating a loop that keeps us trapped in distraction and avoidance.
And distraction, for many of us, is a way to avoid confronting life’s biggest truths:
- that we have limited time
- that we can’t control everything
- that comfort and certainty are illusions
This is the core of Burkeman’s philosophy — facing finitude directly, rather than escaping it through numbing media loops.
Indeed, we could spend some time questioning the way algorithms are structured, which raises some hard questions:

- Who is accountable? Platforms say the algorithm is neutral — it just “gives people what they want.” But if what people “want” is shaped by unconscious biases, trauma, or manipulation, is it ethical to feed that?
- Do platforms have a duty to safeguard attention? Just as we protect the environment from pollution, some argue we should treat attention and mental health as shared resources. If platforms pollute the cognitive environment with fear and division, should they be held to account?
- Is neutrality a myth? Choosing not to intervene — to let “the algorithm decide” — is itself a choice. One that favors profit over wellbeing.
Platforms often justify their inaction by claiming to be impartial. But as Burkeman would point out, there’s no such thing as neutral infrastructure. Every algorithm reflects a set of values — usually unspoken, often unexamined — about what’s worth seeing, engaging with, or feeling.
In the end, the problem isn’t just the platforms. It’s that we live in a culture deeply uncomfortable with discomfort — and tech companies monetize that avoidance. The real work is to develop a more intentional relationship with our attention, even if it means confronting what we’d rather ignore: our limits, our mortality, our lack of control.
The answer isn’t to make algorithms “nicer.” It’s to build practices, ethics, and cultures that make us more resistant to their worst tendencies — and more aligned with what really matters in the short time we have.
But one emotion is particularly useful to them: anger.
As we all know, anger is activating. Unlike fear or sadness, which can make us retreat, anger drives us to act. It makes us comment, share, quote-tweet, argue — and all those behaviors signal that “This content is working. Boost it.”
Posts that provoke outrage often outperform balanced or nuanced ones — not because they’re truer, but because they make us feel morally alive. Indignant. Righteous. Engaged.
And so the algorithm learns:
“More anger = more clicks.”
And we get trapped in a loop: the more angry content we see, the more we engage, the more of it we see.
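That loop can be sketched as a toy simulation. This is purely illustrative — not any platform’s real ranking code, and all names and numbers are made up — but it shows how a ranker that simply reinforces whatever gets engagement will keep surfacing the angriest post:

```python
# Toy sketch of an engagement feedback loop (illustrative only).
posts = [
    {"id": "outrage", "anger": 0.9, "score": 1.0},
    {"id": "nuance",  "anger": 0.1, "score": 1.0},
]

def engagement(post):
    # Stand-in for negativity bias: angrier content draws more clicks.
    return 1 + 4 * post["anger"]

for step in range(10):
    # "Show" the top-ranked post...
    shown = max(posts, key=lambda p: p["score"])
    # ...and reinforce whatever was engaged with.
    shown["score"] += engagement(shown)

scores = {p["id"]: p["score"] for p in posts}
print(scores)  # the outrage post's score runs away from the nuanced one
```

Nothing here is malicious: the ranker never “decides” to promote anger. It just rewards engagement, and engagement rewards anger.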
Yes, we may be scrolling to escape discomfort — boredom, anxiety, a vague sense of lack. But when we’re met with anger-inducing content, we don’t feel better. We feel stimulated — and stimulation feels like purpose, even when it’s corrosive.
It’s not just about distraction anymore. It’s about addiction to outrage.
So, Even Online, Anger Management Skills Can Really Help
The good news is: we’re not powerless. The tools you learn in an anger management course are the same tools that can disrupt the algorithm’s grip.
Here’s how:
1. Pause Before Reacting
Remember, algorithms feed off your instant reactions.
A 2014 study analyzed how emotions spread on Sina Weibo, a Twitter-like platform, and found:
- “Anger increases the likelihood of retweeting by twice as much as joy.”
- Content that provokes anger is significantly more likely to be shared than content that evokes joy, sadness, or other emotions.
(Source: Fan et al., PLOS ONE, 2014 — “Anger is more influential than joy: Sentiment correlation in Weibo.”)
Facebook internal research, leaked in 2021 (Source: Facebook Papers, whistleblower disclosures)
- Posts that triggered “angry” reactions were more likely to be amplified by the News Feed algorithm.
- The “angry” emoji reaction was weighted five times as heavily as a “like” in the ranking formula.
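A hedged sketch of what that weighting means in practice. This is not Facebook’s actual code — the weights (angry = 5, like = 1) are simply the ones the leaked documents described, and the scoring function is a made-up minimal stand-in:

```python
# Toy reaction-weighted ranking (illustrative; weights from the
# Facebook Papers reporting, everything else invented for this sketch).
WEIGHTS = {"like": 1, "angry": 5}

def rank_score(reactions):
    # Sum each reaction count multiplied by its weight.
    return sum(WEIGHTS.get(kind, 1) * count for kind, count in reactions.items())

calm_post  = {"like": 100}
angry_post = {"like": 20, "angry": 30}

print(rank_score(calm_post))   # 100
print(rank_score(angry_post))  # 170
```

Note the effect: the angry post has half as many total reactions, yet outranks the calm one — which is exactly why provoking anger pays.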
MIT study on fake news (2018)
- False news stories (which often provoke anger or disgust) spread 6x faster than true stories.
- Emotional novelty — especially outrage — was a major factor in virality. (Source: Vosoughi, Roy, & Aral, Science, 2018)
Practicing the “anger pause” — waiting even 5–10 seconds before replying, sharing, or commenting — literally interrupts the feedback loop.
2. Label the Feeling
A trick for slowing that immediate reaction down is to name what’s happening in your mind:
“I’m feeling angry. This post is trying to provoke me.”
This creates psychological distance and returns control to you. You then get to choose how you wish to “use” your attention. We aren’t called “users” for no reason.
3. Ask: What’s the Goal Here?
Examine what you are reading. Is the post trying to inform you — or inflame you?
Once you spot manipulation, you’re less likely to reward it with attention.
4. Redirect Attention Intentionally
Algorithms reward reactivity. You can reward depth. Choose to follow people who inform without inciting, and spend time on content that builds, rather than burns.
5. Unfollow or Mute Strategically
Protecting your mental space isn’t passive — it’s a form of ethical attention. You don’t owe your outrage to every post that wants it. Just like you don’t have to attend every argument.
It’s easy to say “the system is broken.” But in reality, the system is working exactly as designed. It’s designed to provoke, to polarize, to feed off the emotions that keep us scrolling.
That means the deeper solution isn’t just technical — it’s emotional.
To reclaim our time and attention, we don’t just need better tech.
We need better tools for managing how we feel, and what we feed.
In the end, resisting outrage isn’t apathy.
It’s agency.