A lot has been said about the murder of Charlie Kirk. It hits differently when something so extreme happens to a person who never advocated violence, who only spoke common sense. We have entered a new dark period in which political violence has turned from an abstract fear into an absolute, terrible reality.
As I scrolled through Substack later, trying to make sense of it all, I kept seeing a familiar pattern: people on the left writing words so far from reality that at first they angered me, and then I felt something I did not expect. Pity. They seem to live in a world disconnected from how the majority of people think, or feel, or breathe. Their posts are so full of delusion, anger, vitriol, and certainty in a reality that isn’t real. They weren’t questioning or asking why; no, they just doubled down, trapped in an ideology that I, and most people, cannot even comprehend.
I remember one author in particular. When we were both starting out, he was just a guy who wanted to lift up other writers, share tips and nurture community. I admired him. He gave me a lot of good advice. He was a friend. Then something changed. He became more vocal about his politics, he attacked those who questioned his views, and it was then that I cut ties. I’ve seen that happen before, too many times.
Recently I found his Substack by chance. I wondered how he was doing now. I wish I hadn’t clicked. What I saw was something else: entrenched ideology so extreme it looked like a mirror warped by rage. Delusions, conspiratorial leaps, breakdowns in basic empathy. Posts filled with name-calling, sweeping judgements, no nuance. And then there was the insanity of those he’d gathered to his side, all reinforcing the delusions and lies that Kirk’s killer was a Trump supporter, that he was right wing, that those on the right were evil and violent, that they were the victims.
It’s heartbreaking, but I also fear for him. For millions like him. Because when you see that change, when someone moves from curious kindness to rigid extremism, algorithms are often behind it. The feed learns what provokes, what generates engagement. It serves you more of whatever you’ve already shown interest in. Emotions escalate. Anger becomes a lens. Certainty becomes a fortress. And debate ends. From reading their posts it’s clear that they do not even know why they believe what they believe. The simplest challenge to their dogma and they sling insults. There is no place for conversation.
The assassination of Charlie Kirk is the worst kind of evidence of a system gone dangerously wrong. It’s not just that someone was killed (though that alone is enough). It’s that the patterns of radicalisation through online arguments, the dehumanisation of “the other side,” the virtual crowd feeding off rage, all look as though they helped pave the path. The algorithm has no moral culpability, but it turns every click, every reaction, into fuel. In short, it is a trap for the unaware.
I think about my friend’s journey. I wonder when exactly the shift happened. Was it gradual, one post at a time, or abrupt? How many times did the algorithm suggest a more extreme take? How many times did he see fellow authors confirming his growing outrage? And did he notice when empathy and balance faded from his feed until he no longer saw what he used to be?
How it all changed
When I joined my first social platform, everything arrived in chronological order. Friends’ posts scrolled past like a digital diary. It was messy, but at least you knew you weren’t missing anything. Then the platforms grew. Facebook’s News Feed appeared, and before long it was ranking posts. Twitter switched to an algorithmic timeline. Instagram followed suit. TikTok went further and built its entire experience around recommendations. The tech companies wanted revenue, and we, the users, were the source of it. They exploited that.
These changes didn’t look sinister at the time. They were sold as convenience and in fairness, there’s a lot to be said for not drowning in thousands of irrelevant updates. But there was a hidden trade-off. Engagement became everything. Whatever kept us scrolling, clicking or watching got rewarded. Outrage, shock, envy and fear are powerful motivators, so the system tilted towards content that evoked those feelings. We didn’t notice it happening day by day, but the mood of our feeds began to shift. One look at a post pushing an ideology, one click, and bam: the feed starts serving you more and more until you go down a rabbit hole you cannot escape unless you’re aware of what it’s trying to do.
Living inside an echo
Think about someone who clicks on a few posts about a political issue. The system takes note and serves them more of the same. They start following like-minded accounts. Recommendations narrow. Before long their feed looks like a monoculture: one worldview, reinforced again and again.
That doesn’t just confirm their opinions; it distorts reality. If every post you see agrees with you, it feels like everyone agrees with you. Likes, shares and “trending” badges act as proof. For someone already on the edge of a conspiracy theory or extremist belief, that sense of validation can be intoxicating. It’s easy to see how people can get pulled deeper, until they’re convinced anyone outside their bubble is ignorant or malicious.
The toll on our minds
All of this happens on platforms designed to be hard to put down. Infinite scroll, autoplay and unpredictable notifications keep us hooked. It’s no surprise that heavy social-media use is linked with disrupted sleep, anxiety and low mood, especially among teenagers. Yes, there are positive stories too — support groups, creative communities, long-distance friendships — but the constant churn of high-emotion content wears many of us down.
I’ve felt it myself: that jittery, restless feeling after too much time online, or the creeping sense of anger at strangers I’ll never meet. These are not side effects; they’re signals that something in the system isn’t healthy.
Could we just go back?
Sometimes I dream of a big switch that takes us back to the good old days of MySpace. In that world there’d be no secret ranking, no algorithm quietly pushing whatever gets the most engagement. Everyone would see posts in time order. It would be calmer, more transparent. And yes, it would probably slow the spread of polarising content.
But it wouldn’t magically fix everything. People would still choose who to follow, and if you follow a hundred angry accounts, your feed will still be angry. Chronological feeds also mean wading through noise and spam, which was one of the reasons ranking appeared in the first place. The truth is, the problem is as much about our behaviour and incentives as about the code itself.
We’re in a place now where ideological echo chambers aren’t just making us separate; they’re inflicting damage. They render people unable to see tragedy without saying “but,” unable to hear grief without pointing blame, unable to consider another person’s humanity if they breathe or believe slightly differently.
If we want to stop more stories like Charlie Kirk’s, more people drifting from kindness to anger, more democracy turning into two siloed mobs, we need change. And not just “platform policy” change, but personal, human change.
So, what can we do?
Reclaim control of our feeds. Switch off recommendations when possible. Use chronological feeds if we have the option. Unfollow accounts that only inflame. Follow accounts that challenge us kindly.
Reach out with compassion. As hard as this sounds, we can only try. Chances are they’ll throw it back in your face, but we have to try to reach them.
Hold platforms accountable. Demand transparency: why is this post showing up? What signals are being used? Push for systems that don’t reward rage, but reward insight, curiosity, dialogue.
Cultivate empathy. We all have reasons we believe what we believe. Trauma, fear, loss, loneliness. If someone is radicalised, they are often hurting or yearning for belonging, meaning. If we meet them with judgement, they retreat further. If we meet them with compassion, maybe there’s a crack in the fortress. (I have my doubts.)
The algorithm may be powerful. But we are powerful too. We are the ones who click, who comment, who share or don’t share. We build our own bubbles. We can decide what to listen to, what to reject. We can choose a path that bridges rather than divides.
Charlie Kirk’s death is a tragedy in so many dimensions. But if it reminds us of something, let it remind us that when ideology becomes a cage, empathy becomes the key. And when enough of us care enough to try unlocking the doors, change starts.
We must find a way to fix this — because if not us, then who? And if not now, then when?