
Perfectly Curated FYP/Feed? That Might Be the Problem.

Updated: Mar 2

In this explainer, Bryan explores how social media algorithms shape echo chambers: feeds that feel personal but quietly narrow our perspectives. From viral challenges to youth radicalisation, he examines the risks and shows how small, deliberate choices can help us break free.


Ever had one of those moments where your For You Page seems to read your mind?


Almost as if TikTok, Instagram Reels, or YouTube Shorts just get you — from that oddly specific meme to the niche hot take you didn’t even know you agreed with?


That’s not intuition. It’s design.


Algorithms are engineered to keep you scrolling, and the easiest way to do that is to show you more of what you already like. Over time, this creates a “perfectly curated” feed that feels personal but quietly shuts out anything unfamiliar. It’s comfortable, but closed. That’s how echo chambers form.

So what’s happening behind our feeds? And how do they erode our ability to think critically, disagree respectfully, and even spot danger when it scrolls right past us?


The Hidden Risks Behind the Scroll

The risks are real. Since 2015, Singapore’s Internal Security Department (ISD) has identified 13 Singaporean youths aged 20 or younger who have been self-radicalised online. It didn’t start with hate. It began with a video, a vibe, or a voice that felt right. Gradually, their entire online world began to echo that one narrative, until it became their truth.


And radicalisation is only one extreme outcome. Echo chambers also speed up the spread of misinformation, reward outrage over accuracy, and fuel viral but dangerous trends. When your digital world constantly affirms you, it can quietly distort you.


Is my feed showing me the world or just a reflection of me?

Technology was once seen as the great connector. In the late 20th century, the advent of email, mobile phones, and the early internet was imagined as a means to transcend physical borders and bring people closer together. The dream was a global dialogue: open, accessible, and enriching. You could learn from anyone, anywhere, and understand a world far bigger than your own.


Today, we are more digitally connected than ever before, yet paradoxically, more ideologically siloed. Instead of expanding perspectives, social media often narrows them. Rather than engaging with a marketplace of ideas, our feeds increasingly serve content that reinforces what we already believe. The original promise of connection has been replaced with a design focused on retention and engagement.


This creates what former U.S. President Barack Obama calls a digital bubble:

"If you are getting all your information off algorithms being sent through your phone and it’s just reinforcing whatever biases you have... at a certain point, you just live in a bubble"

These bubbles don’t just happen; they feel good. And that’s exactly the problem. As the philosopher Nietzsche argued, people rarely seek truth for its own sake; they chase comfort, security, and control. We are more likely to believe something not because it’s true, but because it feels good to believe.


In this environment, where attention spans are shorter and speed often trumps substance, people are more likely to engage with content that confirms their worldview. Over time, the digital spaces we inhabit begin to resemble echo chambers, self-reinforcing environments where disagreement is filtered out and complexity is flattened. What began as a tool for connection has evolved into a system of isolation, personalisation, and ideological sorting.


How does an echo chamber form? 

The term “echo chamber” isn’t new, but its roots run deep in the design of the internet we use every day.


In 2011, internet activist Eli Pariser coined the term “filter bubble”: the idea that algorithms personalise what we see online, selectively exposing us to content that aligns with our existing beliefs while filtering out views that might challenge them. Instead of a shared digital commons, we get digital tunnel vision: curated feeds that make us feel informed but often show us only one side.


But how exactly does someone fall into an echo chamber on their feed? It doesn’t happen all at once. It’s a subtle, invisible, addictive cycle.


The Lifecycle of an Echo Chamber 

Photo generated with Gemini

Here's how the echo chamber cycle typically unfolds:

  1. Exposure

It begins with what you see.


Social media algorithms track everything from what we like and comment on to how long we pause on a post. Based on this data, they begin curating our feed, showing us content that aligns with our views and preferences. Your online network often reflects your real-life circles, too; we mostly see posts from people who think like us.


But it’s not just about agreement; it’s about engagement. Algorithms are built to maximise time-on-platform, not truth, so they boost content that’s emotional, moralised, or divisive. Posts that stoke outrage or fear go viral, while calm or complex ideas fade out. Platforms like TikTok can tailor your entire feed within the first 200 videos you watch.
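To make the loop concrete, here is a minimal sketch of engagement-driven ranking in Python. Everything in it is invented for illustration (the topics, the “outrage” scores, the 0.7/0.3 weighting); no platform publishes its real model. What it does show is the feedback loop described above: engagement builds familiarity, and familiarity boosts ranking.

```python
# A toy engagement-driven feed ranker. All topics, scores, and weights here
# are invented for illustration; this is not any platform's actual model.
from collections import Counter

POSTS = [
    {"id": 1, "topic": "fitness", "outrage": 0.1},
    {"id": 2, "topic": "politics", "outrage": 0.9},
    {"id": 3, "topic": "fitness", "outrage": 0.7},
    {"id": 4, "topic": "cooking", "outrage": 0.2},
    {"id": 5, "topic": "politics", "outrage": 0.8},
]

def rank_feed(posts, history):
    """Score each post by (a) similarity to what the user engaged with
    before and (b) emotional charge, then sort highest first."""
    topic_counts = Counter(p["topic"] for p in history)
    total = max(len(history), 1)

    def score(post):
        familiarity = topic_counts[post["topic"]] / total  # more of what you liked
        return 0.7 * familiarity + 0.3 * post["outrage"]   # plus a boost for heat

    return sorted(posts, key=score, reverse=True)

# Simulate a user who always engages with whatever is ranked first.
history = []
for _ in range(10):
    top_post = rank_feed(POSTS, history)[0]  # the user taps the top post...
    history.append(top_post)                 # ...which trains the next ranking

print(Counter(p["topic"] for p in history))  # Counter({'politics': 10})
```

Run it and the simulated feed collapses to a single topic within a couple of rounds. Nothing malicious happened; the loop simply rewarded itself.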


  2. Repetition

See it once, it’s new. See it twice, and it starts to feel true.


This is the Illusory Truth Effect: repeated claims become more believable over time, even when they are false. Why? Repeated information is easier for the brain to process, and it mistakes that ease for credibility.


Even one repeat is enough to raise your sense of truth. That’s why misleading headlines, memes, or TikToks feel more convincing the second time you see them, and nearly unshakeable by the fifth.
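The effect can be sketched as a toy model. The update rule and every number below are invented, not taken from the psychology literature; the point is only the shape of the curve, where each repeat nudges belief upward and the earliest repeats do the most work.

```python
# A toy model of the Illusory Truth Effect. The baseline, gain, and update
# rule are invented for illustration, not fitted to any real study.
def perceived_truth(repetitions, baseline=0.40, gain=0.15):
    """Each exposure closes part of the gap between current belief and
    full certainty, so early repeats matter most and later ones saturate."""
    belief = baseline
    for _ in range(repetitions):
        belief += gain * (1.0 - belief)
    return belief

for n in [0, 1, 2, 5]:
    print(f"seen {n}x -> feels {perceived_truth(n):.0%} true")
# seen 0x -> feels 40% true
# seen 1x -> feels 49% true
# seen 2x -> feels 57% true
# seen 5x -> feels 73% true
```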


  3. Internalisation

At this point, it’s not just content; it’s a belief.


Repeated exposure solidifies ideas. Counter-arguments start to feel wrong, or even threatening. In countries where social media is the main source of information and regulation is weak, this can lead to the unchecked spread of polarising or extremist content. For youth, this stage is especially risky: the teen brain is wired for reward and novelty, but its “braking system”, impulse control, is still developing. Algorithms that promote thrill-seeking and peer validation can trap young users in echo chambers of risk-taking.


Just look at the Tide Pod Challenge: what started as a joke became a dangerous viral trend as teenagers around the world filmed themselves doing stunts for likes. According to the Mayo Clinic, these challenges mimic peer pressure, turning engagement-driven content into a digital dare.


  4. Amplification

Now you're not just a viewer, you’re a contributor.


You start sharing, commenting, or creating content of your own that reflects the views you’ve internalised. You join online groups or communities that reinforce the same message. The echo chamber gets louder, more confident, and less tolerant of alternative views.


  5. Breaking Out

Escape is possible, but it takes work.


It often begins with discomfort. Maybe you notice the tone of your feed shifting to extremes. Maybe someone you trust challenges your view. Or perhaps you simply pause and realise: you haven’t seen a different perspective in weeks.


Breaking out doesn’t mean unplugging completely; it means using the platform on your terms. Start by asking:

  • “Why am I seeing this?” Many platforms let you check why a certain post or ad is appearing. Use this to understand how the algorithm is shaping your feed.

  • “Do not show this again.” Don’t be afraid to mute, unfollow, or click “not interested” on content that feels manipulative or one-sided.

  • Follow beyond your comfort zone. Add creators, news sources, or voices you wouldn’t normally hear from—even if you don’t always agree with them.


These small actions disrupt the algorithm’s assumptions.
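Sticking with the toy ranker from the Exposure step, here is a hedged sketch of how a “not interested” click might be folded into ranking as a penalty. The penalty weight is invented, and platforms do not publish how they actually handle explicit feedback; the sketch only illustrates the idea those buttons rely on, that one strong negative signal can outweigh a long history of accumulated familiarity.

```python
# The toy ranker again, now with a "not interested" penalty. The topics,
# weights, and penalty size are all invented for illustration.
from collections import Counter

POSTS = [
    {"id": 1, "topic": "fitness", "outrage": 0.1},
    {"id": 2, "topic": "politics", "outrage": 0.9},
    {"id": 3, "topic": "cooking", "outrage": 0.2},
]

def rank_feed(posts, history, muted=frozenset()):
    topic_counts = Counter(p["topic"] for p in history)
    total = max(len(history), 1)

    def score(post):
        familiarity = topic_counts[post["topic"]] / total
        penalty = 1.0 if post["topic"] in muted else 0.0  # "not interested"
        return 0.7 * familiarity + 0.3 * post["outrage"] - penalty

    return sorted(posts, key=score, reverse=True)

history = [{"topic": "politics"}] * 8  # a feed that has already narrowed
print(rank_feed(POSTS, history)[0]["topic"])                      # politics
print(rank_feed(POSTS, history, muted={"politics"})[0]["topic"])  # cooking
```

One click does not retrain anything overnight, but in this sketch, as on real platforms, it immediately changes what surfaces next, and that is where breaking out begins.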


So what now? 

Echo chambers don’t always look dangerous. At first, they feel comfortable, like your feed “gets” you. But that comfort comes at a cost.


When we’re only exposed to what we already believe, we lose the chance to grow. We risk becoming more certain, but less informed. More reactive, but less reflective. In the worst cases, echo chambers don’t just shape opinions, they shape behaviour, pushing people toward misinformation, polarisation, and even harmful stunts and ideologies.


And while social media platforms play a huge role in how these chambers are built, we still have agency. The choices we make about who we follow, what we share, and when we pause to question can either reinforce the bubble or start to break it.
