The Social Media Echo Chamber

When the Rabbit Hole Becomes an Information Silo

BILL ASTORE

APR 16, 2026

If you’ve spent time on social media, you know it can be quite unsocial. 

Profane, angry, hostile posts and comments can be dismissed for what they are. But what about more subtle threats to the free and civil exchange of ideas? Social media sites aren’t necessarily designed to encourage such exchanges. They’re not primarily designed to educate us, to challenge us to think critically, or to promote tolerance and an open mind. 

Instead, they are designed primarily to capture and command our attention, to keep us “on the app,” reading and clicking and doom-scrolling in an addictive way. Sites keep track of what we read, what we share, even what we pause over, and feed us more of the same. The result is an information silo controlled by algorithms that serve up more of what you like, or more of what angers you, titillates you, or otherwise occupies your attention and time.

It’s easy to end up in an echo chamber that confirms your biases, one that reinforces the idea that people who think differently from you must be willfully misguided or stubborn or maybe just plain stupid or even evil. If you already dislike or distrust “the other side,” social media will tend to make you dislike or distrust them even more.

We’re warned about going down the rabbit hole, but we’re not warned about the information silos being created for us without our knowledge or consent.

All this has been on my mind after I watched this short and stimulating TEDx talk by journalist Ryan Biller.

As Biller notes, social media can impoverish human interactions. It can serve as a hostile wall instead of a transparent window or an open door. I wrote to Biller to thank him for his talk and to share my perspective on echo chambers, siloed information, and the like, and he wrote back that social media can create “a merciless cycle and feedback loop that has a psychological ‘funhouse mirror’ effect; in other words, it exaggerates and distorts reality in, I think, a really negative way.”

We need to recognize how social media apps, sites, and algorithms manipulate us; how they’re designed to keep us clicking, scrolling, and otherwise (over)stimulated. And how these interactions are, in a way, dehumanizing. Or, if not dehumanizing, how they bring out the keyboard commando in some people.

With respect to echo chambers, what I do to combat them is read a range of sources daily. I get daily updates from mainstream media sites like the New York Times, the Washington Post, and the Boston Globe. I check sites like the British Guardian, NBC News, and BBC News. I occasionally turn to Fox News to see how certain events are being covered.

And then there’s a range of alternative sites and podcasts that I’ve found useful, such as TomDispatch.com, Judging Freedom, Antiwar.com, Chris Hedges, Glenn Greenwald, and Caitlin Johnstone. I listen to (among others) Jimmy Dore, Tucker Carlson, Joe Rogan, Briahna Joy Gray, and Max Blumenthal at The Grayzone. Of course, I don’t listen to all of these all the time, nor do I listen to them because I always agree with them.

In having this site, Bracing Views, I contribute to this complex informational ecology, putting my own spin on things and exhibiting my own biases. I deeply appreciate my readers and commenters, who have largely avoided the often unsocial nature of social media.

When I need a strong dose of humor and reality, I return to George Carlin. I am reminded that telling one’s truth in a provocative and humorous (and even profane) way can have great value.

Finally, remember that sometimes the best social interaction is sitting down and breaking bread with the people around you, even the people you disagree with. For I continue to believe that we can agree to disagree, that we can disagree in ways that aren’t disagreeable, that sometimes disagreement can become agreement, and that common ground can be found.

Addendum: I shared the comment below in response to a reader who noted that manipulation is nothing new:

Absolutely. As I.F. Stone said, all governments lie. Propaganda is everywhere. But social media is more insidious because there’s an illusion of control. People think they’re the ones doing the picking and the clicking.

You’re often “swimming in the shallows” online, and those shallows are more like a puddle whose boundaries are set by algorithms.

It’s fascinating to think of the ocean of information that’s out there even as some people are figuratively drowning in puddles partly of their own making.

Zuckerberg Tells a Truth

W.J. Astore

Ready for Your AI “Friends”?

I caught this snippet from Mark Zuckerberg, guru of Facebook:

There’s this stat that I always think is crazy. The average American, I think has, I think it’s fewer than three friends, three people that they’d consider friends and the average person has demand for meaningfully more. I think it’s like 15 friends or something.

If you’re familiar with Facebook, every personal contact you make on there is categorized as a “friend.” When you want to add someone to your Facebook page, you “friend” them. Alternatively, when you want to get rid of someone, you “unfriend” them.

Now, the typical Facebook user has roughly 200-300 “friends.” What Zuckerberg is unintentionally revealing in that snippet above is that Facebook “friends” aren’t real friends. They’re mostly acquaintances. People we’ve met once or twice, maybe even people we’ve never met. They’re not close friends, intimate friends, “real” friends. 

So why call them “friends,” Facebook? For obvious reasons. Just about anyone would like more friends, and indeed I know people with over 2000 “friends” on Facebook. But, again, how many close or intimate friends can you really have?

That’s where Zuckerberg comes in, yet again, riding to the rescue with AI “friends.” Yes, he’s suggesting that the solution to loneliness in America, our lack of intimacy, is AI programs that will be your “friend,” a little bit like the movie “Her” with Joaquin Phoenix and Scarlett Johansson.

So, I suppose you’ll soon be able to buy AI “friends” from Mark Zuckerberg or someone like him. Or perhaps they’ll be offered for “free,” as Facebook is, with your most intimate data being sold to the highest bidder.

I really don’t want AI “friends.” I have a few real friends, people I’ve known for decades, people I do feel close to, and I’m lucky to have them. Two quick lessons come to mind. First, of course, friends aren’t perfect. They can be annoying, frustrating, maddening. (Guess what? I can be too.) Part of being a friend and keeping one is tolerance, acceptance, patience. The second lesson: To have a friend you have to be a friend. If you want people to be there when you need them, it’s a good idea to be there when they need you.

Sorry, Zuckerberg: I don’t think AI “friends” are the answer here. But thanks for debunking the whole idea of “friends” on Facebook.