The Problems of the Social Media Echo Chamber

Where do you get your news and information about the world? By now, the ideal dream of consuming a wide array of sources to understand a diverse range of opinions and cultures should have been fully realized.

So why does it feel like we are soaking up an ever-narrower range of information? Why does it feel like every news story out there, and every response to it, perfectly matches our own prejudices?

Blame the internet. More specifically, blame social media platforms and the echo chambers that we have built inside them.

In the mid-1990s, the promise of the early internet was to connect divergent communities from across the world and to surface the kinds of events and opinions that, though we might find them challenging, would yield greater global understanding.

At the time, The Hitchhiker’s Guide to the Galaxy author Douglas Adams wrote of his hope that this “fourth wall” of separation would come crashing down as we shared in online communities, forcing us mere “villagers” out to mingle in the whole wide world.

The bitter irony is that the opposite has transpired. On Facebook, Twitter and other networks, by connecting with our existing friends and following our preferred information sources, we have only replicated our old villages. We have recreated our online communities in the image of our old worlds. And by reconstituting the same network of acquaintances, rather than reaching out to the unknown, we have merely transposed our own social class, views and opinions.

Case in point: I spent the majority of the U.K.’s European Union referendum campaign away from my homeland, so my primary experience of the debate was mediated by my friends and the content they shared on Facebook. Such was the outpouring of support for Remain that I expected the outcome to be a walkover. So I was flabbergasted, when I returned on results day, to find that the U.K. had voted the opposite way.

I had become just the latest victim of the Facebook filter bubble. We surround ourselves with like-minded friends, they post the news content they feel validates their opinions and, insidiously, this becomes the lens through which we each view the world.

It didn’t use to be this way. Once upon a time, we flipped through the pages of a newspaper or watched the nightly newscast specifically to discover what we didn’t yet know about the world. But the past couple of years have seen a destructive inversion of this cycle, leading us to learn only what we choose to learn and, more often than not, only what justifies our preconceptions.

Thanks to this social echo chamber, we are approaching a dangerous precipice. How can we learn what is new in politics or culture when we read only the news reports that align with our existing views, or those shared by friends whose worldview matches our own?

Technology has created this problem. Now it is technology’s duty to help.

Of course, social bubbles can be a good thing. When I endured a family tragedy a couple of years back, for instance, it was much easier to post news and arrangements to my Facebook family network than to buy ads in the local newspaper.

The problem with technology today, however, is that algorithms amplify the bubble: tracking systems identify where we go, analyze what we like and simply serve us back more of the same. We have become unaccustomed to venturing forth in search of truly new or left-field experiences.
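To make that feedback loop concrete, here is a minimal sketch of how an engagement-driven recommender keeps serving more of the same. The topic tags, function names and data are invented for illustration; nothing here describes any real platform's code.

```python
from collections import Counter

# Hypothetical illustration only: a click-driven recommender that narrows the feed.
# The topic-tagging scheme and data are invented; no real platform's code is shown.

def recommend(click_history, candidate_articles, n=5):
    """Rank candidates by how often the reader has already clicked that topic."""
    topic_counts = Counter(article["topic"] for article in click_history)
    ranked = sorted(
        candidate_articles,
        key=lambda a: topic_counts[a["topic"]],  # more of the same floats to the top
        reverse=True,
    )
    return ranked[:n]

clicks = [{"topic": "remain"}, {"topic": "remain"}, {"topic": "football"}]
candidates = [
    {"title": "Why Remain will win", "topic": "remain"},
    {"title": "The case for Leave", "topic": "leave"},
    {"title": "Transfer window gossip", "topic": "football"},
]
print([a["title"] for a in recommend(clicks, candidates, n=2)])
# ['Why Remain will win', 'Transfer window gossip'] -- the Leave piece never surfaces
```

Run the loop for long enough and the bubble reinforces itself: whatever you already read crowds out everything you haven't.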

When you think about it, this is counterintuitive. After all, especially in these days of feeds and streams, the brain is wired to demand constant info-stimulus.

But in a world where we have retrenched to our villages, we need a techno town crier, someone who can challenge us with information that shocks us out of our predictable feedback loop of self-validation.

We nearly had it. StumbleUpon, founded in 2001 in Calgary, was an early content recommendation engine that tried hard to inject serendipity and exploration into the online experience. I used it voraciously, though it has now waned somewhat, leaving in its wake social sites and news discovery algorithms whose operators are more interested in keeping you within your own, you-sized lane.

That’s why, as the cure to the filter bubble problem, I am proposing a reverse recommendation algorithm.

Imagine if, after reading consecutive posts about Hillary Clinton, your network purposely served you articles about opponent Donald Trump instead. Wouldn’t you get a rounder view of the whole presidential campaign?

This would be the content equivalent of Amazon serving you, at the bottom of a product page for running sneakers, an inverse recommendation for a walking stick.
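To sketch how a reverse recommendation might work in code, here is a minimal example that ranks candidate articles by how rarely the reader has seen their topic. The tagging scheme and names are assumptions made for illustration, not a description of any existing product.

```python
from collections import Counter

# Hypothetical sketch of a "reverse recommendation" ranker: instead of surfacing
# more of what the reader just consumed, it surfaces the topics they have seen least.

def reverse_recommend(reading_history, candidate_articles, n=5):
    """Rank candidates so the least-read topics come first."""
    seen = Counter(article["topic"] for article in reading_history)
    # ascending sort: the unfamiliar rises to the top
    return sorted(candidate_articles, key=lambda a: seen[a["topic"]])[:n]

history = [{"topic": "clinton"}, {"topic": "clinton"}, {"topic": "clinton"}]
candidates = [
    {"title": "Clinton rallies in Ohio", "topic": "clinton"},
    {"title": "Inside the Trump campaign", "topic": "trump"},
]
print([a["title"] for a in reverse_recommend(history, candidates, n=1)])
# ['Inside the Trump campaign'] -- the feed deliberately pulls the reader the other way
```

In practice such a system would need safeguards against surfacing material simply because it is unfamiliar, but inverting the ranking key is the essence of the idea.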

The world badly needs to develop ways of taking its inhabitants outside of their comfort zones, to make good on the promise of the internet to foster not enhanced tribal differences, but greater global understanding.

We need to train technology toward helping us challenge, not reinforce, our own perceptions of the world daily. That could involve awarding points for exposure to contrary or alternative viewpoints, for example. After all, everyone likes discovering something new, something different; the tastemakers who introduce new material to a network are often the ones who walk away with the most social currency.
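To picture the points idea, here is an invented sketch of a diversity score that rewards a reader for opening something outside their usual topics. The scoring rule and the threshold are assumptions made for this example, not a worked-out product spec.

```python
from collections import Counter

# Invented illustration of awarding points for reading outside one's usual topics.
# The scoring rule and the 20% threshold are assumptions made for this sketch.

def diversity_points(reading_history, new_article, bonus=10):
    """Award a bonus when the new article's topic is rare in the reader's history."""
    seen = Counter(article["topic"] for article in reading_history)
    total = sum(seen.values()) or 1
    share = seen[new_article["topic"]] / total  # how much of the past diet this topic fills
    return bonus if share < 0.2 else 0

history = [{"topic": "remain"}] * 9 + [{"topic": "leave"}]
print(diversity_points(history, {"topic": "remain"}))  # 0  -- more of the same earns nothing
print(diversity_points(history, {"topic": "leave"}))   # 10 -- a contrary viewpoint earns points
```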

The future was in sight once. Now we badly need to burst the bubble, knock down the walls and link arms across the info waves.

Vincent Gibson is the founder of social video aggregator Centric App.




{"taxonomy":"","sortby":"","label":"","shouldShow":""}