The Problems of the Social Media Echo Chamber

Where do you get your news and information about the world? By now, the ideal dream of consuming a wide array of sources to understand a diverse range of opinions and cultures should have been fully realized.
So why does it feel like we are soaking up an ever-narrower range of information? Why does it feel like every news story out there, and every response to it, perfectly matches our own prejudices?
Blame the internet. More specifically, blame social media and the echo chambers that we have built inside them.
In the mid-1990s, the promise of the early internet was to connect divergent communities across the world: to surface the kinds of events and opinions that, however challenging we might find them, would yield greater global understanding.
At the time, The Hitchhiker’s Guide to the Galaxy author Douglas Adams wrote of his hope that this “fourth wall” of separation would come crashing down as we shared in online communities, forcing us mere “villagers” out to mingle in the whole wide world.
Bitterly, what has transpired is the opposite. On Facebook, Twitter and other platforms, by connecting with our existing friends and following our preferred information sources, we have only replicated our old villages. We have recreated our online communities in the image of our old worlds. And by reconstituting our same network of acquaintances, rather than reaching out to the unknown, we have merely transposed our own social class, views and opinions.
Case in point: Having spent the majority of the U.K.'s European Union referendum campaign away from my homeland, my primary experience of the debate was mediated through my friends and the content they shared on Facebook. Such was the outpouring of support for Remain that I expected the outcome to be a walkover. So I was flabbergasted, on my return on results day, to find that the U.K. had voted the opposite way.
I had become just the latest victim of the big Facebook filter bubble. We surround ourselves with like-minded friends, they post the news content they feel validates their opinion and, insidiously, this becomes the lens through which we each view the world.
It didn't use to be this way. Once upon a time, we flipped through the pages of a newspaper or watched the nightly newscast specifically to discover what we didn't yet know about the world. But the past couple of years have seen a destructive inversion of this cycle: now we learn only what we choose to learn, and, more often than not, only what justifies our preconceptions.
Thanks to this social echo chamber, we are approaching a dangerous precipice. How can we learn what is new in politics or culture when we read only the news reports that align with our existing views, or those shared by friends whose worldview matches our own?
Technology has created this problem. Now it is technology’s duty to help.
Of course, social bubbles can be a good thing. When I endured a family tragedy a couple of years back, for instance, it was much easier to post news and arrangements to my Facebook family network than to buy ads in the local newspaper.
The problem with technology today, however, is that algorithms amplify the bubble: tracking systems identify where we go, analyze what we like and simply serve us back more of the same. We have become unaccustomed to venturing forth in search of truly new or left-field experiences.
When you think about it, this is counterintuitive. After all, especially in these days of feeds and streams, the brain is wired to demand constant info-stimulus.