Escaping the Social Media Echo Chamber
Does it seem like you’re only hearing and seeing what Facebook wants you to? You are. But here’s your breakout plan.
Human beings crave acceptance, approval and validation. How many times have you timidly ventured an opinion that a particular restaurant is crap, had someone agree enthusiastically, and then become firmer in your opinion, convinced that you have something there?
This is Facebook in a nutshell – a place where you’re constantly seeking out, or only seeing, people who agree that the restaurant is crap. Which means that you’re only seeing one point of view — which could very well be wrong or misinformed, but you’re convinced it’s right.
There’s nothing to tell you otherwise, after all.
Get the blueprints
(That is, understand what causes the social media echo chamber)
A study published in the Proceedings of the National Academy of Sciences examined Facebook data from 2010 to 2014, categorising posts as science news, conspiracy rumours or trolling. The researchers found that, despite the world-wideness of the web, users effectively “tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization.” The study also found that the quality of information suffered, and narratives supplemented by unverified rumours ran amok.
This phenomenon has been dubbed the social media echo chamber, not unlike JK Rowling’s Chamber of Secrets, complete with hissing basilisks. The echo chamber effect occurs when a network of like-minded people share theories, views and news according to their biases and selections – then, when the same information is repeated back to them, accept it as confirmation of fact.
Much has been made of how Facebook’s systems facilitate this, unbeknownst to its users. Your newsfeed is driven by an algorithm that tracks and analyses your activity on the site, then runs through a series of rules and steps to figure out what you like – and show you more of it.
The algorithm takes thousands of factors into account. A few of them: what you click on, what you react to (all those like, love, cry and laugh emojis), what your friends share (and you react to), the pages you like, and the news stories Facebook is paid to push. A post with a high number of likes, shares or comments is more likely to appear, and stay, on your feed.
How often you interact with a friend or page also determines how frequently you see their posts. The algorithm considers a post’s content, too: the more often you’ve interacted with a particular type of post in the past, the more frequently that type will appear – if you like puppy or kitten posts, you will see more and more of them. This means that all of us – every user of social media (Twitter and Google use similar algorithms) – are trapped in a social media echo chamber of some kind.
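The reinforcement loop described above can be sketched as a toy scoring function. To be clear, the weights, signals and names below are invented for illustration, not Facebook’s actual code; the point is only to show how ranking by your own past engagement feeds the echo chamber.

```python
from dataclasses import dataclass

@dataclass
class Post:
    name: str
    likes: int
    shares: int
    comments: int
    author_affinity: float  # how often you interact with this friend/page, 0..1
    topic_match: float      # how closely it matches topics you've engaged with, 0..1

def score(post: Post) -> float:
    # Raw engagement, with comments and shares weighted above likes
    # (the exact weights are made up for this sketch).
    engagement = post.likes + 2 * post.comments + 3 * post.shares
    # Amplify by your history with the author and the topic: the more
    # you've engaged before, the higher the post ranks now.
    return engagement * (0.5 + post.author_affinity) * (0.5 + post.topic_match)

# Two posts with identical engagement; only your own history differs.
familiar = Post("kitten video from a close friend", 100, 5, 10, 0.9, 0.9)
unfamiliar = Post("opposing-view article", 100, 5, 10, 0.1, 0.1)

feed = sorted([unfamiliar, familiar], key=score, reverse=True)
print(feed[0].name)  # the familiar post wins, every time
```

Note what the sketch never does: it never boosts a post *because* you haven’t seen its like before. That omission, more than any single weight, is what keeps the chamber sealed.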
To protect its profitability and respond to criticism, Facebook does change its algorithm – almost constantly. But these changes are always reactive, made after the ‘damage’ has been done. After coming under fire for the Facebook echo chamber’s contribution to the pooh-poohing of climate change and the fake-news debacle of the 2016 US elections, Facebook announced new measures such as flagging pages known for sharing inaccurate posts or clickbait headlines.
Facebook is also outsourcing fact-checking to organisations and firms like the Associated Press, Snopes and Germany-based Correctiv to combat echo chambers that spread fake news on social media – like the one that widely promulgated the image of Syrian refugee Anas Modamani, posing for a selfie with German Chancellor Angela Merkel, with the false claim that he is a terrorist.
And after the recent Women’s March in the US, Facebook changed its Trending Topics algorithm: Trending Topics are no longer influenced by what you like, but by your location – an approach closer to that of traditional news media.
And finally, Facebook recently announced that longer videos (90 seconds or more) would receive higher rankings in newsfeeds, the assumption being that these contain more information and context — though one cynically wonders how this will reveal itself as a boost to Facebook Live somewhere down the line. And given the company’s recent testing of ‘mid-roll’ ads – ads that can appear once a video has played for 20 seconds – it’s a change likely to yield big revenue.
Facebook exists to learn what you like, show you more of it, and recommend things you may also like – and to sell that knowledge of your likes to advertisers, for a lot of money. It has no obligation to show content a user doesn’t engage with – and why would it, when our clicks are currency?
The social media echo chamber is easy to condemn. But, if you’re using Facebook as your primary source of information – and it’s being irresponsible – then aren’t you being irresponsible, too?
Now, get the f@$^ out
(That is, set limits on Facebook’s emotional manipulation)
Breathe in, breathe out. Have a cup of tea — but pack your shiv. Also:
- Adjust the settings under News Feed Preferences to control which posts you see when you log in. Choose well-rounded, wide-ranging sources for a mix of opinions, and follow pages opposed to your own views (e.g. conservative versus liberal).
- While you’re there, switch your News Feed from Top Stories to Most Recent. You’ll see the latest news instead of the posts with the most interaction.
- And mosey on over to the Ad Preferences section to see which of your interests your ads are based on – then adjust those interests yourself, rather than let the Facebook algorithm decide for you.
- Unfollow pages you’re no longer interested in, or that post political messages unrelated to their stated topic.
- Read articles before liking or sharing them. Verify suspicious items on Snopes or other myth-debunking sites, or Google them and check against reputable news outlets.
- And perhaps most importantly: Go to the source. Facebook might be convenient, but the best way to break out of the social media echo chamber is to go straight to trusted news sources. Make a point to visit these websites every morning or evening – before you spend time on Facebook.
The promise and potential of the internet was an info-democracy, where everyone had access to objective news, knowledge and information, encouraging literacy, social discourse and progress. Dare we use the cliché, global village. It can still be that – if we all take responsibility for making it happen, starting with our own newsfeed.