Why Is Everyone So Upset With Facebook? A Primer.
A brief history of a social media giant in trouble.
You may have heard Facebook is having some trouble these days. If you can’t keep track of what seems like a John le Carré plot unfolding before your scrolling fingers, we put together a tidy little summary of where things currently stand and how they came to be.
Some background
Facebook is the world’s largest digital community. For years, Facebook enticed users to join the platform with the benefits of belonging to a huge global digital network: users can find one another, communicate, express ideas widely, and share information easily. Over time, Facebook collected troves of data about those users and their communications, and monetizing that data, chiefly through finely targeted advertising, ultimately became the backbone of its business model.
There are a few issues with that business model that form the foundation of the problems Facebook has today.
Because the News Feed became Facebook’s most popular feature, it eventually became the channel through which an enormous share of the world’s content (articles, videos, etc.) was seen; by some estimates, roughly 45% of American adults were getting news through Facebook. This put Facebook in the awkward position of insisting it was an agnostic platform that simply presented news when, in large part, it had become a publisher. As far back as 2015, media insiders began to question Facebook’s stated neutrality, and how long it could claim to have no editorial responsibility when it effectively controlled such a huge proportion of the world’s news.
At the same time, there was another growing inconsistency in Facebook’s ‘story’ around objectivity: the algorithms that determine what appears in a user’s News Feed. Tech insiders started pointing out that algorithms can never really be objective; they are written by humans to achieve a specific purpose, and in this instance Facebook’s purpose was to keep its users engaged and active on the platform. The algorithms were written specifically to surround people with emotions and ideas they already subscribe to (the so-called ‘echo chamber’ effect) so that they feel happy while they’re on Facebook and keep coming back for more. All of this concern about the purposeful manipulation of people’s behavior further undermined Facebook’s claims of objectivity.
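To make the echo-chamber idea concrete, here is a deliberately simplified sketch in Python. It is not Facebook’s actual ranking system, which is proprietary and far more complex; the function names and scoring rule are invented for illustration. It simply shows how ordering posts purely by predicted agreement with a user’s existing interests naturally pushes dissenting content out of view.

```python
# Toy sketch (not Facebook's real algorithm): rank a feed purely by how much
# each post overlaps with what the user has already shown they like.

def predicted_engagement(user_interests, post_topics):
    """Score a post by its overlap with the user's existing interests."""
    return len(user_interests & post_topics)

def rank_feed(user_interests, candidate_posts):
    """Order candidate posts so the most agreeable ones appear first."""
    return sorted(
        candidate_posts,
        key=lambda post: predicted_engagement(user_interests, post["topics"]),
        reverse=True,
    )

user_interests = {"dogs", "candidate_a", "hiking"}
candidate_posts = [
    {"headline": "Candidate A praised at rally", "topics": {"candidate_a", "politics"}},
    {"headline": "Candidate B releases policy plan", "topics": {"candidate_b", "politics"}},
    {"headline": "Ten best hiking trails for dog owners", "topics": {"dogs", "hiking"}},
]

for post in rank_feed(user_interests, candidate_posts):
    print(post["headline"])

# Posts that reinforce existing interests float to the top; opposing
# viewpoints sink to the bottom -- the 'echo chamber' effect described above.
```

A ranking rule like this never has to be malicious to create a bubble: optimizing for engagement alone is enough.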
Finally, it’s important to remember that Facebook’s business model is built on its users’ data. This is how it works: Facebook collects personal data on everyone who signs up, and then keeps building each user’s profile based on everything they post, share, Like, or comment on. Over time, Facebook built a repository of information aggregated from millions of users that allows advertisers to target people who are, for example, divorced, have just returned from a vacation, love dogs, and have recently expressed an interest in hypnosis. The data is so vast and detailed that advertisers can target users with extreme specificity. Facebook also gave third parties (such as app developers) broad access to that valuable user data, with little ability to control or determine what they did with it. (Remember this part; it’s particularly significant later.)
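To see why that level of specificity is valuable, consider the toy sketch below. The profile fields and matching logic are invented for this example; Facebook’s real ad tools are vastly richer, but the principle of filtering millions of profiles down to a narrow audience is the same.

```python
# Toy sketch (illustrative only): filtering aggregated user profiles down to
# a narrowly defined advertising audience. All field names are made up.

user_profiles = [
    {"id": 1, "relationship": "divorced", "recent_trip": True,
     "interests": {"dogs", "hypnosis"}},
    {"id": 2, "relationship": "married", "recent_trip": False,
     "interests": {"cats"}},
    {"id": 3, "relationship": "divorced", "recent_trip": True,
     "interests": {"dogs", "hypnosis", "hiking"}},
]

def match_audience(profiles, relationship, recent_trip, required_interests):
    """Return users whose profile satisfies every targeting criterion."""
    return [
        p for p in profiles
        if p["relationship"] == relationship
        and p["recent_trip"] == recent_trip
        and required_interests <= p["interests"]  # all required interests present
    ]

audience = match_audience(
    user_profiles,
    relationship="divorced",
    recent_trip=True,
    required_interests={"dogs", "hypnosis"},
)
print([p["id"] for p in audience])  # -> [1, 3]
```

Run at the scale of a couple of billion profiles instead of three, the same filtering idea is what makes the targeting both lucrative and, in the wrong hands, dangerous.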
The 2016 US election
The 2016 US presidential election was the first time that many of these tech and media insider concerns got mainstream coverage and permeated the American public’s consciousness. There are reports, now largely substantiated by a criminal indictment of several Russian agents, that Russian operatives exploited Facebook’s audience-targeting capabilities to disseminate political propaganda they knew would stir emotional responses in certain user bases. This showed the general public not only how effective Facebook targeting can be at influencing people’s behavior, but also, for the first time, how it could be (and was) used for nefarious ends.
Unfortunately, Mark Zuckerberg’s response to these allegations was particularly tone-deaf (he called the assertion that manipulation on Facebook could have swayed the election “pretty crazy”), leading many people to question whether he really understood the power of the platform he had created. This deepened the growing discontent over big tech’s role and responsibility toward the people it purports to serve.
Cambridge Analytica
All of this brings us to the last few weeks, arguably the worst of Mark Zuckerberg’s life (certainly of his career). Remember the part about how Facebook gave third parties access to user data? Well, the story recently broke that one of those third parties was Cambridge Analytica, a data-mining company that openly advises its clients on how to use data patterns to influence elections. Perhaps most problematic is the fact that Cambridge Analytica advised members of US President Donald Trump’s 2016 campaign using Facebook data obtained without users’ consent: it harvested not only the Facebook data of the people who signed up for its app (and therefore consented to share it), but also the data of all of their Facebook friends, sweeping up data from about 50 million users who never agreed to share it. Further adding to the problem, the most recent allegations assert that the Cambridge Analytica employees who worked for Steve Bannon on the 2016 election were not US-based, implicating foreign actors in an already fraught election-meddling scandal.
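A quick back-of-the-envelope calculation shows how the friends loophole multiplies a small app’s reach. The 270,000 figure matches contemporaneous reporting on the number of people who used the app; the average-friends number below is an assumption chosen purely to illustrate how a total of roughly 50 million could arise.

```python
# Back-of-the-envelope arithmetic, not an official breakdown: how an app with
# a modest number of users could yield tens of millions of profiles when
# friends' data was also collected by default.

app_users = 270_000          # roughly the reported number of people who used the app
avg_friends_reached = 185    # assumed average number of friends exposed per user

profiles_harvested = app_users * avg_friends_reached
print(f"~{profiles_harvested:,} profiles")  # ~49,950,000, i.e. about 50 million
```

In reality friend networks overlap, so the count of unique profiles depends on network structure; the point is the multiplier effect, in which every consenting user quietly brought along a crowd of people who never consented at all.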
Why this matters
If you’re a Facebook user, this matters because it’s important to know and understand how your personal information is being stored, shared, and used. India is Facebook’s fastest-growing market; people are increasingly using it not only as a news source but also as their primary social networking tool. Without a general awareness among users of what sharing this personal data with Facebook implies, and of the various ways their own behavior patterns on the platform can be used against them, Indian users are vulnerable to the same abuses and manipulations. And whether or not you’re a Facebook user, the magnitude of the platform’s reach, and what can happen when its data falls into the wrong hands, should worry everyone.