Facebook is evil for two reasons. First, it makes you dumb. Second, it takes over your life. That’s why I chose to quit Facebook.

Before I get into the hows and whys, I have a confession to make. Early last year, I decided to get off Facebook for the making-me-dumb reason. I proclaimed that triumphantly to everybody who matters on www.theotherview.in, a website maintained by K. Ram Kumar, executive director at ICICI Bank Ltd.

Then a few months ago, much as lapsed smokers sneak out for just one last puff and relapse, I reactivated my account. I told myself that my intent was to build and promote my personal brand on social media. That it was a terrible mistake became obvious when my friend and former colleague Rohin Dharmakumar shared a piece about Facebook’s new privacy policies that kicked in from 30 January. To put it bluntly, they are nothing short of devious.

Before I get into that though, allow me to present why I thought Facebook was making me dumb.

A little over two years ago, I read a lovely book, Truth, Beauty and Goodness Reframed, by Howard Gardner, an American developmental psychologist at Harvard University. The book compelled me to write him a note and ask for an interaction. The man was gracious and offered me his time.

Gardner’s hypothesis is that the world we live in is one where it is ridiculously simple to find people who agree with you. But there is a downside to that. Because everybody around seems to agree with you, prejudices that exist in your mind are reinforced.

As theories go, I thought it compelling. But the implications weren’t evident to me until recently, when I started to examine my media consumption habits, Facebook included. Each time I checked my Facebook feed, I thought I could see a pattern. On the one hand, I had in excess of 500 friends and subscribed to at least a dozen groups. On the other hand, my feed was populated by posts from a set of friends that barely ran into double digits.

Eventually, I realized that the monotony in the feeds that populated my timeline was because all of it originated from two kinds of people: those whose posts I liked most often, and those who most frequently liked mine. Other friends on my network were invisible entities, unless I chose to actively seek out their timelines. The more I thought of it, the more I was convinced this is a problem. What I like is inevitably what I agree with. What about those I disagree with, or whose posts I don’t hit a like on? Why should they be invisible?

Interestingly enough, the number of people whose feeds I could see also ties in with the Dunbar Number. British anthropologist Robin Dunbar, who first proposed it, argued that a human being can, on average, maintain 150 meaningful relationships. By meaningful, he meant that if all of these people were in the same room, everybody would be comfortable. Implicit in that comfort is that everybody shares more or less similar worldviews.

My readings suggest that social media architects of the kind who work at Facebook deploy this idea in their algorithms, limiting interactions and visible ideas to the boundaries imposed by this number. Otherwise, their product may seem chaotic to most humans. While the architects may be right in deploying the wisdom Dunbar’s Number contains, I am convinced these interactions in the digital world contain a fatal flaw.
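The mechanism described above can be sketched in a few lines of code. This is a hypothetical toy illustration of engagement-based filtering, not Facebook’s actual algorithm: score each friend by mutual likes, keep only the top names up to a Dunbar-style cap, and everyone else quietly disappears from the feed.

```python
# Toy sketch of engagement-based feed filtering as described in the
# article. Purely illustrative; Facebook's real ranking is not public.

DUNBAR_LIMIT = 150  # cap on "meaningful relationships"

def rank_feed(posts, likes_given, likes_received, limit=DUNBAR_LIMIT):
    """Keep only posts from the `limit` friends with the highest
    mutual-engagement score (likes I gave them + likes they gave me)."""
    scores = {}
    for friend in set(likes_given) | set(likes_received):
        scores[friend] = likes_given.get(friend, 0) + likes_received.get(friend, 0)

    visible = set(sorted(scores, key=scores.get, reverse=True)[:limit])
    # Friends with no mutual likes never make the cut: they turn invisible.
    return [post for post in posts if post["author"] in visible]

posts = [
    {"author": "alice", "text": "agrees with me"},
    {"author": "bob", "text": "disagrees with me"},
]
likes_given = {"alice": 12}    # I like alice's posts often
likes_received = {"alice": 9}  # alice likes mine; bob never engages

feed = rank_feed(posts, likes_given, likes_received, limit=1)
# Only alice's post survives; bob, who never trades likes with me,
# vanishes from the feed entirely.
```

The point of the sketch is the feedback loop: the score is built from past agreement, so the filter can only ever narrow the feed toward people who already agree with you.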

In the offline world, I live in a space different from the one my elders do, or the one my sibling does. My friends come from backgrounds dramatically different from mine. That is why when my folks chide me for my lack of spiritual beliefs, my sibling disagrees with me on what constitutes the good life, and my friends vehemently argue over political ideology, there is no acrimony—at best, animated conversations. I can’t hit a like button here or unfriend them. By the very nature of my relationships, my biases aren’t confirmed. Instead, they are challenged—unlike on Facebook, which offers me the comfort that comes with homogeneity.

That said, Facebook is only a metaphor for a larger problem that terrifies me. Allow me to put that into context. Around two years ago, I took to drinking green tea because I was told it is good. To understand why, I punched “is green tea good for me?" into Google’s search box. Practically every answer the engine threw up pointed me to resources that argued why it is indeed good.

As a little experiment, I then typed, “Is green tea bad for me?". The arguments on why it may not be so good after all were as numerous as those on why it is. The answers I found depended on the bias built into my question. The algorithms that power the searches were feeding my biases.

To understand what happens if I eliminate the implicit bias in my question, I typed just “green tea" into the search bar. This time around, the results were mixed: some pointed to why it is good and others to why it is bad. This is where my problem really lies. We live in times when the potential to find ways to amplify our biases is unprecedented.

Most of us carry tablets, smartphones and every kind of always-on device. Every media company, whether established or in start-up mode, has latched on to the ubiquity of these devices. That explains the explosion of applications offering personalized news feeds, the communities these companies seek to build, and the relationships they hope to forge on their platforms. This explosion in personalization is killing diversity and making us dumber.

To draw yet another parallel, think of food. Once upon a time, food was at a premium. Human ingenuity took over and agriculture was industrialized. Scarcity gave way to abundance and foraging became a thing of the past. Fast food culture took root. This culture, though, came with a downside. Humans became sedentary, turned corpulent and acquired lifestyle diseases. To combat these diseases, we are now compelled to make informed choices about the food we consume and proactively avoid sedentary lifestyles.

If this analogy is extrapolated into our world, Clay Johnson’s advice in The Information Diet resonates loudly: “Consume deliberately. Take in information over affirmation."

In our own interests, therefore, it is incumbent to deliberately decide what information to consume, what communities to be part of, and what relationships to nurture. The alternative is: Be dumb, stay dumb.

Then there are Facebook’s new privacy policies to contend with. Salim Virani, a teacher based out of Sofia in Bulgaria, went over all of the fine print with a fine-tooth comb. What got his goat are the changes that have just been implemented.

Virani writes: “Facebook is demanding to track what you buy, and your financial information like bank account and credit card numbers. You’ve already agreed to it in the new terms of service. It’s already started sharing data with Mastercard. They’ll use the fact that you stayed on Facebook as permission to make deals with all kinds of banks and financial institutions to get your data from them. They’ll call it anonymous, but like they trick your friends to reveal your data to the third-parties with apps, they’ll create loopholes here too."

“Facebook is also insisting to track your location via your phone’s GPS, everywhere and all the time. It’ll know exactly who you spend your time with… They’ll know how many times you’ve been to the doctor or hospital, and be able to share that with prospective insurers or employers. They’ll know when you’re secretly job hunting, and will sell your endorsement for job sites to your friends and colleagues—you’ll be revealed."

“They’ll know everything that can be revealed by your location, and they’ll use it however they want to make a buck."

“And—it’ll all be done retrospectively. If you stay on Facebook past January 30th, there’s nothing stopping all of your past location and financial data to get used. They’ll get your past location data from when your friends checked-in with you, and the GPS data stored in photos of you. They’ll pull your old financial records—that embarrassing medicine you bought with your credit card 5 years ago will be added to your profile to be used as Facebook chooses. It will be sold again and again, and likely used against you. It will be shared with governments and be freely available from loads of ‘third-party’ companies who do nothing but sell personal data, and irreversibly eliminate your privacy."

“…..This is unprecedented, and just like you’d never have guessed that Facebook would sell your endorsements when you signed up in 2009, it’s too hard to predict what Facebook and those third-party data sellers will do with this new power."

“This is simply a consequence of their business model. Facebook sells you out, because that’s exactly how they make money. And they’re under heavy pressure from their investors to make more."

But this is only the tip of the iceberg. Virani has documented in excruciating detail how Facebook has, over the years, systematically violated your privacy and sold you out over a dozen times in its pursuit of valuations. All of his findings and clinical arguments on why you ought not to be on Facebook can be read on saintsal.com/facebook.

If all of this is not enough, allow me the latitude of two more lines by way of caution.

1. When you sign in to Facebook and accept its terms, you sign away all rights. This means the company’s cookies embedded on your device can track everything you do, including accessing your private messages.

2. Then there is that hugely popular group chat application, WhatsApp. It resides on your phone and has access to all of your contact details, the messages you send out, the groups you belong to, and the links you click on. May I ask you to take a wild guess why Facebook paid a whopping $22 billion to acquire it in October last year?

In spite of all this, if you choose to stay on the Facebook gravy train, good luck. Clearly, you don’t give a damn about yourself.

Charles Assisi was managing editor at Forbes India and is now at work on his first venture. He maintains a personal website on www.audaciter.net and tweets on @c_assisi