V.H. Belvadi


Information overload and an overly social web — an experiment

26 October 2016 —

That the internet is an integral part of our lives is no surprise: it is simply a step up, in both the kind of information and the ease of access to it, from what was available in the last century through bookstores and well-catalogued libraries. Why social media and networks have become an integral part of our lives, though, is a more pressing issue and a question most people are afraid to ask themselves.

A little over a year ago I asked myself this question and set out to find the answer. Nobody had one that satisfied me, and, nearly fourteen months later, the answer I have is fair enough on a personal level but still hazy to some extent when generalised. What I found out along the way, and the potential long-term benefit it would have, was certainly worth the time I spent on it. I first wrote about this in August of 2015 (see Marginalia), asking myself — and everyone else by extension — how many social networks we are on and why. However, this was probably not a knee-jerk thought and could, on some subconscious level, have been prompted by the fact that, at the time, I was reading William Powers’s book, Hamlet’s Blackberry. Indeed my previous two quotations scribbled on Marginalia were from that book.

Note that we use the terms ‘social networks’, ‘social web’, and ‘social media’ interchangeably. They all refer to that part of the web which encourages you to create a profile, share personal and general information — your work or life or travel and so on — connect with known and unknown people, and which enables you to stay connected all day long if you choose to. It is best we keep the definitions flexible, lest we miss the forest for the trees.

The structure of this article is worth paying attention to, owing to its size: we begin with an overview of the experiment, followed by a quick rundown of the prevalent scientific thinking on this issue; then some notes on ways, other than cutting back on social media, that can potentially help in everyday life; followed by the actual steps I took over the past year as part of this experiment; and finally some thoughts on how and why they work, as well as what I intend to do from this point on as a direct consequence of it all. This last bit is particularly important because several people prefer either to jump right back into the social web or to refrain from it altogether like monks. One is an addict, the other a Luddite. The purpose here is to find and carve out a much more agreeable middle ground.

Part one

This marked the beginning of a year-long experiment for me, in which I aimed to explore this question alongside a host of similar ones. Specifically, would the reason I got on social media in the first place — to spread my websites and photographic work — take a hit if I called it quits overnight? (See Cold turkey.) In what I consider a rather bold move, I decided to pull the plug on the social network where I had my largest group of followers and where my photography was the focus of almost everything I was sharing: Google+. To some, this might seem foolish or even daring, but to me it was simply part of an experiment, and if I was going to do this for a year, I might as well do it thoroughly.

The reason my experiment was a year-long affair was that I had seen or heard of far too many people who ‘quit social media’ on impulse and then promptly got back to it a few days later. This, in my opinion, serves no purpose. If one is to feel the effects of any change, one will do well to give it time to show itself. The allure of social networks is said to be great, but it helped that I am an introvert and rarely inclined to go on a sharing binge — in other words, not sharing would not really be a problem for me.

As important as it is to understand what my experiment was for, it is important to realise what it was not about. This was not an attack on social media. It was not an attempt to make a case against social media. It was certainly not to unearth the ill effects of social media (those closets have been emptied already, as we shall see in a moment). And it was not to force people to quit social media. The entire experiment was simply to see whether the presence of social media has stretched its arms beyond the purpose it was intended to serve, how that can be curbed, and, consequently, what benefits this may yield, particularly in terms of ‘enriching’ our lives — a word far too often associated with quitting social media. Quitting some social networks was part of the process of understanding what this meant, but the experiment was not so much about quitting social media as it was about learning to use it more intentionally. This is the core of my entire argument and possibly my favourite part of the outcome of this experiment.

Part of this experiment was to find out if limiting social media use would necessarily result in our returning to a more productive, efficient life.

What else prompted me to initiate such an experiment was the then-new idea called the Fear of Missing Out (characteristically shortened by the internet to FoMO). Anil Dash came up with a counter-proposal that he called the Joy of Missing Out (JoMO), and both ideas caught on enough for the former to be recognised as a type of fear of regret. The idea that one may miss out on something scares people, and, to an extent, rightly so: proverbs such as ‘opportunity knocks only once’ do not quite help make a case. However, Kevin Systrom, the founder of Instagram, who agreed that his platform can be addictive, gave a sound reason why FoMO should not bother us: ‘We aren’t used to seeing the world as it happens. We as humans can only process so much data.’ My solution to FoMO is simple: you cannot help it, you cannot overcome it, so stop thinking about it as missing out at all. You will miss something; you are missing something right now. The point of life is not to cry over (or fear) what we might be missing, but to make the most of what we do have or are doing now. Or, better still, as Anil Dash explained his own experience, ‘I’d been mostly offline for more than a month… I wasn’t missing anything. I hadn’t realized that I was not only not in fear, but actually in a state of joy.’

An important part of this experiment, therefore, was to see whether JoMO was really a substantial benefit of limiting social media use. Specifically, would such a limit result in a return to a more productive, efficient life in which we enjoy the present (which is, to some extent, what I expected), or would it actually be a source of newfound joy? While I understand that the spirit behind JoMO mainly stemmed from the realisation that one would be just fine and does not have to be afraid of missing out on things, the promise of missing out also giving us joy was something I had my doubts over, for the simple reason that if using social media does not cause pain, getting rid of it will probably not bring joy.

Lastly, it is not my intention to suggest that the social web is keeping us from living a productive or efficient or fuller life, or that it is preventing us from living ‘in the moment’. An analogy may be drawn with something more familiar and physical: walking a thousand metres down a road unencumbered is possible; so is walking the same thousand metres while dragging a boat behind you, just in case you reach a shore someday. The latter is doable, but demands unnecessary effort, and social media is something like that (I will expand on this as we go) in that its benefits, while not entirely absent, are incomparable to the alternative means that exist of obtaining the same, or better, benefits.

Part two

The negative impacts social media can have are by now well researched, and the proof is more scientific than most people would realise. Social media is, quite literally, like a drug: it ramps up dopamine production in the body. Dopamine is a neurotransmitter stimulated by reward cues and by peeking into small amounts of information — getting a taste of things, in a manner of speaking. The act of sharing on social media is often seen as a ‘reward’, which further prompts dopamine stimulation.

Further, as a whopping 68% of users admitted, the reason for sharing is, almost outrageously, not that people find the information interesting but that they want to be seen associated with certain information, to influence (or, as some would prefer to say, moderate) how others perceive them. I found this ridiculous at first, but, curiously enough, it makes sense when you think about it: our clothes, gadgets, books and hobbies are all reflections of who we are, so why would the information we consume — and show the world we have consumed — not fall into the same category?

Besides dopamine, the same process works for oxytocin, the chemical released when people kiss, which shoots up by more than 10% when you send a tweet. Most surprisingly, this is the same increase people experience on their wedding day. Somewhat dangerously, this also increases our trust in people at the time, which explains why Facebook users are 43% more likely to trust internet users than people not on Facebook. (My own use of Facebook was culled more than five years ago, so it was not part of this experiment.)

But this is not the end: it has been shown that as much as 40% of our conversations are about ourselves offline, and nearly 80% online. And, although people share on social media in an attempt to showcase what they feel is the best version of themselves (or of their alter egos), the target of a considerable number of social media posts is not a group of people but one specific individual. A part of me even wants to believe this is a direct reference to veiled, passive-aggressive status updates designed to look like a general thought being shared but which are quite obviously a rant directed at one person — whose identity is also obvious in quite a few cases.

It is not the fault of social media that people insist on using it to seek validation. The problem with social media is our inclination to overuse it.

Even ‘liking’ and ‘favouriting’ are done to appeal to the person who shared something, so one can remain in their good books. I understand this is not true for everyone or for every share, but the numbers are worrying nonetheless: 44% of users insist on this practice to ‘maintain relationships’. This is, once again, nothing new. It is like nodding in faux agreement at a party, either out of courtesy or because you are not in the mood to debate things just then. And people like being agreed with. But what it boils down to, in my opinion — besides the fact that, from using to sharing, social media has traces of potential addiction written all over it — is that once a piece of information is shared, nearly 60% of people state they derive actual ‘happiness’ from others’ appreciation of their shares, be it ‘liking’, ‘favouriting’, ‘plussing’ or whatever else. This, once again connected to dopamine, is our biggest reward.

An equally important factor for me — and one I had been thinking about for a few years myself — was how social networking influences our thoughts. A study conducted by The New York Times, compiled and reported by Hootsuite, shows that 85% of people process the contents of an article better after reading others’ comments on that article, and based on those comments. This is troubling, to say the least: our thoughts and reflections should be based on the article we read, not on what others think of it. More worrying still, comments that attack the author personally, without any factual foundation, are just as effective as any other comment in influencing how we perceive the writing itself. This has often been my biggest complaint — or even case — against social media: it dictates our likes and dislikes and preferences and tastes to such an extent that I often joke that we are on our way to becoming completely ‘tasteless’.

Given how many warning signs the social web already shows its users, it might seem almost justifiable to put an end to it. However, I believe that would be a hasty decision, not thought fully through. It would be akin to blaming a tool just because people misuse it. It is not the fault of social media that people insist on using it to seek validation. It is not the fault of social media that people associate it with a reward scenario and prompt dopamine and oxytocin release. Social networks have their benefits: they do help people connect, they do help get the word out, they do help share information, they do help people stay informed, they do help people entertain themselves, and they do help people get their work out there for the world to see and have a platform on which to voice their opinions. The problem with social media is not social media itself, but our inclination to overuse it. The solution lies in what I stated previously as the crux of my argument: intentional, goal-driven social media use.

Part three

Let us return our focus to the experiment now that we have something of a scientific justification to — at least — reconsider our use of the social web. Since embracing JoMO was partly what drove the experiment, I decided to make a small diversion from social networks: the one thing nagging me all day long was not my e-mail (see Inbox zero and Updates) but, in a similar vein, my pending reading list on Instapaper.

Over time I had accumulated so many articles that I wanted to read (and would read, in an ideal world) that the count was shooting up into the late hundreds, while my read count was likely nowhere past the low hundreds. Interestingly, most articles I wanted to read I would read then and there, and those I did read off Instapaper were rarely old ones, usually pieces I had saved only a few days earlier. This was unnecessary weight in my eyes: a constant reminder that there exists a long list of articles I will probably never get around to reading — and how many of those might be outdated by the time I do get to them?

My solution to this was almost as radical as my move with Google+. I decided to get Instapaper off all my devices and stop saving articles to the service to read later. The important thing to realise, however, is that Instapaper itself played no part in this. The spirit behind the service was actually built around more targeted, less distracting browsing, where one could focus on the task at hand and yet not lose some interesting piece of writing one may have come across.

In other words, I would continue to use the concept of ‘saving to read later’, except without Instapaper and with one added restriction: the read-later list must be emptied every week, whether I have read everything on it or not. I chose Safari’s inbuilt reading list service, lazily named ‘Reading List’. Not only does it sync everywhere like Instapaper, it also keeps articles offline and ready. And it was handy for my new style of use: a week’s worth of articles saved for later, read and either saved to my Evernote Premium account or cleared off my reading list; and every Sunday night the entire list gets cleared and readied for the following week, regardless of whether there are still unread articles on it. My other intention in choosing this Safari feature was to avoid using yet another app, but for those of you working cross-platform, the same limited approach with Instapaper or Pocket would work just fine, even if it is not as handy.

In the same vein, we would do well to avoid a barrage of news sources. Aggregators were supposed to address this problem, but they are not all that different from RSS — a point that will be addressed in greater detail soon. Intelligent aggregators still send along some poorly written, fact-free, logic-repellent writing, even though they grow better over time. And then there is the question of bias and worldview. One argument that can be made in support of intelligent news aggregators is that they might expose you to multiple perspectives. This is true; it is also true that they save time initially by just sending news your way — albeit with a lot of badly written non-journalism and self-celebrating clickbait rid of any and all substance — and then learning your likes and dislikes as you go. However, this is exactly their problem: by learning your likes, these news aggregators may, over time, only give you news that you like to hear, thereby skewing your views. That is not how news works.

RSS is an undoubtedly more straightforward, simpler, and distraction-free solution that guarantees you a regular stream of quality content like no other.

A better approach would be what I like to call ‘targeted reading’. Not surprisingly, this is part of a purposeful web browsing experience, which will be outlined towards the end of this essay. Targeted reading involves picking news sources you like, picking weblogs you like, picking magazines you like and so on, and then subscribing to them via good old RSS. This has several advantages. First, it reduces clutter, giving you updates structured by source or by manually chosen genres. Second, it serves your news from sources you like while still exposing you to the counter-arguments present in those sources (e.g. a libertarian paper will often address conservative views, thereby exposing you to both), unlike an aggregator, which might disregard articles simply because opposing views are discussed in them, under the wrong impression that you may not like them at all. I am sure the algorithms are not quite so simple, but I am not entirely sure of their current state either; improvements may have been, or will probably eventually be, made in this regard.
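For the technically curious, there is nothing exotic behind an RSS subscription: each site exposes an XML document listing its latest entries, and your reader simply polls those documents on your behalf. As a rough sketch of that mechanism (in Python, with a made-up feed inlined for illustration; a real reader would fetch this XML from each site’s feed URL):

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS 2.0 feed, inlined here for illustration only.
FEED_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>A Weblog I Chose to Follow</title>
    <item>
      <title>On reading deliberately</title>
      <link>https://example.com/reading-deliberately</link>
    </item>
    <item>
      <title>Notes on the open web</title>
      <link>https://example.com/open-web</link>
    </item>
  </channel>
</rss>"""

def latest_entries(feed_xml):
    """Return (source name, [(title, link), ...]) for an RSS 2.0 document."""
    channel = ET.fromstring(feed_xml).find("channel")
    source = channel.findtext("title")
    items = [(item.findtext("title"), item.findtext("link"))
             for item in channel.findall("item")]
    return source, items

source, items = latest_entries(FEED_XML)
print(source)  # the weblog's name, as declared in its own feed
for title, link in items:
    print(f"- {title}: {link}")
```

The point of the sketch is that the reader only ever sees what each hand-picked source publishes, in the order it was published: there is no algorithm in between deciding what you would ‘like’ to see.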

The other, equally important reason why I encourage RSS is weblogs. Besides news/magazine or dedicated genre websites, a majority of good, worthwhile content on the web comes from independent writers with active weblogs. Some are extremely personal in nature and may not appeal to everyone, but many (including this one) are more focussed on opinion pieces, most of which are well-written and worth reading. If you were to invest in the social web, carefully chosen blogs are your best bet. It takes time and effort to maintain and run a good weblog — unlike a social media profile, which is specifically designed to let people share as much as possible in as little time as possible (read, spending as few thoughts on it as possible) — which translates to the guarantee of good content on weblogs, even if not a fairly regular stream of it. But I would take a well-written, irregular blog any day over a fast-paced, mediocre social media profile of random articles interspersed with video game scores. At the end of the day, though, RSS is an undoubtedly more straightforward, simpler, and distraction-free solution that guarantees you a regular stream of quality content like no other.

Besides RSS, another interesting productivity-driven move I attempted for a few months was to limit all news/RSS apps to my tablet and use my smartphone solely as a daily assistant. I like how this segregates the task each device does to some extent, but I found myself with bits of leisure time, or waiting in queues and antechambers, with nothing but my phone to keep me occupied, which was a pain when there was nothing worthwhile on it for me to read through. (I rarely browse through articles, preferring instead to read them in full or not read them at all.) Needless to say, my trusty RSS app, Reeder, was back on my iPhone pretty quickly. Magazines, though, are confined to iPad for aesthetic reasons, and novels to my Kindle Voyage in a bid for an experience closer to paper books. And I still love and read paper books of course.

Speaking of apps, an interesting approach I found was to ask myself, before installing an app, whether it did something a first-party app already on my phone did not, or, alternatively, whether it did the same thing better — not differently, but better. If so, the app is welcome; if not, I leave it aside and ponder over it — you will be surprised how quickly you end up changing your mind about an app you were sure you ‘needed’. This is not something I strongly advocate, because everyone has their own style of use and their own use cases, but if you do have the time and the interest, try a smartphone experience as close to stock as possible and your device will become less of a distraction as time goes by.

Part four

It was sometime after I first asked about the many social networks we are on — perhaps even rhetorically, now that I think of it — that I decided to formally start this experiment and quit Google+, which was my favourite social network at the time; Ello, another which I really liked and found active and welcoming; and a couple of other profiles online (see Cold turkey). Back then, this is what I had to say about Twitter, which was, interestingly, one of the ‘big’ social networks I had chosen not to quit: ‘I picked Twitter not only because it is the fastest to update and interact on but also because the ratio of the number of useful interactions/new networking opportunities I’ve had on Twitter to the time spent updating it is clearly higher than other networks.’

This was an interesting assessment and, to me — a big fan of one-liners — it remains true to date. I think this is of particular importance because, unlike the myriad cases we have seen over the past year of people quitting Twitter for one grandiose reason or another, I had no such precursor to my decision to quit. It was simply a conscious choice I made because I thought it would benefit me in enough ways to make the whole idea appealing.

A year later I had cut down on everything but Twitter, Flickr and Instagram — and I did not miss a single one of the others. I remained on Flickr because that is where my photography resides, on Instagram because that is where I share my work as a secondary platform after Flickr, and on Twitter because it is succinct and to the point and I have never been one for following people out of obligation or in return for gaining a follower. This has probably largely shaped my experience of Twitter, which has been calm, sane and informative, unlike the raucous, short-tempered, racist, vulgar network so many make it out to be. (See, particularly, how Jon Weisman, a deputy editor at the Times, quit Twitter over complaints of harassment and hate speech.) This is likely either because I am not well-known enough to attract as much attention as Mr Weisman or because I am more picky about whom I follow and bother to respond to, but to each his own: Twitter has been fairly good to me.

The trouble, though, came sometime in June this year, when a study showed that 60% of people do not even read the articles they share online. Whole trends in the online world are created this way. This is something that had popped into my mind once too often, but it was interesting — and disheartening — to see it put into numbers. So a new question arises: how many such people do you follow? Even if they all did it just once, you have a few hundred links to supposedly interesting articles shared blindly. How many of these suggestions must you trust? Couple this with the fact that a lot of people share things only to improve the way others perceive them, and it becomes all too clear that they may be sharing links to articles with interesting titles (or, worse still, clickbait), all unknowingly, in an attempt to look good. Twitter just went from a quick-moving highway of information and one-liners to a cauldron of steaming-hot, trendy mess.

Social networks are the opposite of fora: the fact that they are not specific content-driven means they can keep you for longer than you like, by serving you with a constant stream of content dynamically tailored to your liking.

For now I have eased away from Twitter: I rarely tweet, I do check once or twice a week for messages and mentions since it is still the quickest way for many to contact me and a lot of people do send me direct messages to discuss an article or two. (Twitter is quick, but e-mail is a more reliable means.) This meant I had to move my Twitter list members from my list of secondary news sources (those interesting enough to follow, but not important enough to subscribe to via RSS) into a dedicated folder in Reeder where they now reside. This leaves me on two networks, Flickr and Instagram, both meant for my photographic work and nothing else, both extremely targeted, which makes their use that much more justified.

That is the problem with social networks: they are not targeted, they have no niche which they serve, and they are classified, at best, more by how we share information than by what information we share. This is what makes online fora a much better way to network and communicate, if that is, indeed, our intention. A photography forum, however critical, biased, shallow or gear-driven it may be, still has a scope and an intention that make it specific content-driven. You go there when you wish to talk about photography and leave when you are done. The same goes for science fora or literature clubs and such. They all have a scope within which all discussions lie; they are targeted, and you enter and leave at will when you wish to indulge in conversation about that topic.

Social networks thrive on just the opposite: the fact that they are not specific content-driven means they can keep you for longer than you like, by serving you with a constant stream of content dynamically tailored to your liking. Tired of a cat video? There is no need to leave yet, because they have another hilarious video with chimps and zebras that might interest you. Done with animals? How about this video about the top ten this and thats? The fact that social networks are so encouraging of genre-hopping has done two things to people: it has dramatically reduced the average reader’s attention span (I am convinced at least a thousand people closed their tab when they saw the length of this article) and it has made us slaves to a faux sense of gaining valuable information — in which both the key terms, ‘valuable’ and ‘information’, are suspect.

In March this year, The Guardian reported as much, with brief case studies of a handful of millennials who were turning off social media. More recently, less than two weeks ago, the Financial Times reported an Ofcom study showing that nearly half of internet users claimed they were ‘spending longer online than they wished’. Subtle signs exist all around that, with clever moderating of information and data, social networks can subconsciously affect our decision making — including our decision to leave their network or stay back on their website for just another two harmless minutes. Just two.

Ezra Klein writes, in an excellent article in The Washington Post —

If I neglect my RSS feed today, the posts will still be there tomorrow. The same is true for the books I’m reading, the magazines piled on my nightstand, the tabs open in my browser, the long-form I’ve saved to Pocket, the e-mails I’ve filed away to read later, the think tank papers saved to my desktop, and pretty much every other sort of information I consume. The backlog nags at me, but I’ll get to it.

Twitter elicits a more poisonous information anxiety. It moves so fast that if I’m not continuously checking in, I completely lose track of the conversation — and it’s almost impossible to figure out what happened three hours ago, much less two days ago. I can’t save Twitter for later, and thus there’s always a pressure to check Twitter now. Twitter ends up taking more of my time than I’d like it to, as there’s a constant reason to check it rather than, say, reading a magazine article.

Addiction is the lifeblood of social networks. They have their uses, and Twitter, in particular, is a godsend for some journalists. It is the 21st-century equivalent of hanging out at the mill, keeping an ear on the local village gossip; except, back then, you went to the mill with a purpose, whereas with Twitter, listening to the gossip is itself the purpose. With Facebook, it is all about staying in touch with people, but how many of them, really? Dunbar’s number is the maximum number of people one can maintain a stable and meaningful relationship with. At 150, it is a stretch for most introverts. The average number of Facebook ‘friends’ people have is 155. My own circle is considerably more modest and I keep in touch with them just fine via text messages or iMessage.

Part five

Steven Baxter, in the New Statesman, put the entire idea of the futility of sharing life updates into a dashing tagline: ‘Say hello to a world where you can just do stuff, without talking about the stuff you’re doing.’ He had just quit Twitter and his reasoning was clichéd but effective. Two years after him, even Simon Pegg gave the same reason as Mr Baxter — ‘it’s not you, it’s me’. This is significant, once again, because these are not people who were motivated by incidents. They simply decided to call it a day.

However, there is a deeper reasoning behind this, which, I strongly believe, extends to all of the social web. It makes people ‘tasteless’. Social media creeps into every inch of our life without our express consent or knowledge and, before we know it, begins to subtly but certainly influence our every move as well as our mood. Knowing X did this and Y did that is enough to influence people’s daily chores and subconscious thoughts. By replication, repellency or misguided inspiration, every other action is influenced by what people see and read and hear on the social web (a lot of it untrue) to such a point that some obsessively log into the web to ‘check things out’ while what they are really hoping for is to be told what to like.

This can be hard to swallow at first, but it is prevalent to a truer extent than we are probably ready to acknowledge. There has long been evidence of external stimuli, from media to news about friends of friends, affecting our own thoughts, lives and decisions. A shallower, more easily understandable example of this is advertising: over 90% of people visit a store following an advertisement or word-of-mouth (text?) marketing they came across online, 89% of people look up reviews online and 72% of people trust these reviews as much as a personal recommendation, and, surprisingly, 62% of people buy products on this basis alone.

Social media, on a subconscious level, influence our ways more than we would have wanted them to had their effects been more obvious.

The comparison with advertisements is a more pronounced, more open case of precisely the same effect that everyday, seemingly harmless social network banter is having on us. That is not to say social networks are ‘harmful’. But they do influence our ways more than we would want them to if their effects were more obvious. (And for everyone who thinks they are not influenced — neither did the people in the above survey think they were, some of whom even claimed it was ridiculous to think that way. They were in fact unaware of it, and that is what makes this border on being dangerous.)

Adam Brault makes this point by considering people’s updates as a sort of advertisement of their lives and thoughts, an idea that I found extremely interesting and unquestionably valid. Although he speaks of Twitter in particular, the core idea holds for absolutely any social network out there —

But the problem that occurs is that it can be a huge mental lease we’re signing when we invite a few hundred people into our Twitter life. To some degree, it is choosing to subject ourselves to thousands of ads throughout the day, but ones that come from trusted sources we care about, so they’re actually impactful.

Even if the people we know aren’t explicitly selling things (not that there’s anything wrong with that) or promoting their personal brand (there is everything wrong with that), we’re still choosing to accept their stream of one-second ads with some kind of message all day.

Part six

While my initial view of social media has not changed, in that I still believe a lot of it has no direct benefits that outweigh its negative side, I was probably wrong about Twitter. Granted, I have not had a bad experience there for the most part, but I think the urge to share a hundred-and-forty-character thought is just that: an urge. And when you let the urge sit, it simmers into a deeper, more meaningful thought. Or you come to realise how pointless it was in the first place, which is not all that bad when you think about it. Twitter is still great for one-liners, but not all thoughts are best put as one-liners. My reason to stay on, though, has more of a community aspect to it: if I want my writing to be read, shared and discussed, it is only fair that I add something of value to the community in return.

My two-pronged approach to growing thoughts remains this website, where I write opinion pieces and factual essays on science, technology and society, and Marginalia, my Tumblr, where I share shorter pieces of thought. I like this demarcation for two reasons: I am more in control of what I have to say, and the moment I find all of this too cumbersome and worthless, I can call it a day. However, so long as this website exists, as with anybody else’s, the effort is clearly visible, be it in design, maintenance or writing. With social media, this is wholly absent and all that counts is how active your profile is. And there is always the urge many have to keep it active while really adding nothing of value.

All said and done, there are some avenues of the social web that I would not advocate against: Slack, iMessage or (and this may come as a surprise to some who know me) WhatsApp, for example. All three have their benefits — flexible, quick work communication, or savings on SMS costs in the case of iMessage and WhatsApp (which is especially significant in Germany) — and, perhaps more importantly, they do not require you to expend your time and thoughts on a profile or a stream of updated content, and are therefore not as great a distraction or influence as traditional social networks.

As far as not getting lost in a browser chain with thirty-five tabs goes, a sort of luddite’s approach is to note down what you want to look up online and then dedicate an hour or two a day to following up on those thoughts. While this might seem to defeat the point of an ever-accessible web at first, one must realise that being accessible was not the primary intention of the web; making information accessible was. What cannot be denied, though, is that this old-fashioned approach will go a long way in reining in your wandering habits on the web.

By now (for you — this happened about halfway into the experiment for me) it ought to be clear that the entire idea of pulling away from the clutches of the social web and attempting to usher in some of that JoMO was never simply about quitting social networks. There is a lot more to it, most of which revolves around cutting down distractions so that you can have a more purposeful, intent-driven experience on the web, which, by any measure, is the more beneficial kind. You could install selective website blockers in your browser or trust your own willpower, since not all of us are addicted to social media. But there is no doubt that putting in a conscious effort to regulate the time we spend on the social web can have a lasting positive impact on our lives.

Nick Bilton calls this ‘reclaiming our real lives from social media’ in his article in The New York Times, where he gives the example of Ernest Hemingway, explaining how, when he once came across a pocket of free time, he began to write what would become his memoir, A Moveable Feast. Had the writer been alive today, Mr Bilton points out, he would probably not have put pen to paper, preferring instead to pull out his smartphone and ‘waste an entire afternoon on social media’. While this is possibly a more dramatic example, it serves to make a point and makes it well. The question should never be about quitting social media, but rather about all the wonderful things we can do if we choose to better manage our time online and save plenty of it for our own ‘moveable feasts’. After all, I have little doubt in my mind that if Hemingway were alive today, he would have really liked Twitter and would have exhibited his brilliance there in 140 characters (much to Mr Franzen’s chagrin — see part seven below) on a daily basis.

This is where my call for more targeted browsing and targeted reading comes in: back in the days of libraries — I know they still exist today — people had access to more information than they could handle, much the same as today, but with a catch. In a library, your research had to be targeted; you had to look up a genre, a topic, a book, and then flip through and read several pages to accumulate the information you wanted. In the process, you learnt a few new, related things, and you learnt what you were looking for. Perhaps you came across a reference to another work that led you on a hunt, once again across topics and book titles, until you came across it and then had to leaf through some more before you found what you were looking for. It took time and effort and imparted a proportionally large value to the knowledge you had just gained.

The internet is different. It takes almost no effort to find what you are looking for, and going from one source to another does not send you on a hunt; a click does the job for you. Everything is a click or two away. And the thoughtful appreciation, like pausing and blowing on a cup of hot coffee, that came with a few hours spent in a library is nowhere to be found on the web. Here you gulp down your drink so fast that you cannot tell what you drank. The value of a given piece of information has not lowered, but the fact that the internet affords us access to nearly all of the information on earth means, ironically, that the scale of this enterprise is simply lost on us and the value of a piece of information is not as readily apparent. That leads to information overload, where we do not realise we have had enough until we have had too much and lost track of it all.

In providing us with infinitely more information than we can handle at any given time, the internet has become counterproductive. But the same argument could be made for a library as well, which means the onus is on us as users to be more conscious of how we use it, to take things in in a planned fashion, just as much as we want, knowing all along that the rest of the information is still out there and that we can come back for it any day, any time. The purpose, right now, is to remain focussed on what we are looking for and take that much away. The internet, for better or worse, has been designed like a maze with the sole intention of keeping us in it. But we still control the off switch.

Part seven

At the end of my experiment, I have come to make a number of changes. It started with quitting Google+ and Ello, followed by a brief exit from Instagram (although what prompted it was the fact that I had temporarily got locked out of my account) and an eventual, infrequent return. I have increased my use of LinkedIn, despite its horrid design, because I found it useful from a professional perspective (not to land a job, rather to have informative discussions in professional groups). And I have eased out of Twitter, which, like Google+, will automatically send out links to my latest posts while not carrying too many manual tweets, but where I will keep track of direct messages and mentions. I am still of two minds about Twitter, mostly because I did reap good benefits from it and have had a pleasant experience, contrary to several others, so that stays for now.

While I did not blindly cut myself off from all of the social web — that would be a dumb thing to do and a sign of not adapting to changing times and taking advantage of potentially beneficial solutions — the fact that I reduced my use of three-quarters of these networks has given me considerable free time and peace of mind. This was surprising because I was not a heavy user of any of them, but not having to update or keep up with them did leave me somewhat freer.

Some argue that social media is a bad thing. I beg to differ. The American writer Jonathan Franzen, for instance, faults Twitter on the grounds that one can neither cite facts nor build arguments in 140 characters. It is hard for me to take a man seriously when he hates so many things. Mr Franzen has spoken of his dislike for everything from Facebook to emoticons to ebooks to Salman Rushdie to Jeff Bezos (he once called Amazon’s boss one of the four horsemen of the apocalypse), giving me the impression that he is a staunch luddite unwilling to let the 20th century go. Besides, at the end of the day, if you go to Twitter for 140-character-long facts, you are missing the point by a wide margin. Twitter, to me, is more about directing people’s attention to things, be it with links or other media, and having crisp conversations. It is not for writing long-winded arguments backed with MLA-format citations. But, yes, if you are clever enough, you can make a sound argument in 140 characters. The point is, nobody gains anything by painting the social web as the Lex Luthor to our ‘Superselves’ and running away from it. Further, blaming social media for things like spreading false news speaks more about your inability to judge the trustworthiness of links than about any fault of the network itself. Like everything else, it comes down to what we make of it.

The simplest approach would be to schedule social media use like you would schedule any other activity. Avoiding it first thing after you get up and last thing before you go to bed makes a considerable difference.

That said, I did alter my Twitter use slightly. I now group it under my news folder rather than my social media folder, and I browse it twice a day along with my news. I use Tweetbot, which makes lists more prominent than the official Twitter app does (where they are an unfortunate afterthought and advertisements are front and centre for whatever reason), and use lists to browse updates for no more than five swipes, thereby keeping my use limited. I also check direct messages, mentions and the more ‘social’ tweets from my following list the same way. This balances what I take from the network with what I give back — it enriches the community and possibly helps Twitter get some funds in the process.

The key point is that social media is no longer an obstacle, rather a small part of my day. This does not work in favour of advertisers, but that does not bother me. The simplest approach would be to schedule social media use like you would schedule any other activity. Perhaps not as rigorously, but something as simple as avoiding social media first thing after you get up or last thing before you go to bed, ensuring you use it for no more than a few minutes at a time, and leaving at least two- to three-hour breaks between consecutive uses should all make a considerable and visible difference. A lot of my argument may appear to focus on Twitter, but that is simply because I am active there and am not on Facebook; since none of my arguments have to do with Twitter’s character counts or usage patterns specific to it, they may well be extended to Facebook, Google+, Ello and others. One might also be prompted to quit simply because one has had enough — ‘it’s not you, it’s me’ — and that is fine.

Over the course of this year, I managed to update my readers with an article every month or two regarding my current thought process as I went through this experiment. You can find some of them below:

  1. How many social networks are you on?
  2. Inbox zero
  3. Updates on inbox zero
  4. Cold turkey
  5. Practise web introversion
  6. The Joy of Missing Out
  7. On pre-crastination and JoMO
  8. Are stock apps more than sufficient? (Part I)
  9. Are stock apps more than sufficient? (Part II)
  10. Tumblr is what you make of it

Like any tool, it comes down to how you use the social web. I will remain on Twitter, but I will remain dormant for the most part. My activities will be based around Marginalia and my main weblog from where links to whatever I write will be shared automatically to Google+ and Twitter like they have been for years now. I will interact with anyone who interacts with me, but I will stop feeding these social networks any more than necessary and stop being mined for data in return.

My suggestion to whoever cares to listen would be to first identify the problems in question: this is a five-fold issue wherein the web is shrinking our attention span, disturbing our thought process, handing us more information than we can handle, exacerbating our Fear of Missing Out, and keeping us hooked on it. We can preserve our attention span through targeted browsing as described above; we can keep our thought processes uninterrupted by scheduling our social media use; we can counter information overload by not keeping ‘read later’ backlogs and, hand in hand, fight FoMO by facing the fact that we will miss out on many things, most certainly on information, and that this should not hold us back by any means; and, lastly, we can remain ‘unhooked’ by taking the occasional vacation from social media.

Some people keep off the social web for the month of August; others join the Dutch initiative 99 Days of Freedom, which came up specifically in response to carelessness on the part of social networks and prompts people to imagine a life without them, then live that life as part of a joint experiment with people around the world, for 99 days or forever. There are calls to unplug, disconnect and smell the air around you.

My own stance is that one need not abandon social media outright for one’s own lack of self-control; the middle ground is better: whether you join an experiment, take a vacation or do anything else, make sure that you manage your time on the social web better once you get back on it. That is one of two things my experiment taught me. If, on the other hand, your reason to quit is simply a choice — you have experienced all you wanted to and now want to call it a day — then by all means quit. That is the second thing my experiment taught me: social media will go on and evolve and die like everything else, and it will always have users and quitters; make it a small enough part of your day that quitting makes little difference. Use social media as a platform to voice your thoughts and do not spend time making it a representation of yourself. If, however, you want to invest your efforts in something that will personally represent you, and in something that will last, forget your social media profile. Start a blog instead.
