Random approaches to better living

Knowing what is enough, working on traction, and developing the beginner mindset can enrich our life.

‘Better living’, as interesting as the idea sounds, is hard to define in a manner that would please everyone. For my own part I think of it as living happily and productively, nothing more. And to that end I have of late been entertaining three ideas in my mind that I now feel have matured enough to put into words. In short, defining what is ‘enough’ for us, identifying traction, and developing the beginner mindset are three key ways of ensuring ‘better living’ — at least in my humble opinion.

As a practising Stoic stumbling through the philosophy on a daily basis I do not find myself quoting Diogenes often. Diogenes was a Cynic and their school of thought differed from that of the Stoics on a fundamental level. Yet the story of Alexander meeting Diogenes has always interested me: when Diogenes the Cynic was sunbathing one afternoon the great king Alexander supposedly visited him and asked if he wanted anything (I get the feeling there is more to this story that is lost in history). I will let Plutarch continue with the rest of the tale—

…when that monarch addressed him with greetings, and asked if he wanted anything, ‘Yes,’ said Diogenes, ‘stand a little out of my sun.’ It is said that Alexander was so struck by this, and admired so much the haughtiness and grandeur of the man who had nothing but scorn for him, that he said to his followers, who were laughing and jesting about the philosopher as they went away, ‘But truly, if I were not Alexander, I wish I were Diogenes,’ to which Diogenes replied, ‘If I wasn’t Diogenes, I would be wishing to be Diogenes too.’

Diogenes obviously comes off as an arrogant man in this story. (There are other versions of this tale as well and Diogenes appears arrogant in every single one of them.) Needless to say, I prefer the Stoic approach to this myself: the Stoics believed in knowing one’s preferred indifferents, those things which one might possess—intentionally or otherwise—which one can enjoy without dependence or fear of being crippled should they be lost.

What this comes down to is knowing what is enough. This does not mean giving away all of one’s possessions; rather, it means gaining better perspective about what one does have and making educated decisions from that point on. Diogenes needed nothing from Alexander because he had ‘enough’. The trouble with a lot of people is not knowing what enough is, which is not the same as wanting more: one can entertain the idea of a life with more while still living a perfectly content life in the present with what one has. Simply put, not knowing ‘enough’ means being a slave to desire to the point where one cannot progress without first getting more — a vicious cycle that never ends.

True story, Word of Honor:
Joseph Heller, an important and funny writer
now dead,
and I were at a party given by a billionaire
on Shelter Island.

I said, “Joe, how does it make you feel
to know that our host only yesterday
may have made more money
than your novel ‘Catch-22’
has earned in its entire history?”
And Joe said, “I’ve got something he can never have.”
And I said, “What on earth could that be, Joe?”
And Joe said, “The knowledge that I’ve got enough.”
Not bad! Rest in peace!
Kurt Vonnegut

This also brings to mind a story involving Joseph Heller, the author of the wonderful book Catch-22. Kurt Vonnegut wrote about Heller in the New Yorker back in May 2005, describing how, when the two were at a billionaire friend’s party, Vonnegut asked Heller how he felt about the thought that their host probably made more in a day than Heller’s magnum opus would make in a lifetime. Heller replied that he had something their host could never have: the knowledge that he had enough. This idea is key to a better life for one simple reason: it clears your mind. Knowing what is ‘enough’ helps you be grateful for the things you have and lets you pursue the things you want with clarity, without blindly falling for them.

The second thing is handling distractions. By ‘distraction’ I do not refer solely to what people usually think about when they hear the word. Distractions are more than just entertainment or the various forms of addictive, habit-forming activities that give people a dopamine rush. Some of the ideas I discuss under controlling distractions come from an interesting book called Indistractable: How to Control Your Attention and Choose Your Life by Nir Eyal. The argument is that acknowledging what we think and feel is the first step towards overcoming obstacles that make us unproductive.

The words traction and distraction both come from the Latin word ‘trahere’, which means to pull; traction then is action that pulls you in your intended direction, i.e. towards your goal, while distraction is action that does not. In other words, traction is positive intent while distraction is a lack of intent. The first step at every stage is to categorise an action as either traction or distraction.

While the categorising is simple enough, choosing the tasks one intends to do is often neither simple nor straightforward. From this stems the ancient Greek idea of Akrasia. Simply put, someone who is akratic lacks command; more specifically, this translates to a lack of self-control or, even more specifically, akrasia is one’s tendency to act against one’s better judgement. Plato asks: if one knows an action to be the best course to take, why would they not do it? Why, in other words, do we watch that one extra show on Netflix knowing that is not the best course of action?

Nir Eyal argues that the way to overcome this is not simply to make up one’s mind — a brute-force attack such as that would prove feeble. Backed by plenty of research, he says it is futile to rely on willpower; rather, we should employ methods that make us more conscious of our choices. This is a long-term approach of sorts.

Sensations that lead us towards distractions often crest and subside; they rarely stick to an eternal upwards path. We need to take time and identify these, a process called ‘surfing the urge’. Literally, take time off from whatever you feel like doing, even if it is just ten minutes; and in those ten minutes let it sink in that this is what you want to do, and then, at the end of ten minutes, do whatever you want — even if it still happens to be the most distracting of choices.

This seems simple but is quite a powerful approach. By pausing for ten minutes before starting that one last episode on Netflix we are explicitly acknowledging our action — whether it is traction or distraction — and we are letting it sink in. It is highly possible, says Eyal, that in those ten minutes we think better of it and change our mind. But it is equally possible that we do not. Either way, we are now more aware of our choices than we were before and this translates, in the long run, to a habit of making better choices consistently or at least of being able to control and brace ourselves against poorer choices more effectively.

It is time to learn one of those oddly specific words from the Orient that refers to something I have long noticed in my own journeys in learning. Shoshin loosely translates to beginner’s mindset. I have often noticed when I start to learn something new that I have an immense openness to the learning process, but this fades over time. It never disappears entirely, but after a while it is no longer what it was when I started out. This is also not a drive to learn: the drive never disappears. This is simply an openness to ideas. Seeing that a word like Shoshin exists, I cannot help but think this is something lots of others experience too.

To be productive and in turn to live a better, happier, healthier and more meaningful life (in my opinion — need I say it any more?) the third approach is to try to develop Shoshin. Start looking at things — really looking at things — like you do not already know them. Live through questions, ask questions, look stupid, admit ignorance and focus on learning something new all the time. Let go of the pressure to have all the answers before taking a single step, because otherwise you will never take any step at all. Every good book ever written was written by a learner sharing their experience, never by a master claiming to know it all.

Call it divergent thinking, unlearning, adaptability or whatever else, putting in the effort to constantly empty our minds before doing something and to build up enthusiasm for the task at hand can have a huge impact on how we work.


There is another way to look at all this: as kids we were open to more possibilities than we are as self-perceived ‘experts’. For some years now I have referred to this with myself as ‘aiming for the ideal’. This also brings to mind Alain Souchon’s song Foule Sentimentale in which he sings ‘On a soif d’idéal’ — we are thirsty for the ideal. As we grow we are told more often about impossibilities than about possibilities, which makes us unknowingly build walls and moulds around ourselves, which in turn forces us to fit everything we come across into these familiar moulds. Break free.

One simple method to consciously overcome this is to improve our listening skills in a particular way: the next time someone is talking and they say something you already know, hold yourself back from pointing out that you are aware of it. Instead listen and build on it, reciprocate and drive the conversation forward with the hope of learning something. This way you will not end up curbing your own learning and self-improvement by ending conversations with the explicit acknowledgement that you are already aware of something.

This is perhaps what feels easiest among the three ideas discussed in this essay but its simplicity is deceptive because we are, as humans, predisposed to showing off what we know. This is not negative; our society is built to reward our capabilities and skills more than our enthusiasm to learn and we are eager to score points in this regard. Get over the mental block of thinking you already know something if you want to learn effectively.

I doubt the reason for this is arrogance in a majority of cases although it might appear that way at first glance. Perhaps most of us are simply enthusiastic when we recognise that we already know something; equally, perhaps we learn over time to recognise and point out that we know something. With this we inadvertently get in the way of our own learning.

A follow-up to this essay is in the works and in it I expand on the idea of embracing the beginner’s mindset by simply slowing down in life — by which I mean hustling in a planned and intentional fashion, not being lazy. Until then these three powerful ideas should provide ample food for thought: by defining ‘enough’ we develop contentment and clarity; by identifying traction we slowly take control of our choices and, in turn, our life; and by adopting the beginner’s mindset we ensure that we can make the most of the time we spend doing productive things, be it work or learning or anything else. The end result would undoubtedly be a better, happier daily life.

Secrets of staying motivated

A bunch of practical ways to stay motivated, plus things that do not work. Fair warning: there is no magic potion in this essay.

Staying motivated seems to be a daily struggle for a lot of people. Those of us who have found this—luckily, perhaps—rather easy to do are no strangers to the oft-posed question, ‘How do you do it? How do you stay motivated?’ The answer is quite simple, really; it is simple enough that it turns a lot of people off, but that changes nothing. The secrets, if I may call them that, to staying motivated are contained in an exposition that need not be longer than two sides of a foolscap.

Motivation needs to be replenished

The first step is to think of motivation like water, nay, like alcohol. It is like water in that taking a sip is never enough; you need to replenish your stores every once in a while, all day long, all your life. And it is like alcohol in that you can quickly get addicted to it and this can be a terrible thing. This, in a nutshell, is why so many people think motivation does not work for them: it does work, but only briefly, only for a while after they devour their motivation—be it a book, a film, a quote or even time spent with somebody. They feel it never ‘works’ because it never lasts.

That is lesson one: You cannot expect to get motivated once and for all; if you choose to get into it, expect to keep refuelling yourself.

Take time off and use a crutch

Sometimes the trouble is not refuelling our motivation but having no idea where to get the fuel from. In such cases it helps greatly to look to something else that does motivate us. Your field might be mathematics, and you might feel unmotivated, but perhaps you are still excited about clay modelling or that painting project you started six months ago. Drop whatever it is that you feel unmotivated to do, even if only for a little while, and do something that you are in fact full of excitement for. But stay aware of this at all times because it is easy to get carried away and never come back to your original task.

That is lesson two: When you feel unmotivated about something, drop it and work on something you do feel motivated to do, then return to the original task.

Is it motivation or discipline you need?

A lot of people far too often mistake motivation for discipline. They do not want to do something simply because they are too lazy to do it, but they put a spin on it making it look like they lack motivation because it sounds—and makes them look—better. However, the fact is that a lot of what we credit to motivation is actually due to discipline and consistency. This reminds me of what William Faulkner used to say: ‘I only write when inspiration strikes. Fortunately it strikes at nine every morning.’ Motivation and inspiration are great, but much of both is born out of consistency.

Try to work on your discipline rather than on seeking things that motivate you; motivation then becomes a perk that pushes you further ahead on the path you are already set on rather than fuel that gets you somewhere in the first place.

And that is lesson three: Seeking motivation becomes more meaningful when you establish a constant foundation on which to build yourself; discipline is key to building such a foundation.

Understand promotion and prevention mindsets

Motivation comes in two forms: promotion motivation and prevention motivation. The former is the stronger and more obvious of the two; it is what drives people and is what people commonly associate with motivation. The latter is a bit indirect and is rarely seen or used as a form of motivation, to the point where, ironically, people who do use it might mistake it for an absence of motivation. In a study published back in 2017 researchers found that the best way to stay motivated is to identify your progress and switch from one form of motivation (or mindset) to another.

Promotion mindset is when you are motivated by your end result, by the outcome of your project—be it preparing for class or trying to get fit. This is when you think of the great changes you can bring about by teaching students or by hitting the gym regularly. However, you cannot rely on this alone as you progress towards your goal. Halfway through, switch to the prevention mindset, where you start focusing not on the end result but on all the things you have to stop yourself from doing to get there. For example, you might start off by planning everything you want to do for your students—such as fun new activities and dividing focus across a broad set of topics—and the things you want your students to do; but, as you progress, slowly switch to taking note of things you want to avoid—such as unplanned class hours, or too many wayward activities that teach little—and the things you want your students to avoid.

That is lesson four: Once you know what you want to achieve, track your progress and transition from a promotion motivation mindset to a prevention motivation mindset.

Cheer for yourself: set aside time and space

Goals come in various sizes: big or small, long term or short, et cetera. If your goals are closer together, getting from one to another will itself serve as a huge motivator. En route, then, you might need a temporary boost or two at best. What is important, in the midst of all this, is to keep track of your progress, and to do that effectively you need to set time aside for yourself, to examine where you were, where you are and where you want to go. Use this time also to do something completely unrelated that can free up or relax your mind. None of this has to be a lone outing, though, so go with your spouse or colleague or any friend you like. Or go out alone because that works too.

Equally important, create your space. Something as simple as a desk will go a long way in helping you create a mindset for yourself that will drive you forwards. A little music in your earphones, some candles or even incense, or a trip to the gym before work: pick from any of a million possibilities that exist to help you create a space for yourself—both physically and mentally—to help you find that zeal to work. The point here is simple: cheer yourself on, because no matter how much your family and friends cheer you on, you will go nowhere if you do not cheer for yourself.

In the midst of all this, however, do not fall prey to that oft-quoted reason for neither working nor feeling motivated: that there is something wrong with the place itself. Create your space and your time knowing that it is you who has to get somewhere and nothing outside you is to blame.

That is the final lesson: Create your own time and space—even if only mentally, without disrupting anybody’s physical space around you—and use that to cheer yourself on and create an environment that makes you want to work, all the while never becoming enslaved to it.


Keep at all this consistently and motivation becomes an extension of who you are. You will achieve self-sufficiency and, while you can continue to seek beacons elsewhere or in others, this will be by choice, not need, because you will, yourself, always be what drives you on.

The ‘secrets’ to staying motivated, then, are pretty simple; as promised, they are in fact not much more than two sides of a foolscap.

A guide to living in the present

Examining what really works when it comes to living fully and blissfully in the present moment.

Oftener than we realise, we lose chunks of our lives to a nasty habit we are barely aware we have: the insatiable, perhaps uncontrollable, yearning to dwell on the past. We feel a sense of contentment pondering over what could have been. It takes us to an alternate reality in our thoughts. Our reality. Like alcohol, this offers an addictive sense of comfort but promises ill effects in the long term: effects that sneak up on us slowly enough that they are impossible to spot until it is too late.

The solution to this is simple: live in the present, not in the past; stop worrying about the future, work on it in the present instead. But, like many solutions, this is easier said than done. This is a short guide that takes on the ambitious task of simplifying how one can learn to live in the present and do so consistently. It all comes down to keeping a few things in the back of your mind.

What does not work

Ukiyo

There is a word in Japanese that has no equivalent in English: Ukiyo means the floating world. It alludes to the practice of going with the flow, living detached from the worries arising from anywhere but the present.

Living only for the moment… and diverting oneself just in floating… buoyant and carefree, like a gourd carried along with the river current: this is what we call ukiyo.

Asai Ryōi in ‘Ukiyo Monogatari’ (Tales of the Floating World), c. 1661.

While I am fond of such a romantic idea myself it is not hard to see the problem with this approach. Going after ukiyo makes it seem as though one ought to give up on life and get tossed around in the wind. It gives the impression that living in the present means turning a blind eye to the past and becoming a slave to circumstances. Worst of all, it makes living in the present seem like an impractical approach to life.

Ukiyo, beautiful as it is, can only be appreciated after one has learnt to live in the present. We ought to seek a different route.

Minimalism

The buzzword these days is minimalism, which is frequently, and wrongly, advertised as the technique of solving all problems by reducing material possessions. This is a bit like not making friends because you might disagree on something someday: eliminate the subject to avoid the problem. Minimalism, in some form or another, preaches, among other mindless things, that ‘removing items associated with past memories… frees us up to stop living in the past and start living in the present’.

This notion is somewhat like the clichéd scenes from television and film where old photographs are either burnt or flushed down a toilet or both. This might help in the moment but it is akin to pasting a bandaid over a cut while ignoring the fracture underneath your skin.

It comes down to one simple argument: you will eventually reach a point in your life where you can literally not afford to throw away anything more but you will still find yourself living in the past and in the future more than in the present.

Like ukiyo, embracing the minimalist idea that throwing away material possessions is a panacea for everything can be more dramatic than practical. Unlike ukiyo, I doubt it can lead to much even after one has learnt to live in the present, but I digress, because my opposition to minimalism is not the purpose of this article.

What does work

Stop distracting yourself

A lot of people I have come across tend to solve this problem by seeking distraction. Avoiding an issue this way not only fails to solve it but also lets it grow out of control. Confronting it is key, so sit idle for a couple of minutes and acknowledge that something is drawing you to the past or future and then apply the other steps discussed below.

Unselfconsciousness

Not being self-conscious is something that can do wonders. We live anywhere but in the present simply because we are far too conscious of our reactions from the past and of how we might react in the future.

Reducing self-consciousness allows us to be more mindful of the present experience rather than being dragged around by our feelings, self-esteem or self-confidence.

Breathe

This is probably the only romanticised notion that actually works. The reason so many people lose themselves to the past or the future is that they seem to have nothing to hold on to in the present. Even if they do, the lack of something constant over time badly weakens their will to hold on to it.

However, there is something that is eternally constant that one can hold on to. The rhythm of one’s breath. When you feel you have nothing to focus on, or when you feel yourself focusing on the past or future, quickly start observing your own breath. Notice when you start breathing in, how long you keep breathing in, when you start breathing out, and how long you keep breathing out—and then repeat it.

Be Stoic about it

While the above two techniques come from modern psychology there are older ideas that come from Stoicism, which has been my personal favourite philosophical approach to life for as long as I can remember, and which I constantly attempt—not always successfully but I keep trying nonetheless—to employ in my daily life. Stoicism offers beautiful insights that can practically help you live in the present.

While there are several Stoic philosophers one can think of in this regard it is Marcus Aurelius who comes to my mind now. There are two notes from his Meditations that are especially worthy of our consideration. In the first (Book VIII, no. 36) he speaks of why living in the present is so important: by focusing on the present we limit what weighs us down and leave ourselves with enough that we can tolerate and overcome. By thinking of the past or future we weigh ourselves down with more than we can handle and enter the so-called negative thought spiral.

Don’t confound yourself, by considering the whole of your future life; and by dwelling upon the multitude, and greatness of the pains or troubles, to which you may probably be exposed. But ask yourself about such as are present, is there any thing intolerable and unsufferable in them? You’ll be ashamed to own it. And, then, recollect, that it is neither what is past, nor what is future, which can oppress you; ’tis only what is present. And this will be much diminished, if you circumscribe or consider it by itself; and chide your own mind, if it cannot bear up against this one thing thus alone.

From Book VIII of ‘The meditations of the emperor Marcus Aurelius Antoninus’, translated by Francis Hutcheson and James Moor.

In the second (Book VII, no. 29) he speaks of how we can let go of the past. Adopt the philosophy that a fault should lie where the guilt lies.

Blot out all imaginations. Stop the brutal impulses of the passions. Circumscribe the present time; and apprehend well the nature of every thing which happens, either, to yourself, or, to others. Distinguish between the material and the active principle. Consider well the last hour. The fault another commits there let it rest where the guilt resides.

From Book VII of ‘The meditations of the emperor Marcus Aurelius Antoninus’, translated by Francis Hutcheson and James Moor.

The same idea holds for the future too.

More Stoicism

There is a lot more to Stoicism besides Aurelius and, for brevity, what follows is a summary of its ideas.

First, Stoicism teaches that thinking about the past and future is normal. Indeed it teaches that thinking about the past and future is wise and prudent. On the face of it this might seem like an argument against living in the present but there is more to it. What Stoicism asks next is what one does after having thought about the past or future because that is what matters most.

In the case of the past, realise that there is nothing you can do to change it. You can reflect on it, but when that reflection takes a toll on your present is the reflection still worth it? By dwelling on the past you are losing the present and, in turn, deforming more of what will soon be your past. This gives you even more reasons to dwell on the past that seem—at least on the face of it—to stem from a legitimate concern for self-improvement. This leads to a vicious cycle.

In the case of the future, realise that you will get there eventually. You certainly should plan reasonably well ahead, but when you obsessively focus on the future you might find yourself eventually stumbling to some point having lost all the wisdom a mindful journey to that point might have provided.

The present is the only place in your control. The present time, the present place, your present situation are what deserve your full attention. You can do this by observing your mind. Stoicism encourages practising stepping out of your own mind and examining it, observing it, and recording its patterns without ever judging it.

Keep in mind, you will not make it in one fell swoop but you will get there if you keep trying. Start by first bringing your mind under your control whenever you notice it rushing away from the present; then keep trying to consciously notice your mind every now and then to keep track of it; keep working on this until you gain enough consciousness to fully keep yourself in the present and only ever dwell on the past or the future when you choose to and within limits. It is then that you will experience the bliss of ukiyo.

A practical guide to always doing your best

A book that offers something new by not offering anything radical. A book that believes not in always being your best but always doing your best.

Many a self-help book starts by setting itself a tall order, so it comes as no surprise when it falls short of delivering. Todd Henry’s book, Die empty, sets out with a somewhat grandiose aim too but, surprisingly, it manages to deliver a practical, actionable set of suggestions to help readers as promised—and keeps it up at least during the first chunk of the book. As it progresses it becomes drawn out, starts targeting businessmen (seemingly unaware of this) and relies heavily on clichés. Yet, if you patiently pick up titbits you might come away gaining a lot from this book; not much of it will be new to you, but a lot of it is told in an effective, eye-opening manner designed to make the ideas last in your mind. In the end that is what makes Die empty a book worth reading.

The purpose of it all

The book starts off with an anecdote—one of many to come—in which Mr Henry speaks of how the urban planner and artist Candy Chang once used a large chalkboard to make a work of art covering one wall of an abandoned home in her neighbourhood. On it she wrote the prompt ‘Before I die I want to __________’ several times and left chalk around for passers-by to fill in the blanks. And many did, enough to spread the exercise to over a hundred cities around the globe. Mr Henry uses this to explain what connects a lot of people: the simple fact that we are aware of our limited time here on earth and that we all have something we want to do. Consistent practices can help us unleash our best work every day, he says, so in the end we don’t regret how we spent our lives.

It was my wife who brought this book to my attention when she shared with me another of Mr Henry’s anecdotes. (It turns out, however, that this isn’t from Die empty but from Mr Henry’s previous book The accidental creative although he recounts it again briefly in this one.) A friend once asked a strange and unexpected question: ‘What do you think is the most valuable land in the world?’ It is neither Manhattan, says the friend, nor the oil fields of the Middle East, nor the goldmines of Africa. The most valuable land in the world is the graveyard; in the graveyard are buried all of the unwritten novels, never-launched businesses, unreconciled relationships. In short, all things that people thought they would get back to ‘tomorrow’, but their tomorrows soon ran out. This is a great idea, and Mr Henry uses this to urge readers to empty themselves of all the creativity lingering inside them. Rely on sustained effort, he says, not accident. The effort will be well worth it.

People often regret not having treated their life with purpose, he points out while clarifying the purpose of his book: to bring a newfound clarity and sense of urgency to how you approach your work on a daily basis. ‘In writing this book’, he says, ‘I’m taking my own advice and not leaving my best, most important work inside me.’ Good point, I remember smiling to myself.

The ‘sense of urgency’ can understandably be mistaken for a mindset of getting things done urgently, at all costs. Mr Henry rightly takes time to correct this. He compares this to Karoshi, an idea that has long interested me, saying, specifically, that the idea behind Die empty is not about working all the time, or working past reasonable limits; never ignore everything else in your life to get things done. Don’t work frantically, says Mr Henry, reiterating a point he mentioned multiple times in the book: make steady, critical progress. Dying empty is not the same as living ‘like there’s no tomorrow’.

Karoshi is a Japanese term that refers to occupational sudden death, or dying from overwork.

What counts as work?

The core belief of this book, therefore, is that your days are finite, that you have the capacity to make a contribution to the world, and that you cannot work expecting huge returns for what you do. Making a point I particularly liked, Mr Henry talks of the overemphasis on what he terms ‘celebrity and recognition’ that is rampant in society today. This sort of outlook is unquestionably dangerous; one needs to work without expecting such recognition (more on this in a moment).

Another much-needed clarification Mr Henry provides has to do with what ‘work’ is: ‘Your body of work,’ says he, ‘should consist of what’s most important to you.’ He quotes Steve Jobs’s famous speech at this point, where the co-founder and former CEO of Apple famously said he stands before a mirror and asks himself, ‘if today were the last day of my life would I want to do what I’m about to do today? And whenever the answer has been “no” for too many days in a row, I know I need to change something ... Remembering that I’ll be dead soon is one of the most important tools I’ve encountered to help me make the big choices in life.’ That should help you pick what you work on.

A lot of people suffer from ‘purpose paralysis’, the fear of getting things wrong, and—here is something Mr Henry puts beautifully—they get frustrated ‘when the daily grind of work doesn’t seem to reward [their] pursuit’. This makes it all the more important to understand what one means when one refers to their ‘work’. Your work, he says rather succinctly, is your effort to create value where it didn’t previously exist.

He then goes on to describe three types of work, explaining how most of us always tend to focus on two and ignore one of these: mapping, meshing and making. Mapping is planning out your approach; making is actually doing the work; meshing is the so-called ‘work between the work’, skill acquisition, broadening your focus onto other areas of your industry or even other industries etc.—activities that ‘stretch and grow you’.

Mr Henry calls the mindset in which we focus equally on all three the ‘developer’ mindset; this is what we should all strive to develop. But most of us fall, instead, into one of three other categories of mindset. Some of us focus on mapping and making, forgetting meshing. He calls this the ‘driver’ mindset—which makes us narrowly effective but generally unable to take advantage of opportunities; drivers have will and determination but they end up putting this to little use. Next, some of us focus on making and meshing, forgetting to map. This is the ‘drifter’, who goes by whim and, because he has no map, cannot plan, has no strategy, and ends up with many wasted opportunities, failing to follow through on ideas effectively. And finally some of us focus on meshing and mapping—the ‘dreamer’—becoming obsessed with ideas and personal growth, which sounds effective, but without focusing on ‘making’ dreamers never work on something long or well enough for it to matter.

To explain how the work we do can affect people and contribute in ways we cannot always imagine or do not always expect, and to see why we should work anyway, regardless of whether we see these effects ourselves, Mr Henry relates the tale of the Detroit-based singer–songwriter Sixto Rodriguez, who released songs for a few years in the 70s in the United States but remained practically invisible there: his music never took off. Unbeknownst to him his album reached South Africa when someone from the US carried it on a trip there and, over the next two decades, he became a cult icon in South Africa, no less popular than The Beatles. He was so far disconnected from all this that not only did he have no idea about his fame and impact, but he was also believed to be dead. It was only towards the end of the 90s—when someone realised Rodriguez had actually been living a quiet life in the States and tracked him down, and he flew to South Africa to perform live—that he learnt how important his work had been to so many people.

You do not always know the full impact of your work, as Mr Henry points out. You might never know it in all your life. And then he asks, had Rodriguez not received recognition, would it have diminished the impact of his work? I understand that not all of our work can have the same impact as Rodriguez’s, and to think that someone somewhere is looking at our work as central to their life would be taking it a bit too far, but the point is still valid: work without expectations, because recognition alone is not what makes your work meaningful.

Flashy statements and clichés

Despite the unambiguous, action-oriented suggestions, the book is not entirely free from clichés and flashy terms such as ‘the seven deadly sins of mediocrity’, listed, rather cheesily, in alphabetical order from A to G: aimlessness (define your aim), boredom (maintain disciplined curiosity), comfort (step out of your comfort zone), delusion (know your limits and your capabilities alike), ego (get over it, accept failure, grow), fear (try to question rather than fear the unknown), guardedness (remedy relational outages in your life). These ‘seven deadly sins’ do serve a purpose ultimately in that they shape the subsequent chapters of this book, each of which deals with one sin and methods to overcome it (mentioned briefly above in parentheses).

In speaking of passion Mr Henry talks about ‘productive passion’, i.e. passion driven by compassionate anger, the sort of unrest that makes you feel like you want to step in either on behalf of those who are suffering or in order to solve a problem. Choose a battle line that will shape every step you take, he says, and goes on to describe how the Boka restaurant group’s statement ‘blow people away’ offered a targeted purpose for every person working for the company, from the managers to the waiters: Did what you do blow them away? Will this blow people away? What can I do for my customers today to blow them away?

There are other clichés too, such as maintaining a notebook where you jot things down, a practice that I have myself been employing for years now and one that has been around for centuries. Make a list of everything you want to do or know you should have done but have not. These form your open loops, says Mr Henry, who then suggests you start working daily to close them. This amounts to clichéd advice and adds little to the way it has already been presented elsewhere. He goes on to use this to suggest we develop a sense of curiosity. Pursue inspiration via probing questions and stay curious—do not sit back and wait for it. This, again, uncharacteristically for the rest of the book, is not all that actionable.

More clichés come in the form of step, sprint and stretch goals—merely fancy names for short-, mid- and long-term goals—and the suggestion of performing constant SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis on yourself. These are both classic techniques marketing executives have been taught for ages, and every other self-help and personal development book targeting marketing majors mentions them almost matter-of-factly. It is also around this point that ‘Die empty’ lost me briefly, with a shift towards marketing jargon and suggestions seemingly targeted at businessmen, a direction I felt Mr Henry was blissfully unaware the book was taking.

And some other gems

As said earlier, to dismiss the book just because it carries some clichés would be wrong—although the existence of said clichés must be addressed, as they were a moment ago—because there are some impressive points Mr Henry makes throughout.

I like that there is a principle at the beginning of every chapter about what the chapter deals with and there are simple questions, called checkpoints, at the end of every chapter. This is in line with the generally action-oriented nature of this book and is a particularly good thing.

Another idea that appealed to me was Mr Henry’s take on boredom, not least because it was in line with my own thoughts and agreed with an essay I am currently working on (and which I hope to publish here sometime soon): we rarely get bored any more because we have (in our gadgets) a ‘seductive’ stream of entertainment. But, he points out, it is during boredom that we have our best ideas.

And then there is what he calls ‘the curse of familiarity’, how we often mistake a passing familiarity for knowledge. Although we have read a bit about things and although (or in my opinion ‘because’) we have quick access to an endless stream of information via the Web, we end up believing we have the required knowledge at hand while, in reality, we have ‘not done the heavy lifting’ yet to fully understand how our newfound knowledge fits into our perception. If this idea appeals to you I recommend reading The knowledge illusion by Steven Sloman and Philip Fernbach as this idea is somewhat central to that book. In short, Mr Henry says intellectual growth comes not from titbits of information but from considering and integrating them.

Towards the end of the book he takes a moment to further clarify what dying empty means in a positive sense. Do not ask yourself what you will do if today is your last day to live. Humorously, but also rightly, he points out that you might want to binge eat pizza and jump out of a plane if today is your last day. Instead, he suggests that we ask ourselves how we would spend our day if we were accompanied today by someone who would watch our every move, take detailed notes, draw conclusions and write a definitive book about us. On some level this technique sounds considerably less morbid and a lot pleasanter and more promising than the ‘last day alive’ scenario.

Live with a focus on E.M.P.T.Y., he says finally, putting forth a last cheesy acronym. Focus on your Ethics, focus on your Mission, focus on the People, focus on Tasks, focus on You.

I like books I can read in a few days. As dull as the comparison might seem, these are like soda cans rather than richly blended teas, best for when we are on the go, hurrying through our days, rather than sitting back one evening and sipping patiently. To me these serve as stepping stones between longer, more considered reads. And I am particularly overjoyed when such a book can offer as much as Die empty does because, in spite of falling in the self-help genre, the book offers more than just bland motivation or calls to action; it offers suggestions we can actually put to use and that alone, to me, makes this book worth reading.

What happens when our virtual selves take us over

One need hardly be a crusader against modern technology to realise that, like any tool, it has its good side and its bad. The trouble is that far too few of us are ready to acknowledge and come to terms with this fact.

In Ancient Greece there once lived a young boy whose handsomeness was dazzling. He was, however, blissfully unaware of it. At some point in his life a young nymph came to him and expressed her love, but this young lad dismissed her unabashedly and went about his day’s work. Apparently this severely displeased the Grecian gods and they decided to teach the fellow a lesson. They decided that he had lived in ignorance long enough and that it was time he realised his own handsomeness. That evening found him at a spring where he happened to bend down and catch a glimpse of himself in the water. For the first time ever, he saw his own reflection and was stunned; he was so stunned, in fact, that he fell in love with his reflection and began to pine for this ‘other person’. Like the nymph, he would never win the heart of the one he loved; unlike the nymph, our young boy would go on to die for it. The boy’s name was Narcissus.

I

There are far too many inconsistencies with this tale of Narcissus, not the least of which is its truth. Did such a boy exist? Did the Grecian gods have nothing better to do than focus on the love story of a random kid? Was this just an old wives’ tale designed to make a point about what is socially acceptable? Nonetheless none of this prevented Freud from using Narcissus’s name for a disorder that most of us are familiar with today: narcissism, where one feels a vain, exaggerated sense of one’s own importance, often paired with a helpless desire to be admired.

There is more to Narcissus than Freud’s usage betrays. The myth continues, describing how a plant bloomed where Narcissus fell to his death. This is the narcissus, a plant of the amaryllis family that shares its name with the Greek lad. It is known for its narcotic, numbing effect. In Greek, narkē means numbness. It was Narcissus’s lack of realisation, his numbness, so to speak, that led to his death. And narcissism might just as well be interpreted as a numbness one feels towards one’s self that causes one to blow up one’s own importance, eventually losing track of who one is and possibly even beginning to desire to be someone else.

I admit perhaps I am letting myself wander at this point—the last thing I would want is to get into a losing battle with a psychologist. If I am, in fact, wandering, rest assured that it is with good intention. Today’s world is slowly redesigning itself to normalise a certain degree of narcissism that would have been frowned upon only decades ago. And social media has played no small part in bringing about this change.

None of this is to claim that social media breeds narcissism or that it makes a narcissist where none existed. But it does actively bring to the surface that hovering bit of narcissism that lies dormant in us all—whether we like to admit it or not. What it then does is normalise it because social networks are designed to feed on this. To blame this entirely on the social web too would be wrong: it does help us in some other ways after all; the question is whether the tradeoff is worth it and it rarely is.

The social web is built to enable transfer of information. But information can be transferred only so long as someone is out there seeking it. And someone will only seek it when they believe, on some level, that they are likely to find it there. That is to say, the model atop which the social web is built is to see what information someone is looking for and to place that before them. But, in a characteristic fashion, modern technology has gone one step further. It now attempts to understand seekers well enough to be prepared with what they are likely to seek. Going further still, having understood what someone might be interested in, the social web simply shoves that in their face—targeted advertising—in the hope that at least a handful in a crowd of hundred pursue it further. All of this translates to money.

So if knowing us really well is what will drive the social web towards success and unimagined profits, what better motivation exists for social networks to want to make us share more about ourselves and our lives?

II

The human mind adapts and manipulates in equal measure. It is inherently biased in all its observations. The incentive-and-rewards system designed by the social web plays the mind slyly and carefully: it makes us want to share by tapping into our social instinct and it rewards us with responses from others, highlighting those responses for no obvious reason throughout our day. A little notification here and there that makes us rush to our phones is really a pitiable reward system in play, wholly designed to benefit the platform serving those notifications. Added to all this, notifications create a sense of urgency.

This also comes down to a numbers game. The only real way to ‘grow’ on social media—whatever that means—is to participate with consistency. ‘Likes’ and ‘Favourites’ and other such statistics do not mean a lot to everyone. Those to whom these numbers do make a difference are already within the platform and will work on staying there. It is those to whom such numbers are not of consequence that platforms have to work on retaining. And this is done using equivocations like ‘engagement’: the number of times people saw your updates, the number of people wanting to follow you, the number of people actually following you and so on. All of this comes down to cold, hard numbers. How many people would still keep sharing on social media if they knew nobody would ever see their posts?


Add this all up and you find yourself in a system carefully designed to pull you in and keep you in and make it as hard as possible to leave. No doubt a person can simply choose to quit and stay that way (I have done it myself) but just how representative of the population is this practice? The average social media user has anonymous private accounts and notifications turned on for all platforms, connects to the web as often as possible, and has a constant fear of, one, missing out and, two, needing some entertainment to stay engaged.

To want to remain engaged in a world where attention deficits are increasing might seem counterintuitive but it is not: the engagement that social media provides is by nature designed for attention deficiency. Everything is bite-sized so you can spend five minutes on something before heading out to the next entertainer with a faux sense of having gained ‘new information’ along the way. By contrast reading a book demands days of continual attention.

What is this attention deficit doing, though? Why does it matter and why is it important? The shrinking attention span plaguing much of today’s population is meant to take your mind off something much more sinister. Users are slowly being numbed to the fact that their presence on the social web is not one where they exist but rather one where they are constantly and deliberately curating a version of themselves to showcase before the world. Every time someone looks at the social web they are looking not at themselves but at their reflection in a spring. The more time passes like this, the smaller the chance of a user recognising this distinction. The social web becomes a modern-day retelling of Narcissus’s myth.

III

If all this repeatedly reads like a dramatic cry against social media the reader will have to make a conscious effort to keep in mind that it is nothing of the sort. As said already, social media neither breeds narcissism nor makes a narcissist where none existed. Tools are rarely to blame especially when they have several valid positive uses too. The fault does not lie in social media at all; the fault lies in us.

How often have we seen someone enjoying a little moment in their day only to be swept away by the urge to share it online? Speaking as someone who almost never shares everyday moments on social networks, there is a surprisingly vivid mask that gets drawn across people’s faces—perhaps unintentionally, perhaps by habit—as they morph from themselves, who were enjoying the moment, into their virtual selves, who are ready to pose and photograph (or realise in some other fashion) that visualisation of events which they would like to put up online.

Everybody dresses real life up and it is not a new practice. Photographs taken way back in the 60s, too, suggest people loved setting things up before making a picture. The difference is, back then this was an occasional activity; now we dress up an event so often that we have slowly begun to lose our sense of reality itself, let alone the event. Pete Nicholson puts it quite eloquently—

I find myself enjoying a fun or interesting or strange thing and then, at a certain point, as if some invisible switch were flipped, I suddenly notice myself wondering the best way to communicate the moment to other people, typically via something you can do on a smartphone. Invariably, when I attempt to return to the moment, it’s gone.

The trick is to balance things. One could share later rather than now, so nobody focusses on making pictures to share while the moment is underway—we just need to make pictures if we feel like it and later share pictures if we have any. And if we have none, there is nothing to share.

However it is not just pictures that are the culprits. The social web has given everybody a soapbox to shout from. The trouble is that no-one is listening. It becomes important then to realise that not every thought we have needs to be broadcast on social media. Some can simply be kept to ourselves.

What we need today are what the journalist and author William Powers calls ‘Walden zones’, places around our home and work where devices are banned. He also points out the clever idea of having long moments of disconnection between successive uses of social media. (You can read about William Powers’s book Hamlet’s blackberry on my bookshelf.) This is key to ensuring we can keep our shiny new toys—perhaps even that we have earned them—without experiencing any adverse effects on our lives.

But there is always the elephant in the room: Do we have it in us to develop such discipline? Do we have it in us to set up Walden zones and stand by them? Do we have it in us to keep track of our connected lives and rein ourselves in from time to time? After all, this is something Narcissus could not do. For my own part this has not been hard, which is what gives me hope that anyone else can do it too if they, firstly, acknowledge the issue and, secondly, make a sincere attempt—both of which are easier said than done. Ironically enough I have had some additional assistance of late from my iPhone which, with iOS 12, tells me how much I have used my phone each day and even compiles a weekly report. Like most graphs it is insightful and at times unusually helpful, and I have been making some progress on that front as well.

Our virtual selves, our faux reflections, ought not to subtly run our real lives. But that is exactly what they are doing today. Technology will only advance further in the coming decades, yet humanity is not about to disappear; human interactions are not about to be replaced, except to our own downfall; and if we proceed as we have been, our virtual selves will not stop trying to take us over any time soon.

We must all sleep more

In a world that celebrates mindless hustling, it is the sleep we give up that is costing us the most.

These days I often hear people bragging about how little they sleep. It is as though sleep is a bad thing and having as little of it as possible is clever. It will, perhaps, seem clever to people who sleep less precisely because of what sleeping less does to a person: in one phrase, it cripples our ability to think clearly. Sleeping less is neither a sign of good health nor an indicator of cleverness and success.

More importantly, it must be noted that I am, and have long been, of the opinion that sleeping is a waste of time because, if we could somehow go without sleeping, we could use that time productively for something else. But never have I argued that sleep is a bad thing and that we must limit it severely. There is a difference between something being unnecessary and something being a waste of time: if something is unnecessary, it is certainly a waste of time; but if something is a waste of time, it may yet turn out to be necessary.

The Atlantic published a wonderful article by James Hamblin on this topic a few months ago titled How to sleep. It drew on Dr Hamblin’s own experience working overtime as a resident doctor. He talks, at one point, about a military-led sleep deprivation experiment that carries a tone, and yielded results, similar to a number of other experiments of its type—

Around the (1960s) the U.S. military got interested in sleep-deprivation research: Could soldiers be trained to function in sustained warfare with very little sleep? The original studies seemed to say yes. But when the military put soldiers in a lab to make certain they stayed awake, performance suffered … (But) they couldn’t tell that they had a deficit.

“They would insist that they were fine,” said Dinges (David Dinges, chief of the division of sleep and chronobiology at the University of Pennsylvania), “but weren’t performing well at all, and the discrepancy was extreme.”

… In one study published in the journal Sleep, researchers kept people just slightly sleep deprived — allowing them only six hours to sleep each night — and watched the subjects’ performance on cognitive tests plummet. The crucial finding was that throughout their time in the study, the sixers thought they were functioning perfectly well.

Effective sleep habits, like many things, seem to come back to self-awareness.

It is the last sentence in that excerpt that, I think, sums up the entire article: often people who sleep less are under the false impression that they are perfectly alright while objective assessment clearly shows otherwise. This is likely what prompts people who look at sleeping less as an issue of prestige to claim that they are on the right track.

Getting seven hours of sleep in a twenty-four-hour day has been suggested by quite a few scientific studies as the sweet spot. However, it is the lower end of a range they advise; adults, as Dr E.J. Olson of the Mayo Clinic puts it, ideally need seven to nine hours of sleep. Other studies offer a broader range, going from six to ten hours per day.

While, on the one hand, insufficient sleep can have negative effects most aptly summed up as a ‘drop in performance’, sleeping too much is also a bad thing. A majority of studies tend to favour seven hours, with some going so far as to claim that nine hours is too much2. In any case, anything above ten hours is enough sleep to substantially increase the risk of depression, diabetes and heart disease, and generally lower a person’s lifespan.

Sleep requirements, therefore, are a tricky grey area, to say the least. Too little sleep can be bad, as can too much of it.

Einstein is known to have slept ten hours every night besides napping during the day for twenty minutes or so. Beethoven is said to have slept from ten to six, punctually. Ironically, these two are examples of normal humans in this case. There is a rare breed of so-called short sleepers who are capable of surviving on as little as four hours a day. Nikola Tesla and Leonardo da Vinci are both said to have slept for only a handful of hours each day. Notably, Benjamin Franklin, who famously said, ‘Early to bed and early to rise makes a man healthy, wealthy and wise’, slept six hours a day.

The idea of sleeping in short bursts, like Leonardo, is called polyphasic sleeping. Such people practise sleeping several times a day for anywhere from five to twenty minutes each time, totalling about four or five hours a day. The more commonly seen practice is that of sleeping twice a day, once at night and again in the afternoon, which is called biphasic sleeping. The United States Air Force Research Laboratory itself states a requirement of eight hours of sleep per day for its troops to ‘maintain top-notch performance’.

One of my favourite composers is Mozart, whose sleep patterns have been well documented in his own letters and show a particularly interesting fluctuation3. He was twenty-one, unemployed, living with the Weber family and making money teaching others to play instruments when he wrote the following letter to his father:

I am writing this at eleven at night, because I have no other leisure time. We cannot very well rise before eight o’clock, for in our rooms (on the ground-floor) it is not light till half-past eight.

It is reasonable to assume that that routine would give him a good seven hours of sleep or more. However, five years later, in a letter to his sister, Mozart writes of a different routine that affords him only five hours of sleep a day:

At six o’clock in the morning I have my hair dressed, and have finished my toilet by seven o’clock. I write till nine. From nine to one I give lessons. I then dine, unless I am invited out, when dinner is usually at two o’clock, sometimes at three … I cannot rely on my evening writing, so it is my custom (especially when I come home early) to write for a time before going to bed. I often sit up writing till one, and rise again at six.

This would imply a five-hour sleep routine, but only because Mozart could not afford any more sleep; he never claims, at any point, that sleeping less helped him compose. Unlike Mozart’s, though, most claims about Leonardo or Tesla can hardly be confirmed accurately, with most accounts likely having been exaggerated over the years, which makes the sleep habits of successful people living in our own times, arguably, a better topic of discussion.

Working more is what makes people successful, not sleeping less. And saying that you sleep less by no means suggests that you work more when you are awake.

Forbes reported on this a little over a year ago in an article by Alice Walton titled ‘The sleep habits of highly successful people’. The article talks about how much sleep a person needs and examines the sleep schedules of successful CEOs, former presidents and the like, citing an infographic4 that, interestingly, claims that Microsoft co-founder and former CEO Bill Gates sleeps seven hours a day, which happens to be the same as Twitter’s Jack Dorsey, Apple’s Tim Cook, and Amazon’s Jeff Bezos.

Arguing that clever people sleep less is clearly unfounded5. In fact, working more is what makes people successful, not sleeping less. And saying that you sleep less by no means suggests that you work more when you are awake. Perhaps the reason why some (not all, not even most) famous, successful people ended up sleeping less is because they were working a lot, not because sleeping less was a great thing.

It does, then, really come down to what an individual needs: some people can do with less sleep, about five to six hours a day, while others need seven to eight hours, whether in one go or two. And sleep itself is not related to success in any way. Indeed, if you think it is, you should probably sleep more, because insufficient sleep might be disturbing your reasoning.


  1. This article was prompted by two things: my own assessment of my sleep pattern (on average, I sleep six-and-a-half hours a day, from 22:30 to 05:00) and an increase in the number of people I hear claiming to be clever as a result of hardly sleeping at all. ↩︎

  2. This is, possibly, an effect of there not being sufficient research on sleep and its effects. ↩︎

  3. I recommend reading Maria Popova’s article ‘Mozart’s daily routine’ on her wonderful website, Brain Pickings, for more commentary about Mozart’s day. ↩︎

  4. The exact source of the data used in the infographic is uncertain, so I would suggest taking it with a grain of salt. ↩︎

  5. Claiming that people who sleep less are clever is even worse. ↩︎

Quiet reflections on unplugging from society

A seaside getaway with no cellular network teaches one what it means to live life to the fullest.

It seems as though I needed time off work more than I realised. I spent almost all of last week in a deserted beach cabin in a (fairly) remote corner of the world, without a dot of cellular connectivity. Our urban lives are so centred around technology that most of us are incapable of consciously disconnecting every now and then. My week-long getaway, though, left me no choice and, after the fact, I think it was exactly what I needed.

A private beach spells bliss.

With me was my family, so it was a doubly exciting period of vacation, however brief. Most of the week was spent idly, which is about as far as a day can get from our work lives. We played cards, drank a little too much lemonade and sat on rocks to get sprayed by the tide.

A new moon spells a rising tide

I like crabs because they are the second least weird sea creatures, the least weird being the dolphin/whale family. We had a private beach near our cabin, with a cook and a housekeeper to tend to things while we were off enjoying the sun. Although I expected the place to be hot, sticky and humid, it was actually quite pleasant.

Early morning, apparently, is crab time. You could count at least a hundred tiny crabs on the shore and about ten to fifteen big ones without fail. By eight o’clock they were gone, perhaps buried deep beneath the sand. The midnight tide would clean the shore and brush it flat every day, waiting for our footsteps to be laid in the sand.

There were rocks near the shoal where you could sit and watch the waves, or simply slide down from them if you wanted to get wet. This would have been where I spent every waking minute had the sun not been so unforgiving.

A cabin by the beach, up on the rocks.

One of the nights was a new moon and the tides rose considerably. A good fifty feet more of the sand bank was submerged that night than usual. Gone was our sand art from past days, along with our footsteps. There was now a small pond on the other side of the beach, from which the tide probably could not recede entirely.

What I found interesting—indeed what I always find interesting—is looking at rocks and wondering what it was like when they were once submerged, a part of the seabed, a product of tectonic plate movements that left them on land, open to human admiration (and, sometimes, use). I stand by the rocks and wonder what secrets lie beneath, what undiscovered fossil holds the answers to our many questions, what new information lies here, untouched, because nobody has bothered to explore the region.

On second thought, perhaps it is good nobody has explored it. That is why the place is safe and sound.

The first day, I wonder what is happening back in civilisation, what I would have heard or seen had I been connected. But the soothing whispers of the evening tide draw my attention away from all that, straight to the present. By day two I find myself rooted in the present effortlessly. It is blissful; I do not even think of the network I am missing. My phone says there is no service, and I smile and put it back into my pocket as I lie down on a rock, my Kindle in one hand, and begin to read.

A hike up the mountains

We decided to spend half-a-day away from the shore and hike up a nearby mountain to get a bird’s eye view of the sea. Having recently recovered (somewhat) from gout, I find it hard to climb, but I do so anyway, knowing it is going to be worth it.

A hike up the mountain overlooking the sea.

There is a local tribe here that uses parts of these hills for farming during the monsoons. Stone pillars, walls and terraced arrangements left over from that work are reminiscent of stone age relics. A walk through them, with the mountains on one side and the sea on the other, gives us a sense of place.

We, as humans, are part of this world. We do not own it, and we do not have the right to exploit it, but we can make full use of it as harmoniously as we can; we need to protect ourselves, survive, and do our best not to harm the other species living with us. These thoughts strike us as we trek, not so much as words as shards of thought. They simply appear, and we realise our place on this pale blue dot.

Coming back down to earth, we reach the peak, a semi-flat, bushy area overlooking the sea, a fishing village and long stretches of beach only fishermen use. It is a beautiful sight.

To argue that we must give up our urban lives to live in a forest is absurd. All we have to do is ensure that our way of living does not damage the environment.

Also on the peak is a lone tree, curious because it seems to have been placed there: for one, there are no such trees around; for another, there is a stony parapet wall encircling it. We later found out that early British explorers planted the tree there as a marker of sorts.

On the way back down we encounter local tribesmen. Their contact with the outside world is not absent, but it is not overwhelming either. They wait courteously while we pass. As city-dwellers, we pick our way carefully over the ground; they walk with a nimbleness that can only come from knowing the terrain inside out. They pause briefly to express their gratitude to us for keeping the environment safe; others destroy it, they tell us.

This made me happy, but it also dawned on me that these natives are, in a sense, more human than we are: they are more in contact with the earth. We live on it; they live among it. I do not know if one is bad and the other good, and to argue that we must give up our urban lives to live in a forest is absurd. These are two ways of living, neither bad. All we have to do is ensure that our way of living does not damage the environment.

Dusk by the sea.

More than halfway through my vacation, I have stopped thinking about how I have not electronically connected to the world for days. Good, I remember saying to myself at one point during the last few days. I would do this all over again if I had to. In hindsight, perhaps not permanently, but a week every now and then is a wonderful idea.

Reconnect cautiously

The refreshing taste of unplugging left me in an interesting place. I could easily have chosen to remain that way, but history has taught us that opposing something new (rather than understanding it and choosing to let it into our lives in a restricted manner) rarely leaves us victorious. The other option, of course, was to use this excellent opportunity to reconnect only when I absolutely had to.

The human mind is an interesting thing. Within two days I had almost completely reconnected. It was clear to me that, regardless of whether I needed to or not, I had found opportunities to connect and had taken them. Perhaps it was simply part of everyday life and work, but that does not mean I gained nothing from the experience.

I prefer not to start lecturing on the joys of missing out1, but what I have realised is that, given an opportunity, my first instinct now is no longer to connect but to go analogue. It seems like an excellent thing to me that unplugging for just a week can bring us closer to our analogue world, not only during that week but also after it.

Seaside showers.

But this is not the only effect my getaway has had. It has also given me a new perspective on resting. I have long been of the opinion that people who claim to work all day and never rest are idiots, and that opinion has only been strengthened. Relaxing and rejuvenating oneself is simply clever. And if work is all that important to us, then resting to enable better work is a no-brainer.

The takeaway here is simple. It is not that one needs a week-long vacation every now and then in a remote corner of the world, with a cottage on some rocks, a private beach and a mountain to climb; though that is, when everything is said and done, what inspired me. Every benefit this travel afforded me was something I could have gained without ever travelling: yes, there was the lovely sea, the beautiful forest, the adventurous climb, the crabs, the tides and the white noise of the waves hitting the shore, all of which require us to travel; but the other side of it, the disconnect, the relaxation, the idea of connecting with nature, of introspection, thinking and living in the moment, all of these can be had right in our own homes, in the centre of the most bustling city on earth.

It reminds me of The Alchemist2 in some capacity: sometimes, we have to look no further than our own backyards; but it pays to dream and travel the world anyway.


  1. I have done that already as part of a year-long experiment. ↩︎

  2. I strongly recommend that you read Paulo Coelho’s ‘The Alchemist’, which, for all the right reasons (and its life lessons), is an international bestseller. ↩︎

Information overload and an overly social web

A year-long social media experiment on the little things that can make our lives fuller

That the internet is an integral part of our lives is no surprise: it is simply a step up in the kind of information, and the ease of access to it, that was available in the last century through bookstores and well-catalogued libraries. Why social media and social networks have become an integral part of our lives, though, is a more pressing issue and a question most people are afraid to ask themselves.

A little over one year ago I asked myself this question and wanted to find out the answer. Nobody had one that was satisfying enough for me and, nearly fourteen months later, the answer I have is fair enough on a personal level but still somewhat hazy when generalised. What I found out along the way, and the potential long-term benefit it could have, was certainly worth the time I spent on it. I first wrote about this in August of 2015, asking myself — and everyone else by extension — how many social networks we are on and why. This was probably not a knee-jerk thought, though, and could, on some subconscious level, have been prompted by the fact that, at the time, I was reading William Powers’s book, Hamlet’s Blackberry. Indeed my previous two quotations scribbled on Marginalia were from that book.

Note that we use the terms ‘social networks’, ‘social web’ and ‘social media’ interchangeably. They all refer to that part of the web which encourages you to create a profile; share personal and general information, your work or life or travel and so on; connect with known and unknown people; and which enables you to stay connected all day long if you choose to. It is best we keep the definitions flexible, lest we miss the forest for the trees.

The structure of this article is worth paying attention to, owing to its size: we begin with an overview of the experiment, followed by a quick rundown of the prevalent scientific thinking on this issue; then come some notes on ways, other than cutting back on social media, that can potentially help in everyday life; then the actual steps I took over the past year as part of this experiment; and finally some thoughts on how and why they work, as well as what I intend to do from this point on as a direct consequence. This last bit is particularly important because several people prefer either to jump right back into the social web or to refrain from it altogether like monks. One is an addict, the other a luddite. The purpose here is to find and carve out a much more agreeable middle ground.

Part one

This marked the beginning of a year-long experiment in which I aimed to explore this question alongside a host of similar ones. Specifically, would the reason why I got on social media — to spread my websites and photographic work — take a hit if I called it quits overnight? In what I consider a rather bold move, I decided to pull the plug on the social network where I had my largest group of followers and where my photography was the focus of almost everything I was sharing: Google+. To some, this might seem foolish or even daring, but to me it was simply part of an experiment, and if I was going to do this for a year, I might as well do it thoroughly.

The reason why my experiment was a year-long affair was that I had seen or heard of far too many people who ‘quit social media’ on impulse and then promptly got back to it a few days later. This, in my opinion, serves no purpose. If one is to feel the effects of any change, one will do well to give it time to show itself. The allure of social networks is said to be great, but it helped that I am an introvert and am rarely inclined to go on a sharing binge; in other words, not sharing would not really be a problem for me.

As important as it is to understand what my experiment was for, it is important to realise what it was not about. This was not an attack on social media. It was not an attempt to make a case against social media. It was certainly not to unearth the ill effects of social media (those closets have been emptied already, as we shall see in a moment). And it was not to force people to quit social media. The experiment was simply to see whether the presence of social media has stretched its arms beyond the purpose it was intended to serve, how that can be curbed and, consequently, what benefits this may yield, particularly in terms of ‘enriching’ our lives — a word far too often associated with quitting social media. Quitting some social networks was part of the process of understanding what this meant, but it was not so much about quitting social media as it was about learning to use it more intentionally. This is the core of my entire argument and possibly my favourite part of the outcome of this experiment.

Part of this experiment was to find out if limiting social media use would necessarily result in our returning to a more productive, efficient life.

What else prompted me to initiate such an experiment was the relatively new idea called the Fear of Missing Out (characteristically shortened by the internet to FoMO). Anil Dash came up with a counter-proposal that he called the Joy of Missing Out (JoMO), and both ideas caught on enough for the former to be recognised as a type of fear of regret. The idea that one may miss out on something scares people and, to an extent, rightly so: proverbs such as ‘opportunity knocks only once’ do not quite help the case. However, Kevin Systrom, a co-founder of Instagram, who agreed that his platform can be addictive, gave a sound reason why FoMO should not bother us: ‘We aren’t used to seeing the world as it happens. We as humans can only process so much data.’ My solution to FoMO is simple: you cannot help it, you cannot overcome it, so stop thinking about it as missing out at all. You will miss something; you are missing something right now. The point of life is not to cry over (or fear) what we might be missing, but to make the most of what we do have or are doing now. Or, better still, as Anil Dash explained his own experience, ‘I’d been mostly offline for more than a month… I wasn’t missing anything. I hadn’t realized that I was not only not in fear, but actually in a state of joy.’

An important part of this experiment, therefore, was to see if JoMO was really a substantial benefit of limiting social media use. Specifically, would such a limit result in a return to a more productive, efficient life in which we enjoy the present (which is, to some extent, what I expected), or will it actually be a source of newfound joy? While I understand that the spirit behind JoMO mainly stemmed from the fact that one would be just fine and does not have to be afraid of missing out on things, the promise of missing out also giving us joy was something I had my doubts over for the simple reason that if using social media is not resulting in pain, getting rid of it will probably not result in joy.

Lastly, it is not my intention to suggest that the social web is keeping us from living a productive, efficient or fuller life, or that it is preventing us from living ‘in the moment’. An analogy may be drawn with something more familiar and physical: walking a thousand metres down a road empty-handed is possible, and so is walking those thousand metres dragging a boat behind you, just in case you reach a shore someday. The latter is doable, but it demands unnecessary effort, and social media is something like that (I will expand on this as we go) in that its benefits, while not entirely absent, pale in comparison with the alternative means that exist of obtaining the same, or better, results.

Part two

The negative impacts social media can have have been researched quite well over the years. The proof is scientific, more so than most people would realise. Social media is, quite literally, like a drug: it ramps up dopamine production in the body. Dopamine is a neurotransmitter whose release is stimulated by reward cues and by peeking into small amounts of information — getting a taste for things, in a manner of speaking. The act of sharing on social media is often seen as a ‘reward’, which further prompts dopamine stimulation.

Further, as a whopping 68% of users admitted, the reason for sharing, almost outrageously, is not that people find the content interesting but that they want to be seen associated with certain information, to influence (or, as some would prefer to say, moderate) how others perceive them. I found this ridiculous at first but, curiously enough, it makes sense when you think about it: our clothes, gadgets, books and hobbies are all reflections of who we are, so why would the information we consume — and show the world we have consumed — not fall into the same category?

Besides dopamine, the same process works for oxytocin, the chemical released when people kiss, which shoots up by more than 10% when you send a tweet. Most surprisingly, this is the same increase people experience on their wedding day. Somewhat dangerously, this also increases our trust in people at the time, which explains why Facebook users are 43% more likely to trust internet users than people not on Facebook. (My own use of Facebook was culled more than five years ago, so it will not be part of this experiment.)

But this is not the end: it has been shown that as much as 40% of our conversations are about ourselves while offline, and nearly 80% online. And, although people share on social media in an attempt to showcase what they feel is the best version of themselves (or of their alter egos), the target of a considerable number of social media posts is not a group of people but one specific individual. A part of me even wants to believe this is a direct reference to veiled, passive-aggressive status updates that are designed to look like a general thought being shared but are quite obviously a rant directed at one person — whose identity is also obvious in quite a few cases.

It is not the fault of social media that people insist on using it to seek validation. The problem with social media is our inclination to overuse it.

Even ‘liking’ and ‘favouriting’ are done to appeal to the person who shared something, so one can remain in their good books. I understand this is not true for everyone or for every share, but the numbers are worrying nonetheless: 44% of users insist on this practice to ‘maintain relationships’. This is, once again, nothing new. It is like nodding in faux agreement at a party, either out of courtesy or because you are not in the mood to debate things just then. And people like being agreed with. But what it boils down to, in my opinion — besides the fact that, right from using to sharing, social media has traces of potential addiction written all over it — is that once a piece of information is shared, nearly 60% of people say they derive actual ‘happiness’ from others appreciating their shares, be it ‘liking’, ‘favouriting’, ‘plussing’ or whatever else. This, once again connected to dopamine, is our biggest reward.

An equally important factor for me — and one I had been thinking about for a few years myself — was how social networking influences our thoughts. A study conducted by The New York Times, compiled and reported by Hootsuite, shows that 85% of people process the contents of an article better after reading others’ comments on it, and based on those comments. This is troubling to say the least: our thoughts and reflections should be based on the article we read, not on what others think of it. It is more worrying still because comments that attack the author personally, without any factual foundation, are just as effective as any other comment in influencing how we perceive the writing itself. This has often been my biggest complaint — or even case — against social media: it dictates our likes and dislikes and preferences and tastes to such an extent that I often joke we are on our way to becoming completely ‘tasteless’.

Given how many warning signs the social web is already throwing at users, it might seem almost justifiable to put an end to it. However, I believe this would be a hasty decision, not thought through fully. It would be akin to blaming a tool just because people are misusing it. It is not the fault of social media that people insist on using it to seek validation. It is not the fault of social media that people associate it with a reward scenario and prompt dopamine and oxytocin release. Social networks have their benefits: they do help people connect, they do help get the word out, they do help share information, they do help people stay informed, they do help people entertain themselves, and they do help people get their work out there for the world to see and have a platform on which to voice their opinions. The problem with social media is not social media itself, but our inclination to overuse it. The solution lies in what I stated previously as the crux of my argument: intentional, goal-driven social media use.

Part three

Let us return our focus to the experiment now that we have something of a scientific justification to — at least — reconsider our use of the social web. Since embracing JoMO was partly what drove the experiment, I decided to make a small diversion from social networks: the one thing nagging me all day long was not my e-mail (see Inbox zero and Updates) but, in a similar vein, my pending reading list on Instapaper.

Over time I had accumulated so many articles that I wanted to read (and would read, in an ideal world) that the count was shooting up towards the high hundreds, while my read count was likely nowhere past the low hundreds. Interestingly, most articles I wanted to read I would read then and there, and those I did read off my Instapaper were rarely old ones, but ones I had saved only a few days earlier. This was unnecessary weight in my eyes: carrying around a constant reminder that there exists a long list of articles I will probably never get around to reading — and how many of those may be outdated by the time I do get to them?

My solution to this was nearly as radical as my move with Google+. I decided to get Instapaper off all my devices and stop saving articles to that service to read later. The important thing to realise, however, is that Instapaper itself was not the problem. The spirit behind the service was actually built around more targeted, less distracting browsing, where one could focus on what one was working on at the moment and yet not lose some interesting piece of writing one had come across.

In other words, I would continue to use the concept of ‘saving to read later’, except without Instapaper and with one added restriction: the read-later list must be emptied every week, whether I have read everything on it or not. I chose Safari’s inbuilt reading-list feature, lazily named ‘Reading List’. Not only does it sync across devices like Instapaper, it also keeps articles offline and ready. And it suited my new style of use: a week’s worth of articles saved for later, read and either saved to my Evernote Premium or cleared off my reading list; and every Sunday night the entire list gets cleared and readied for the following week — regardless of whether there are still unread articles on it. My other intention in choosing this Safari feature was to avoid using yet another app, but for those of you working cross-platform, the same limited approach with Instapaper or Pocket would probably work just fine, even if it is not quite as handy.

In the same vein, we would do well to avoid a barrage of news sources. Aggregators were supposed to address this problem, but they are not all that different from RSS — a point that will be addressed in greater detail soon. Intelligent aggregators still send along some poorly written, fact-free, logic-repellent writing, even though they grow better over time. And then there is the question of bias and worldview. One argument that can be made in support of intelligent news aggregators is that they might expose you to multiple perspectives. This is true; it is also true that they save time initially by simply sending news your way — albeit with a lot of badly written non-journalism and self-celebrating clickbait rid of any and all substance — and then learning your likes and dislikes as you go. However, this is exactly their problem: by learning your likes, these aggregators may, over time, only give you news that you like to hear, thereby skewing your views. That is not how news works.

RSS is an undoubtedly more straightforward, simpler, and distraction-free solution that guarantees you a regular stream of quality content like no other.

A better approach would be what I like to call ‘targeted reading’. Unsurprisingly, this is part of a purposeful web-browsing experience, which will be outlined towards the end of this essay. Targeted reading involves picking news sources you like, picking weblogs you like, picking magazines you like and so on, and then subscribing to them via good old RSS. This has several advantages. First, it reduces clutter, giving you updates structured by source or by manually chosen genres. Second, it serves your news from sources you like and exposes you to the counter-arguments present in those sources (a libertarian paper, for example, will often address conservative views, thereby exposing you to both), unlike an aggregator, which might disregard articles simply because opposing views are discussed in them, under the wrong impression that you may not like them at all — I am sure the algorithms are not quite so simple, but I am not entirely sure of their current state either; improvements may have been, or will probably eventually be, made in this regard.

The other, equally important reason why I encourage RSS is weblogs. Besides news, magazine and dedicated genre websites, a majority of good, worthwhile content on the web comes from independent writers with active weblogs. Some are extremely personal in nature and may not appeal to everyone, but many (including this one) are more focussed on opinion pieces, most of which are well written and worth reading. If you were to invest in the social web, carefully chosen blogs are your best bet. It takes time and effort to maintain and run a good weblog — unlike a social media profile, which is specifically designed to let people share as much as possible in as little time as possible (read: with as little thought as possible) — which translates to a guarantee of good content on weblogs, even if not a regular stream of it. I would take a well-written, irregular blog any day over a fast-paced, mediocre social media profile of random articles interspersed with video game scores. At the end of the day, though, RSS is an undoubtedly more straightforward, simpler and distraction-free solution that guarantees you a regular stream of quality content like no other.

Besides RSS, another interesting productivity-driven move I attempted for a few months was to limit all news/RSS apps to my tablet and use my smartphone solely as a daily assistant. I like how this segregates the task each device does to some extent, but I found myself with bits of leisure time, or waiting in queues and antechambers, with nothing but my phone to keep me occupied, which was a pain when there was nothing worthwhile on it for me to read through. (I rarely browse through articles, preferring instead to read them in full or not read them at all.) Needless to say, my trusty RSS app, Reeder, was back on my iPhone pretty quickly. Magazines, though, are confined to iPad for aesthetic reasons, and novels to my Kindle Voyage in a bid for an experience closer to paper books. And I still love and read paper books of course.

Speaking of apps, an interesting approach I found was to ask myself, before installing an app, whether it did something a first-party app already on my phone did not or, alternatively, whether it did the same thing better — not differently, but better. If so, the app is welcome; if not, I leave it aside and ponder over it — you will be surprised how quickly you end up changing your mind about an app you were sure you ‘needed’. This is not something I strongly advocate, because everyone has their own style of use and their own use cases, but if you do have the time and the interest, try out a smartphone experience as close to stock as possible and your device will become less of a distraction as time goes by.

Part four

It was sometime after I first asked about the many social networks we are on — perhaps even rhetorically, now that I think of it — that I decided to formally start this experiment and quit Google+, which was my favourite social network at the time; Ello, another I really liked and found active and welcoming; and a couple of other profiles online (see Cold turkey). Back then, this is what I had to say about Twitter, which was, interestingly, one of the ‘big’ social networks I had chosen not to quit: ‘I picked Twitter not only because it is the fastest to update and interact on but also because the ratio of the number of useful interactions/new networking opportunities I’ve had on Twitter to the time spent updating it is clearly higher than other networks.’

This was an interesting assessment and, to me — a big fan of one-liners — it remains true to date. I think this is of particular importance because, unlike the myriad cases we have seen over the past year of people quitting Twitter for one grandiose reason or another, I had no such precursor to my decision to quit. It was simply a conscious choice I made because I thought it would be beneficial to me in enough ways to make the whole idea appealing.

A year later I had cut down on everything but Twitter, Flickr and Instagram — and I did not miss a single one of the others. I remained on Flickr because that is where my photography resides, on Instagram because that is where I share my work as a secondary platform after Flickr, and on Twitter because it is succinct and to the point and I have never been one to follow people out of obligation or in return for gaining a follower. This has probably largely shaped my experience of Twitter, which has been calm, sane and informative, unlike the raucous, short-tempered, racist, vulgar network so many make it out to be. (See, particularly, how Jonathan Weisman, a deputy editor at the Times, quit Twitter over complaints of harassment and hate speech.) This is likely either because I am not well known enough to attract as much attention as Mr Weisman or because I am more picky about whom I follow and bother to respond to, but to each his own: Twitter has been fairly good to me.

The trouble, though, came sometime in June this year, when a study showed that 60% of people do not even read the articles they share online. Whole trends in the online world are created this way. This is something that had popped up in my mind once too often, but it was interesting — and disheartening — to see it put into numbers. So a new question arises: how many such people do you follow? Even if they all did it just once, you have a few hundred links to supposedly interesting articles shared blindly. How many of these suggestions must you trust? Couple this with the fact that a lot of people share things only to improve the way others perceive them and it becomes all too clear that they may be sharing links to articles with interesting titles (or, worse still, clickbait), all unknowingly, in an attempt to look good. Twitter just went from a quick-moving highway of information and one-liners to a cauldron of steaming hot, trendy mess.

Social networks are the opposite of fora: the fact that they are not driven by specific content means they can keep you for longer than you would like, by serving you a constant stream of content dynamically tailored to your liking.

For now I have eased away from Twitter: I rarely tweet, but I do check once or twice a week for messages and mentions, since it is still the quickest way for many to contact me and a lot of people do send me direct messages to discuss an article or two. (Twitter is quick, but e-mail is a more reliable means.) This meant I had to move my Twitter list members from my list of secondary news sources (those interesting enough to follow, but not important enough to subscribe to via RSS) into a dedicated folder in Reeder, where they now reside. This leaves me on two networks, Flickr and Instagram, both meant for my photographic work and nothing else, both extremely targeted, which makes their use that much more justified.

That is the problem with social networks: they are not targeted, they serve no niche, and they are classified, at best, by how we share information rather than by what information we share. This is what makes online fora a much better way to network and communicate, if that is indeed our intention. A photography forum, however critical, biased, shallow or gear-driven it may be, still has a scope and an intention that keep it driven by specific content. You go there when you wish to talk about photography and leave when you are done. The same goes for science fora or literature clubs and such. They all have their scope within which all discussions lie; they are targeted, and you enter and leave at will when you wish to indulge in conversation about that topic.

Social networks thrive on just the opposite: the fact that they are not driven by specific content means they can keep you for longer than you would like, by serving you a constant stream of content dynamically tailored to your liking. Tired of a cat video? There is no need to leave yet, because they have another hilarious video with chimps and zebras that might interest you. Done with animals? How about this video about the top ten this and thats? The fact that social networks are so encouraging of genre-hopping has done two things to people: it has dramatically reduced the average reader’s attention span (I am convinced at least a thousand people closed their tab when they saw the length of this article) and it has made us slaves to a faux sense of gaining valuable information — in which both the key terms, ‘valuable’ and ‘information’, are suspect.

In March this year, The Guardian reported as much, with brief case studies of a handful of millennials who were turning off social media. More recently, less than two weeks ago, the Financial Times reported an Ofcom study showing that nearly half of internet users claimed they were ‘spending longer online than they wished’. Subtle signs exist all around that, with clever moderating of information and data, social networks can subconsciously affect our decision-making — including our decision to leave their network or stay back on their website for just another two harmless minutes. Just two.

Ezra Klein writes, in an excellent article in The Washington Post —

If I neglect my RSS feed today, the posts will still be there tomorrow. The same is true for the books I’m reading, the magazines piled on my nightstand, the tabs open in my browser, the long-form I’ve saved to Pocket, the e-mails I’ve filed away to read later, the think tank papers saved to my desktop, and pretty much every other sort of information I consume. The backlog nags at me, but I’ll get to it.

Twitter elicits a more poisonous information anxiety. It moves so fast that if I’m not continuously checking in, I completely lose track of the conversation — and it’s almost impossible to figure out what happened three hours ago, much less two days ago. I can’t save Twitter for later, and thus there’s always a pressure to check Twitter now. Twitter ends up taking more of my time than I’d like it to, as there’s a constant reason to check it rather than, say, reading a magazine article.

Addiction is the lifeblood of social networks. They have their uses, and Twitter, in particular, is a godsend for some journalists. It is the 21st-century equivalent of hanging out at the mill, keeping an ear on the local village gossip; except, back then, you went to the mill with a purpose, whereas with Twitter, listening to the gossip is itself the purpose. With Facebook, it is all about staying in touch with people, but with how many of them, really? Dunbar’s number is the maximum number of people one can maintain a stable and meaningful relationship with. At 150, it is a stretch for most introverts. The average number of Facebook ‘friends’ people have is 155. My own circle is considerably more modest and I keep in touch with them just fine via text messages or iMessage.

Part five

Steven Baxter, in the New Statesman, put the entire idea of the futility of sharing life updates into a dashing tagline: ‘Say hello to a world where you can just do stuff, without talking about the stuff you’re doing.’ He had just quit Twitter and his reasoning was clichéd but effective. Two years after him, even Simon Pegg gave the same reason as Mr Baxter — ‘it’s not you, it’s me’. This is of significance, once again, because these are not people who were motivated by incidents. They simply decided to call it a day.

However, there is a deeper reasoning behind this which, I strongly believe, extends to all of the social web. It makes people ‘tasteless’. Social media creeps into every inch of our lives without our express consent or knowledge and, before we know it, begins to subtly but certainly influence our every move as well as our mood. Knowing X did this and Y did that is enough to influence people’s daily chores and subconscious thoughts. By replication, repulsion or misguided inspiration, every other action is influenced by what people see and read and hear on the social web (a lot of it untrue), to the point where some obsessively log on to ‘check things out’ when what they are really hoping for is to be told what to like.

This can be hard to swallow at first, but it is more prevalent than we are probably ready to acknowledge. There has long been evidence of external stimuli, from media to news about friends of friends, affecting our own thoughts, lives and decisions. A shallower, more easily understandable example of this is advertising: over 90% of people visit a store following an advertisement or word-of-mouth (text?) marketing they came across online, 89% of people look up reviews online and 72% of people trust these reviews as much as a personal recommendation, and, surprisingly, 62% of people buy products on this basis alone.

Social media, on a subconscious level, influences our ways more than we would ever permit were its effects more obvious.

The comparison with advertisements is a more pronounced, more open case of precisely the same effect that everyday, seemingly harmless social network banter is having on us. That is not to say social networks are ‘harmful’. But they do influence our ways more than we would ever permit were their effects more obvious. (And for everyone who thinks they are not influenced — neither did the people in the above survey, who even claimed it was ridiculous to think that way. They were simply unaware of it, and that is what makes this border on being dangerous.)

Adam Brault makes this point by considering people’s updates as a sort of advertisement of their lives and thoughts, an idea that I found extremely interesting and unquestionably valid. Although he speaks of Twitter in particular, the core idea holds for absolutely any social network out there —

But the problem that occurs is that it can be a huge mental lease we’re signing when we invite a few hundred people into our Twitter life. To some degree, it is choosing to subject ourselves to thousands of ads throughout the day, but ones that come from trusted sources we care about, so they’re actually impactful.

Even if the people we know aren’t explicitly selling things (not that there’s anything wrong with that) or promoting their personal brand (there is everything wrong with that), we’re still choosing to accept their stream of one-second ads with some kind of message all day.

Part six

While my initial view of social media has not changed, in that I still believe a lot of it has no direct benefits that outweigh its negative side, I was probably wrong about Twitter. Granted, I have not had a bad experience there for the most part, but I think the urge to share a hundred-and-forty-character thought is just that, an urge. And when you let the urge sit, it simmers into a deeper, more meaningful thought. Or you come to realise how pointless it was in the first place, which is not all that bad when you think about it. Twitter is still great for one-liners, but not all thoughts are best put as one-liners. My reason to stay on, though, has more of a community aspect to it: if I want my writing to be read, shared and discussed, it is only fair that I too add something of value to the community in return.

My two-pronged approach to growing thoughts remains this website, where I write opinion pieces and factual essays on science, technology and society, and Marginalia, my Tumblr, where I share shorter pieces of thought. I like this demarcation for two reasons: I am more in control of what I have to say, and the moment I find all of this too cumbersome and worthless, I can call it a day. Moreover, so long as this website exists, as with anybody else’s, the effort is clearly visible, be it in design, maintenance or writing. With social media, this is wholly absent and all that counts is how active your profile is. And there is always the urge, which many have, to keep a profile active while really adding nothing of value.

All said and done, there are some avenues of the social web that I would not advocate against. For example, Slack, iMessage or (and this may come as a surprise to some who know me) WhatsApp. All three have their benefits — flexible, quick work communication or savings on SMS costs in case of iMessage and WhatsApp (which is especially huge in Germany) — and, perhaps more importantly, they do not require you to expend your time and thoughts on a profile or a stream of updated content and are therefore not as great a distraction or influence as traditional social networks.

As far as not getting lost in a browser chain with thirty-five tabs goes, a sort of luddite’s approach is to note down what you want to look up online and then dedicate an hour or two a day to following up on those notes. While this might at first seem counter to the point of an ever-accessible web, one must realise that being accessible was not the primary intention of the web; making information accessible was. What cannot be denied, though, is that this old-fashioned approach will go a long way in reining in your wandering habits on the web.

By now (for you — this happened about halfway into the experiment for me) it ought to be clear that the entire idea of pulling away from the clutches of the social web and attempting to usher in some of that JoMO was never simply about quitting social networks. There is a lot more to it, most of which revolves around cutting down distractions so that you can have a more purposeful, intent-driven experience on the web, which, by any measure, is the more beneficial experience. You could install selective website blockers on your browser or trust your own willpower — since not all of us are addicted to social media. But there is no doubt that putting in a conscious effort to regulate the time we spend on the social web can have a lasting positive impact on our lives.

Nick Bilton calls this ‘reclaiming our real lives from social media’ in his article in The New York Times, where he gives the example of Ernest Hemingway, explaining how, when he once came across a pocket of free time, he began to write what would become his memoir, A moveable feast. Had the writer been alive today, Mr Bilton points out, he would probably not have put pen to paper, preferring to pull out his smartphone and proceed to ‘waste an entire afternoon on social media’ instead. While this is possibly a dramatic example, it serves to make a point and makes it well. The question should never be about quitting social media, but rather about all the wonderful things we could do if we chose to better manage our time online and save lots of it for our own ‘moveable feasts’. After all, I have little doubt in my mind that if Hemingway were alive today, he would have really liked Twitter and would have exhibited his brilliance there in 140 characters (much to Mr Franzen’s chagrin — see part seven below) on a daily basis.

This is where my call for more targeted browsing and targeted reading comes in: back in the days of libraries — I know they still exist today — people had access to more information than they could handle, much the same as today, but with a catch. In a library, your research had to be targeted; you had to look up a genre, a topic, a book, and then flip through and read several pages to accumulate the information you wanted. In the process, you learnt a few new, related things, and you learnt what you were looking for. Perhaps you came across a reference to another work that led you on a hunt, once again across topics and book titles, until you found it and then had to leaf through some more before you found what you were after. It took time and effort, and that imparted a proportionally large value to the knowledge you had just gained.

The internet is different. It takes almost no effort to find what you are looking for, and going from one source to another does not send you on a hunt; a click does the job for you. Everything is a click or two away. And the thoughtful appreciation, like pausing to blow on a cup of hot coffee, that came with a few hours spent in a library is nowhere to be found on the web. Here you gulp in your drink so fast that you cannot tell what you drank. The value of a given piece of information has not lowered, but the fact that the internet affords us access to nearly all of the information on earth means, ironically, that the scale of this enterprise is simply lost on us and the value of any one piece is not as readily apparent. That leads to information overload, where we do not realise we have had enough until we have had too much and lost track of it all.

In providing us with infinitely more information than we can handle at any given time, the internet has become counterproductive. But the same argument could be made for a library as well, which means the onus is on us, as users, to be more conscious of how we use it, to take things in in a planned fashion, just as much as we want, knowing all along that the rest of the information is still out there and that we can come back for it any day, any time. The purpose, right now, is to remain focussed on what we are looking for and take just that much away. The internet, for better or worse, has come to be designed like a maze with the sole intention of keeping us in it. But we still control the off switch.

Part seven

At the end of my experiment, I have come to make a number of changes. It started with quitting Google+ and Ello; a brief exit from Instagram (prompted by the fact that I had temporarily got locked out of my account) and an eventual, infrequent return; increased use of LinkedIn, despite its horrid design, because I found it useful from a professional perspective (not to land a job, but to have informative discussions in professional groups); and an easing out of Twitter, which, like Google+, will automatically receive links to my latest posts while not carrying too many manual tweets, but where I will keep track of direct messages and mentions. I am still in two minds about Twitter, mostly because I did reap good benefits from it and have had a pleasant experience, contrary to several others, so it stays for now.

While I did not blindly cut myself off from all of the social web — that would be a dumb thing to do and a sign of not adapting to changing times or taking advantage of potentially beneficial solutions — the fact that I reduced my use of three-quarters of it has given me considerable free time and peace of mind. This was surprising because I was not a heavy user of any of these networks, but not having to update or keep up with them did leave me somewhat freer.

Some argue that social media is a bad thing. I beg to differ. The American writer Jonathan Franzen, for instance, blames Twitter because one cannot cite facts or build arguments in 140 characters. It is hard for me to take a man seriously when he hates so many things. Mr Franzen has spoken of his dislike for everything from Facebook to emoticons to ebooks to Salman Rushdie to Jeff Bezos (he once called Amazon’s boss one of the four horsemen of the apocalypse), giving me the impression that he is a staunch luddite unwilling to let the 20th century go. Besides, at the end of the day, if you go to Twitter for 140-character-long facts, you are missing the point by a wide margin. Twitter, to me, is more about directing people’s attention to things, be it with links or other media, and having crisp conversations. It is not for writing long-winded arguments backed with MLA-format citations. But, yes, if you are clever enough, you can make a sound argument in 140 characters. The point is, nobody gains anything by painting the social web as the Lex Luthor to our ‘Superselves’ and running away from it. Further, blaming social media for things like spreading false news says more about your inability to judge the trustworthiness of links than about any fault of the network itself. Like everything else, it comes down to what we make of it.

That said, I did alter my Twitter use slightly. I now group it under my news folder rather than my social media folder, and I browse it twice a day along with my news. I use Tweetbot, which makes lists more prominent than the official Twitter app does (where they are an unfortunate afterthought and advertisements are front and centre for whatever reason), and I use lists to browse updates for no more than five swipes, thereby keeping my use limited. I check up on direct messages, mentions and the more ‘social’ tweets from my following list the same way. This balances what I take from the network with what I give back — it enriches the community and possibly helps Twitter get some funds in the process.

The key point is that social media is no longer an obstacle; it is now a small part of my day. This does not work in favour of advertisers, but that does not bother me. The simplest approach would be to schedule social media use like you would schedule any other activity. Perhaps not rigorously, but something as simple as avoiding social media first thing after you get up or last thing before you go to bed, using it for no more than a few minutes at a time, and leaving at least two to three hours between consecutive uses should make a considerable and visible difference. A lot of my argument may appear to focus on Twitter, but that is simply because I am active there and am not on Facebook; since none of my arguments depend on Twitter’s character count or on usage patterns specific to it, they may well be extended to Facebook, Google+, Ello and others. One might also be prompted to quit simply because one has had enough, an ‘it’s not you, it’s me’ of sorts, and that is fine.

Like any tool, the social web comes down to how you use it. I will remain on Twitter, but I will stay dormant for the most part. My activities will be based around Marginalia and my main weblog, from where links to whatever I write will be shared automatically to Google+ and Twitter as they have been for years now. I will interact with anyone who interacts with me, but I will stop feeding these social networks any more than necessary and stop letting myself be mined for data in return.

My suggestion to whoever cares to listen would be to first identify the problems in question. This is a five-fold issue: the web is shrinking our attention span, disturbing our thought process, handing us more information than we can handle, exacerbating our Fear of Missing Out, and keeping us hooked on it. We can preserve our attention span through the targeted browsing described above; we can keep our thought process uninterrupted by scheduling our social media use; we can counter information overload by not keeping ‘read later’ backlogs and, hand in hand, fight FoMO by facing the fact that we will miss out on many things, most certainly on information, and that this should not hold us back by any means; and, lastly, we can remain ‘unhooked’ by taking a vacation from social media.

Some people use August as a month to keep off the social web. Other efforts, like the Dutch initiative 99 days of freedom, came up specifically in response to carelessness on the part of social networks, prompting people to imagine a life without them and then live that life, as part of a joint experiment with people around the world, for 99 days or forever. There are calls to unplug, disconnect and smell the air around you.

My own stance is that one need not abandon social media outright to compensate for one’s lack of self-control; the middle ground is better: whether you join an experiment, take a vacation or do anything else, make sure you manage your time on the social web better once you get back on it. That is one of two things my experiment taught me. If, on the other hand, your reason to quit is simply a choice, that you have experienced all you wanted to and now want to call it a day, then by all means quit. That is the second thing my experiment taught me: social media will go on, evolve and die like everything else, and it will always have users and quitters; make it a small enough part of your day that quitting makes little difference. Use social media as a platform to voice your thoughts, not as a representation of yourself. If you want to invest your efforts in something that will personally represent you, and in something that will last, forget your social media profile. Start a blog instead.