In Ancient Greece there once lived a young boy whose handsomeness was dazzling. He was, however, blissfully unaware of it. At some point in his life a young nymph came to him and expressed her love, but the young lad dismissed her unabashedly and went about his day’s work. This apparently displeased the Greek gods, and they decided to teach the fellow a lesson: he had lived in ignorance long enough, and it was time he realised his own handsomeness. That evening found him at a spring, where he happened to bend down and catch a glimpse of himself in the water. For the first time ever he saw his own reflection and was stunned; so stunned, in fact, that he fell in love with it and began to pine for this ‘other person’. Like the nymph, he would never win the heart of his beloved; unlike the nymph, our young boy would go on to die for it. The boy’s name was Narcissus.
There are far too many inconsistencies in this tale of Narcissus, not the least of which is whether it is true at all. Did such a boy exist? Did the Greek gods have nothing better to do than meddle in the love story of a random kid? Was this just an old wives’ tale designed to make a point about what is socially acceptable? None of this prevented Freud from borrowing Narcissus’s name for a disorder most of us are familiar with today: narcissism, a vain, exaggerated sense of one’s own importance, often paired with a helpless desire to be admired.
There is more to Narcissus than Freud’s usage betrays. The myth continues: a flower bloomed where Narcissus fell to his death. This is the narcissus, the plant that shares its name with the Greek lad and is known for its narcotic, numbing effect; the Greek narkē means numbness. It was Narcissus’s lack of realisation, his numbness, so to speak, that led to his death. And narcissism might just as well be interpreted as a numbness one feels towards one’s self, a numbness that causes one to inflate one’s own importance, eventually losing track of who one is and perhaps even beginning to desire to be someone else.
I admit I am perhaps letting myself wander at this point; the last thing I want is to get into a losing battle with a psychologist. If I am, in fact, wandering, rest assured that it is with good intention. Today’s world is slowly redesigning itself to normalise a degree of narcissism that would have been frowned upon only decades ago. And social media has played no small part in bringing about this change.
None of this is to claim that social media breeds narcissism or that it makes a narcissist where none existed. But it does actively bring to the surface that bit of narcissism that lies dormant in us all, whether we like to admit it or not, and it then normalises it, because social networks are designed to feed on it. To blame this entirely on the social web would be wrong too: it does help us in other ways, after all. The question is whether the tradeoff is worth it, and it rarely is.
The social web is built to enable the transfer of information. But information can be transferred only so long as someone is out there seeking it, and someone will only seek it when they believe, on some level, that they are likely to find it there. That is to say, the model atop which the social web is built is to see what information someone is looking for and to place it before them. But, in characteristic fashion, modern technology has gone one step further: it now attempts to understand seekers well enough to be prepared with what they are likely to seek. Going further still, having understood what someone might be interested in, the social web simply shoves that in their face—targeted advertising—in the hope that at least a handful in a crowd of a hundred pursue it further. All of this translates to money.
So if knowing us really well is what will drive the social web towards success and unimagined profits, what better motivation exists for social networks to want to make us share more about ourselves and our lives?
The human mind adapts and manipulates in equal measure, and it is inherently biased in all its observations. The incentive-and-reward system designed by the social web plays the mind slyly and carefully: it makes us want to share by tapping into our social instinct, and it rewards us with responses from others, surfacing them throughout our day for no obvious reason. A little notification here and there that makes us rush to our phones is really a pitiable reward system at play, designed wholly to benefit the platform serving those notifications. On top of all this, notifications create a sense of urgency.
This also comes down to a numbers game. The only real way to ‘grow’ on social media—whatever that means—is to participate with consistency. ‘Likes’ and ‘Favourites’ and other such statistics do not mean a lot to everyone. Those to whom these numbers do make a difference are already within the platform and will work on staying there. It is those to whom such numbers are not of consequence that platforms have to work on retaining. And this is done using equivocations like ‘engagement’: the number of times people saw your updates, the number of people wanting to follow you, the number of people actually following you and so on. All of this comes down to cold, hard numbers. How many people would still keep sharing on social media if they knew nobody would ever see their posts?
Add all this up and you find yourself in a system carefully designed to pull you in, keep you in, and make it as hard as possible to leave. No doubt a person can simply choose to quit and stay that way (I have done it myself), but just how representative of the population is that practice? The average social media user maintains anonymous private accounts, keeps notifications turned on for every platform, connects to the web as often as possible, and carries a constant fear of, one, missing out, and, two, going without the entertainment that keeps them engaged.
To want to remain engaged in a world where attention deficits are increasing might seem counterintuitive, but it is not: the engagement that social media provides is designed, by its very nature, for attention deficiency. Everything is bite-sized, so you can spend five minutes on something before heading off to the next entertainer with a faux sense of having gained ‘new information’ along the way. By contrast, reading a book demands days of continual attention.
What is this attention deficit doing, though? Why does it matter? The shrinking attention span plaguing much of today’s population takes the mind off something much more sinister: users are slowly being numbed to the fact that their presence on the social web is not one where they exist, but one where they constantly and deliberately curate a version of themselves to showcase to the world. Every time someone looks at the social web they are looking not at themselves but at their reflection in a spring. The more time passes, the smaller the chance that a user recognises this distinction. The social web becomes a modern-day retelling of Narcissus’s myth.
If, at times, all this reads like a dramatic cry against social media, the reader will have to make a conscious effort to keep in mind that it is nothing of the sort. As I have said already, social media neither breeds narcissism nor makes a narcissist where none existed. Tools are rarely to blame, especially when they have several valid, positive uses too. The fault does not lie in social media at all; the fault lies in us.
How often have we seen someone enjoying a little moment in their day only to be swept away by the urge to share it online? Speaking as someone who almost never shares everyday moments on social networks, I notice a surprisingly vivid mask drawn across people’s faces—perhaps unintentionally, perhaps by habit—as they morph from themselves, who were enjoying the moment, into their virtual selves, who are ready to pose and photograph (or realise in some other fashion) the visualisation of events they would like to put up online.
Everybody dresses real life up, and it is not a new practice. Photographs from as far back as the 1960s suggest people loved setting things up before making a picture. The difference is that back then this was an occasional activity; now we dress events up so often that we have slowly begun to lose our sense of reality itself, let alone the event. Pete Nicholson puts it quite eloquently—
I find myself enjoying a fun or interesting or strange thing and then, at a certain point, as if some invisible switch were flipped, I suddenly notice myself wondering the best way to communicate the moment to other people, typically via something you can do on a smartphone. Invariably, when I attempt to return to the moment, it’s gone.
The trick is to balance things. One could share later rather than now, so that nobody focusses on making pictures to share while the moment is underway: we make pictures only if we feel like it, and later share them if we have any. And if we have none, there is nothing to share.
Pictures, however, are not the only culprits. The social web has given everybody a soapbox to shout from; the trouble is that no-one is listening. It becomes important, then, to realise that not every thought we have needs to be broadcast on social media. Some can simply be kept to ourselves.
What we need today are what the journalist and author William Powers calls ‘Walden zones’: places around our home and work where devices are banned. He also suggests long stretches of disconnection between successive uses of social media. (You can read about William Powers’s book, Hamlet’s BlackBerry, on my bookshelf.) This is key to ensuring we can keep our shiny new toys, perhaps even feel we have earned them, without any adverse effects on our lives.
But there is always the elephant in the room: do we have it in us to develop such discipline? Do we have it in us to set up Walden zones and stand by them? Do we have it in us to keep track of our connected lives and rein ourselves in from time to time? After all, this is something Narcissus could not do. For my own part this has not been hard, which is what gives me hope that anyone else can do it too if they, firstly, acknowledge the issue and, secondly, make a sincere attempt—both of which are easier said than done. Ironically enough, I have had some additional assistance of late from my iPhone, which, with iOS 12, tells me how much I use my phone each day and even compiles a weekly report. Like most graphs it is insightful and at times unusually helpful, and I have been making some progress on that front as well.
Our virtual selves, our faux reflections, ought not to run our real lives, however subtly. Yet that is exactly what they are doing today. Technology will only advance in the coming decades, but humanity is not about to disappear; human interaction is not about to be replaced, and were it to be, it would be to our own downfall; and if we proceed as we have been, our virtual selves will not stop trying to take us over any time soon.