Søren Kierkegaard, one of the fathers of existentialism, argued that it is the act of making choices that brings meaning to our lives; that through making choices, we live authentically, forming our own opinions, rather than being guided by the opinions of others or society as a whole. For Kierkegaard, understanding the meaning of our existence comes through true experiences when we make choices of our own, not following those of others.
What would Kierkegaard, who died in Copenhagen in 1855, make of the monolith that is Facebook?
The social-media company has announced that it will change its news-feed algorithm to de-emphasize content that is passively consumed and to prioritize content that Facebook predicts will be meaningful. Kierkegaard might agree that reducing passive experiences could increase meaning. But that would only be true if the remaining experiences were active and authentic.
One might argue that this filtering on our behalf is essential in a modern-day existence. Even in 1846, Kierkegaard argued that the pursuit of knowledge was distracting people from finding meaning, writing “people in our time, because of so much knowledge, have forgotten what it means to exist.” He argued that when presented with unlimited choices, we face a dizzying anxiety. The seemingly infinite opportunity to seek knowledge through the internet might seem so overwhelming as to require filtering.
Indeed, by filtering our experiences and limiting our choices, Facebook may be saving us from an existential crisis or angst. As Kierkegaard wrote:
Standing on a cliff, a sense of disorientation and confusion clouds you. Not only are you afraid of falling, you also fear succumbing to the impulse of throwing yourself off. Nothing is holding you back. Dread, anxiety and anguish rise to the surface.
Perhaps Facebook’s filtering of experiences saves us from this anxiety and anguish as a parent saves a child. That would be true if it were possible for an algorithm to understand each of us well enough to help us make choices that we would have made on our own, to predict our choices well enough to be able to present a selection of choices that discards those we wouldn’t make. Perhaps then. But not yet.
Despite the vast quantity of information Facebook has about us, it is limited primarily to our activity on Facebook and, secondarily, to the rest of our online activity that it tracks. Facebook can try to quantify meaning by our shares, comments, and emojis as well as how we cruise around its platform and the internet. But that is the limit of its data.
Facebook doesn’t know if we have a meaningful conversation offline about something we read online. It doesn’t know if we meditate to decrease our anxiety about a particular news story. Or if we ponder a question from a friend on a long walk. The only thing Facebook can attempt to predict is whether we will interact with content on its platform. Without understanding the rest of our lives, it isn’t possible for Facebook to know what will be meaningful.
The core philosophical issue with Facebook’s algorithmic change is the conundrum that the very act of choosing meaningful content for us means that the consumption of that content cannot be meaningful. By filtering our experiences, Facebook removes our agency to choose. And by removing our choice, it eliminates our ability to live authentically. An inauthentic life has no meaning.