My recent research increasingly focuses on how individuals can and do manipulate, or “game,” contemporary capitalism. It involves what social scientists call reflexivity and physicists call the observer effect.
Reflexivity can be summed up as the way our knowledge claims end up changing the world and the behaviors we seek to describe and explain.
Sometimes this is self-fulfilling. A knowledge claim—like “everyone is selfish,” for example—can change social institutions and social behaviors so that we actually end up acting more selfishly, thereby enacting the original claim.
Sometimes it has the opposite effect. A knowledge claim can change social institutions and behaviors altogether so that the original claim is no longer correct—for example, on hearing the claim that people are selfish, we might strive to be more altruistic.
Of particular interest to me is the political-economic understanding and treatment of our personal data in this reflexive context. We’re constantly changing as individuals as a result of learning about the world, so any data produced about us always changes us in one way or another, rendering that data inaccurate. So how can we trust personal data that, by definition, changes after it’s produced?
This ambiguity and fluidity of personal data is a central concern for data-driven tech firms and their business models. David Kirkpatrick’s 2010 book The Facebook Effect dedicates a whole chapter to exploring Mark Zuckerberg’s design philosophy that “you have one identity”—from now unto eternity—and anything else is evidence of a lack of personal integrity.
Facebook’s terms of service stipulate that users must do things like: “Use the same name that you use in everyday life” and “provide accurate information about yourself.” Why this emphasis? Well, it’s all about the monetization of our personal data. You cannot change or alter yourself in Facebook’s world view, largely because it would disrupt the data on which their algorithms are based.
Drilling for data
Treating personal data this way seems to underscore the oft-used metaphor that it is the “new oil.” Examples include a 2014 Wired article likening data to “an immensely valuable, untapped asset” and a 2017 cover of The Economist showing various tech companies drilling in a sea of data. Even though this metaphor has been widely criticized, it has come to define public debate about the future of personal data and the expectation that it’s the key resource of our increasingly data-driven economies.
Personal data are valued primarily because they can be turned into a private asset. This assetization process, however, has significant implications for the political and societal choices we get to make, and even for the futures we can imagine.
We don’t own our data
Personal data reflect our web searches, emails, tweets, where we walk, videos we watch, etc. We don’t own our personal data, though; whoever processes it ends up owning it, which in practice means giant monopolies like Google, Facebook, and Amazon.
But owning data is not enough, because the value of data derives from their use and flow. And this is how personal data are turned into assets: your personal data are owned as property, and the revenues from their use and flow are captured and capitalized by that owner.
As noted above, the use of personal data is reflexive—its owners recognize how their own actions and claims affect the world, and then have the capacity and desire to act upon this knowledge to change the world. With personal data, owners like Google, Facebook, and Amazon can claim that they will use it in specific ways, creating self-reinforcing expectations that prioritize future revenues.
They know that investors—and others—will act on those expectations (for example, by investing in them), and they know that they can produce self-reinforcing effects, like returns, if they can lock those investors, as well as governments and society, into pursuing those expectations.
In essence, they can try to game capitalism and lock us into the expectations that benefit them at the expense of everyone else.
The scourge of click farms
What are known as click farms are a good example of this gaming of capitalism.
A click farm is a room with shelves containing thousands of cellphones where workers are paid to imitate real internet users by clicking on promoted links, or viewing videos, or following social media accounts—basically, by producing “personal” data.
And while click farms might seem seedy, it’s worth remembering that even blue-chip companies like Facebook have been sued by advertisers for inflating the video viewing figures on their platforms.
More significantly, a 2018 article in New York Magazine pointed out that half of internet traffic is now made up of bots watching other bots clicking on adverts on bot-generated websites designed to convince yet more bots that all of this is creating some sort of value. And it does, weirdly, create value if you look at the capitalization of technology “unicorns.”
Are we the asset?
Here is the rub though: Is it the personal data that is the asset? Or is it actually us?
And this is where the really interesting consequences of treating personal data as a private asset arise for the future of capitalism.
If it’s us, the individuals, who are the assets, then our reflexive understanding of this and its implications—in other words, the awareness that everything we do can be mined to target us with adverts and exploit us through personalized pricing or micro-transactions—means that we can, do, and will knowingly alter the way we behave in a deliberate attempt to game capitalism too.
Just think of all those people who fake their social media selves.
On the one hand, we can see some of the consequences of our gaming of capitalism in the unfolding political scandals surrounding Facebook, dubbed the “techlash.” We know data can be gamed, leaving us with no idea what data to trust anymore.
On the other hand, we have no idea what ultimate consequences will flow from all the small lies we tell and retell thousands of times across multiple platforms.
Personal data is nothing like oil—it’s far more interesting and far more likely to change our future in ways we cannot imagine at present. And whatever the future holds, we need to start thinking about ways to govern this reflexive quality of personal data as it’s increasingly turned into the private assets that are meant to drive our futures.