In a global security assignment for her politics degree at Manchester Metropolitan University, Gemma Espart Calvill argues that surveillance capitalism is a major threat to our democracies.
It has been said many times that the digital revolution is unstoppable, uncontrollable, and that it moves too fast for governments to keep up with it. But is that the main problem? Perhaps the digital revolution is unstoppable, but then why would we want to stop it if it has the potential to bring great things for humanity? We live in a moment of increasing inequality in which the main threat to democracy and privacy rights comes from a still far too little-known economic and psycho-social order. Therefore, in order for us to deal with it at the scope and scale that is required, I want to consider three main questions: one, what is happening and how did this economic order come about; two, why should it be understood as a relevant threat to democracy and basic rights; and three, what is the states' real power, and what are their limitations, when regulating this new economic order?
I started sensing that something was not right with social media when I tried to erase my Facebook account and could not find a straightforward way to do so. I had more than one reason for leaving. I had developed a dependency on it that made me feel really uncomfortable, but the exposing nature of it also felt as if what I was doing was a kind of profiling and self-surveillance of my entire life. I say it was not straightforward because, from the moment I decided to delete the account, I had to go through a three-month period in which my account would first be deactivated before it was fully removed from Facebook's servers. That was the moment I found out that, to my surprise, Facebook knew better than me what I really wanted, because I might have needed a period to really think this through before my decision became absolutely unchangeable. However, that was also the moment my curiosity grew: how much do they know? And more importantly, how much do we, the users, know?
The digital age moves fast, indeed so fast that we seem to be seeing a part of the elephant instead of the whole. Shoshana Zuboff's 2019 book The Age of Surveillance Capitalism sheds light into that darkness with her theory of a new economic logic that has been entering our lives for at least a couple of decades, nearly undetected and hugely misinterpreted. We are already fully embedded in this logic, and whether we want to engage more or less is irrelevant because at this point the only real alternative to being a part of it is to reject social life altogether. Zuboff argues that this is only the beginning, there is much more to come, and if we do not start trying to make sense of it and regulating it, soon the inequality that this logic produces will damage the system and democracy in a way that will make them irreparable.
Zuboff introduces the term "surveillance capitalism" and provides eight different definitions, all of them equally revealing. As a brief summary of the concept, the first one will suffice: "a new economic order that claims [private] human experience as raw material for hidden commercial practices of extraction, prediction and sales". She argues that capitalism needs to create new markets in order to stay alive and survive its periodic and unavoidable crises. First, it was the appropriation of land to be owned and sold. Then, people's own time was sold as labour to become a service or product. This time it has been the commodification of leisure time, people's own private sphere. Jonathan Crary, in his 2013 book 24/7: Late Capitalism and the Ends of Sleep, argues that sleep is the only human experience that has not yet been rendered profitable by this logic.
This logic has become a "new economic order" in that it has permeated and commodified all aspects of our lives: at work, with our family and friends, and even in our most intimate moments. It has even entered our bodies, with smart watches that keep a record of our heart rate and other vital signs. I cannot but wonder, then, what privacy means nowadays. When you go to the doctor, all the information extracted from you and your body stays confidential, or it should do. So how is it that when it comes to technology, we believe that we have to enter a contract in which we sell our private information in exchange for using a product or service?
Zuboff argues that Google and other big tech companies’ idea that “if you have nothing to hide, then you have nothing to worry about” has made us believe that we have no other alternative than to expose our privacy in order to keep up with the digital advances. To that idea she replies: “If you have nothing to hide, then you are nothing.” This is an incredibly powerful insight into the essence of human beings: how important is privacy in order to preserve one’s own identity? Is it not the case that if I have nothing to hide then I cannot surprise other people? Do we want to live in a world in which everything is predictable, even what might constitute our own particular identity?
This is one of the crucial points investigated by the South Korean-German philosopher Byung-Chul Han. He argues that in this era of transparency, we lose what makes life unpredictable, spontaneous and different. What makes deep relationships possible is precisely the freedom and autonomy of everyone involved in the relationship, the mutual respect of accepting that there are things you might not know about the other, argue Seubert and Becker in the journal Philosophy and Social Criticism. But if that is completely lost through the devaluation of privacy, we are left without the possibility of trust, respect and recognition for the other and the community. We thus become a uniform, undifferentiated society, one that is incapable of forming deep relationships and that resembles a totalitarian state more than a community that values democracy and freedom above all. Moreover, the idea of full transparency is an absolute fallacy, because while we agree to give away our data in exchange for a service or product, these companies know astonishingly more about us than we know about them; the exposure therefore goes one way only, according to Zuboff.
That translates into the political sphere quite easily. An excess of information keeps populations confused, while reality and meaning become more about the quantity of information than its quality. Post-truth and fake news fit perfectly into this system, where big data and its correlations, causations and predictions become "truth". Han argues: "Big Data affords only extremely rudimentary knowledge, that is, correlations in which nothing is comprehended. Big Data lacks comprehension – it lacks the Concept – and thus it lacks Spirit." By Spirit he means conclusion – the act of reasoning that adds coherence and logic to what is just data. The constant addition of information that Big Data provides reflects a lack of conclusion, and thus a lack of concept and reason, says Han. And without reason we are completely blind to what happens in our community, why it happens and how to solve it. We become mere spectators of what happens around us, feeling incapable of any real political action.
In this era of spectators' democracy, politicians resemble a product more than a force for change, argues the philosopher LS Almendros, and the effect of this excess of information is that they can sell whatever reform they believe citizens want during a campaign, then go back on their word once elected. However, this is not a coincidence. It is a logical consequence of this new economic order, which has permeated and corrupted politics altogether. Political campaigners understood that the power of data is huge because of the predictive value it offers. If you know enough about your citizens, then you know who will be more susceptible to engaging with what type of information – and at what time of day.
This social engineering is completely emotional and irrational because, as I have argued, reason needs comprehension, and this logic is based on the manipulation of our most vulnerable moments. If politicians win their place in government by manipulating citizens' emotions, what can we expect from them when they are in office?
Furthermore, the power that global tech companies hold is exponentially greater than that of governments, which makes it even harder for effective measures to be put in place from the bottom up. These companies do not answer to any state; they have no nation. Facebook, Google and Amazon, to name the biggest, understood that the power is in the data, because with it you can manipulate whole populations. Data, therefore, is their main product. The other services they offer are all focused on increasing the quantity of data to sell to their advertising customers. They do not care whether you are sad or happy but whether that moment makes you more vulnerable to buying this or that, says Zuboff. They operate under the logic of capitalist accumulation, one that is far removed from Hayek's concept of the free market. That is because they do not give back to society. They extract our data, package it and sell it, but they employ few people, do not pay taxes on that product to any state, and only a few get to accumulate all that wealth, points out digital economy researcher Anya Skatova. This way of operating lies behind the increasing wealth inequality we see nowadays, and without a balanced distribution of wealth in society, democracy becomes impracticable and our sovereignty rights are threatened.
The state, which is supposed to protect its citizens from threats, has become completely entangled in this logic. This can clearly be seen in the case of China, a totalitarian capitalist society. Where capitalism has long been understood as the only possible ally of the free market and thus of freedom, we now see this idea trembling. Capitalism can easily partner with totalitarianism because it has no moral values: the logic of accumulation does not feed from or answer to ethics. It could be argued that we, the citizens, are part and parcel of this logic because we want this technology to make our lives easier, because we happily enter this self-surveillance system without questioning it. And while I agree that we should be questioning why we find this self-exposure so attractive, I believe that this so-called new "smart power" gets what it wants precisely because we buy into the manipulative argument that makes us the guilty ones.
Zuboff explains that she wrote her book precisely because the main obstacle to regulating these practices is a lack of awareness of what is really happening. Every time we click "yes" to accept the terms and conditions in order to access an app or a website, we believe that we are entering a fair contract in which we trade a few clicks on a website for a bit of targeted online advertising. However, this is not the whole picture. In 2010, Google admitted to "accidentally" collecting personal data from domestic wifi networks through its Street View cars. Google also admitted in February last year to having included a microphone in a home security device without listing it in the product specifications. It claimed that too was an accident. The narrative that "technology moves at a faster speed than politics and thus regulation" makes it possible for these things to happen. Tech companies' power over governments allows them to claim that these were only mistakes, and this in turn creates further problems when regulating data extraction and data ownership.
Laws such as GDPR are considered among the biggest achievements in giving ownership of data back to the individual. In practice, however, they translate into a bureaucratic nightmare in which people agree, again, without understanding the dimensions of the agreement. Furthermore, fighting over ownership rights to personal data stays embedded in the logic of accumulation and merely shifts the power over it, says Zuboff. Why not instead ask whether we need that data to exist in the first place? Do we need an Alexa to turn the lights of our house on and off? Is it even a healthy practice to self-monitor every single change occurring in our bodies?
I am not arguing that all the ways in which data can be used are toxic or unnecessary. On the contrary, I believe that data can be used for great things. For example, Google Maps is an incredibly useful tool. However, should it not be a public service? Users do not currently get charged a fee for using the service, but that does not make it free at all. They have to enable location mode, which allows the device to send all this data to Google for its commercialisation. All that power is not given to the state under a social contract in which the state protects its citizens in exchange for some individual freedom; instead it is given to private companies that can sell that information to whomever they want, without accountability to any government or citizen.
I believe the state's inability to deal with this phenomenon stems from a real unawareness among populations of how this global economic and technological order currently works, and of the implications it has, and could potentially have, for our lives. The lack of nationhood and moral values in these global tech companies, alongside their increasing power over all aspects of our lives, poses a real threat to human rights and communities in a global sense.