The effects of mass surveillance

14 July, 2022
Photography by Marija Zaric on Unsplash.com
The “Data Society Research Seminar: Behind Data and Algorithms – Actors, logics and cultures behind digital technologies” took place on 16 June at Malmö University. Andrea Rosales Climent and Sara Suárez-Gonzalo, researchers from the CNSC group at the IN3, spoke at the event. The seminar was organized by Malmö University’s Data Society research programme and was funded by the Swedish Research Council (VR) and the Swedish Foundation for Humanities and Social Sciences (RJ).

Data and algorithms are very much in the public eye and have been on the research agenda for some time, prompting studies on systems biased in terms of gender, ethnicity and, more recently, age. Researching the actors, logics and cultures behind data can lead academics to key insights into how algorithms embody rules, furthering discussions on power in today’s data society.

Over the last 20 years, digital technology and the internet have completely changed our mindsets. The internet was initially seen as a new world of amazing capabilities, but the digital revolution now threatens to undermine values such as personal freedom, democracy, trustworthy knowledge and even open competition. The future looks more like a dystopian universe than a utopian digital world. Words like monopoly, surveillance and disinformation describe the most pressing problems in this rather fearful internet setting. It sometimes feels like we live in a technological regime where our privacy is not respected anymore. In order to have any hope of recovering the promise of new technology, these problems need to be fixed.

The explosive growth of the online economy in the 1990s and early 2000s appeared to validate the idea that markets were best left to themselves. The internet of that era was neoliberalism’s greatest triumph. Now we have entered the era of surveillance capitalism with the rise of platform monopolies, where Amazon, Facebook, Google, Apple and Microsoft control whole ecosystems in the digital world.[1]

“The concept of surveillance capitalism has been used and popularized by sociologist Shoshana Zuboff since 2013. It refers to the commodification of personal data, i.e. the transformation of personal information into merchandise that is put up for sale to make a profit”.[2]

Data domination: the effects of mass surveillance from a neo-republican perspective

The seminar brought together academics to explore the need for a socio-cultural research approach to data and algorithms, whereby the focus is placed on the actors and culture(s) behind new technologies.

Sara Suárez-Gonzalo focused her talk on explaining how neo-republican theory can help us to better understand and deal with the effects of big data exploitation, mass surveillance and the way they shape social power relations. The hegemonic framework for the discussion of the social and political effects of data-driven technologies is the liberal conception of the values of privacy and freedom (traditionally defined by liberalism as non-interference). In this talk, Suárez-Gonzalo explained why this approach is insufficient and why turning the focus towards a neo-republican conception of these values (linked to the ideal of freedom as non-domination) can better explain and deal with these effects, especially when it comes to protecting people’s fundamental rights. These new distributed and largely uncontested expressions of power constitute hidden mechanisms of extraction, commodification and control that threaten fundamental values such as freedom, democracy and privacy.

Suárez-Gonzalo’s main concern is understanding how the exploitation of access to data (by both private and public actors) affects people’s fundamental rights. When she began to study these issues, she found that the main metaphor or framework for the discussion was the traditional conception of the right to privacy, i.e. “the individual right to be left alone”. This right is defined in terms of the immunity of the person, who is to be kept away from the public eye.

In general, the liberal model of privacy rests on public institutions refraining from interfering in people’s private lives, guaranteeing that they will be left alone. Data can only be gathered if the data subject has previously, and willingly, given their consent. However, Suárez-Gonzalo pointed out that this model fails to consider the reality of entrenched power inequalities. She concluded that freedom involves much more than mere non-interference, which is why the neo-republican point of view takes users’ vulnerability into account within the power asymmetries of surveillance capitalism.

Neo-republican theory understands freedom as the absence of domination: certain interferences are necessary to protect the public and ensure their freedom. Suárez-Gonzalo claimed that republicanism aims to equip people with tools to make decisions autonomously and take control of their own lives. The role of public institutions should thus be to interfere only when those interferences are themselves subject to the decision-making process of the population, that is, to popular control. In a deeply unequal society, individual consent doesn’t always represent a guarantee of privacy.

Illustration by Ivan Mesaroš on Blush.design

People have little real control over their privacy nowadays, as the need to accept cookies on ordinary websites just to carry out everyday tasks on a computer shows. Surveillance strips people of control over their freedom to remain private, thus taking away their right to privacy. Overcoming data domination will mean defying the liberal hegemonic conception of privacy and accepting that leaving people alone to do as they please is not enough; privacy needs to be rethought from a feminist and neo-republican perspective. For that, Suárez-Gonzalo proposed two conditions:

  1. Everyone must be provided with the material and immaterial resources necessary to keep potential forms of domination in check. This means guaranteeing universal education on digital issues and bridging the digital divides that affect those parts of the population unable to keep up with technological development.
  2. Structural conditions, such as policy restrictions, that give control to big tech companies must be avoided. This means preventing them from exerting their power over public institutions and imposing restrictions on the monopolistic dynamics of the economy that guide the technological business model.


Facing the ethical controversies of programmers

Andrea Rosales Climent analyses various cultural products within a theoretical framework built around the following concerns: unfair practices, practices against users’ interests, hidden information, illegitimate manipulative personalisation, the exploitation of users’ vulnerability, the right to equal treatment and the right to be treated with dignity.

The expert analyses the imaginaries about automation and algorithmic systems in two cultural products. Automation imaginaries are already woven into culture: writers, journalists and documentary makers contribute to the construction of these imaginaries and, at the same time, embed society’s imaginaries in their products. She analysed two best-selling books in particular: Uncanny Valley (2020) by Anna Wiener and QualityLand (2017) by Marc-Uwe Kling.

Uncanny Valley shows the techno-optimism of most tech workers, who are not aware, or not interested in being aware, of the potential social implications of the technologies they develop. The author shows how tech culture helps to create a community of believers in which employees are “Down for the cause” of the project and any critical view is disregarded. In the book, Wiener shares her memoir as a misfit employee in Silicon Valley, a privileged standpoint from which to observe the culture of its tech companies. Throughout the book, the author reflects on the moral implications of data collection and manipulation among technology companies: “Silicon Valley might have promoted a style of individualism, but scale bred homogeneity” (as Wiener explained for The New York Times Books). Silicon Valley’s tech culture is very much part of the workers’ imaginary. Rosales pointed out that most tech workers are not interested in dealing with data ethics problems and fully embrace the techno-optimistic approach. This means that there is a general belief among workers that technological developments will do more good for humanity than harm, that the solution to people’s problems lies in technological innovation, and that they are the ones actually solving these problems.

Image by Tech Leaders Emerging on YouTube.com

Rosales highlighted that this belief may be reinforced by the fact that tech workers are often very young, with a strong need to belong; they readily identify with the all-consuming feeling of affiliation and quickly and wholeheartedly embrace Silicon Valley’s tech culture. They have good salaries, a fun working environment and the empowering feeling that they are changing the world, so it makes sense that they support techno-optimism.

Uncanny Valley by Anna Wiener on Wikipedia.com

In contrast, QualityLand by Marc-Uwe Kling illustrates the potential consequences of the large-scale application of the rationale behind many of the digital platforms used today, including scoring systems and recommendation algorithms, and asks to what extent they could be fair, genuine and harmless.

Both books use an apocalyptic tone and dark views that reflect society’s fears about automation and algorithmic systems, as well as a response to the prevailing techno-optimism and the tensions created by the forced digitalisation of society.

They address unfair practices against users’ interests, black-box algorithms, illegitimate manipulative personalisation, discrimination against the most vulnerable groups and the exploitation of user vulnerabilities, and, conversely, the right to equal treatment, the right to be treated with dignity and the right to freedom. Moreover, the related imaginaries show the social implications of widely used digital platforms when their use becomes the norm for social interaction, work relationships, shopping and citizenship.


References:

Guerrero-Solé, F., Suárez-Gonzalo, S., Rovira, C., & Codina, L. (2020). Medios sociales, colapso del contexto y futuro del populismo de datos. Profesional de la información, 29(5).

Suárez-Gonzalo, S. (2019). Personal data are political. A feminist view on privacy and big data. Recerca. Revista de Pensament i Anàlisi, 24(2), 173-192.

[1] Starr, P. (2022, July 5). How Neoliberal Policy Shaped the Internet—and What to Do About It Now. The American Prospect. https://prospect.org/power/how-neoliberal-policy-shaped-internet-surveillance-monopoly/

[2] Wikipedia contributors. (2022, June 19). Surveillance capitalism. Wikipedia. https://en.wikipedia.org/wiki/Surveillance_capitalism

About the authors
Marina Mora
  • Research & Media Comms, Communications Department