Melissa Renau is a predoctoral researcher at the UOC’s Doctoral School and a member of the Dimmons Group at the Internet Interdisciplinary Institute (IN3). She is currently carrying out research into the working conditions of a range of online delivery platform models.
Spain has recently become a focus of attention after passing the “Rider Law”, which aims to put an end to the labour dispute raging on the country’s online delivery platforms but has not pleased everyone across the political spectrum. It is a dispute that Renau-Cano has studied in depth, and about which she has just published an academic research paper analysing how the much-vaunted “flexibility” has been created, and why the use of “free login” does not in itself deliver the purported “freedom”.
Spain has been one of Europe’s pioneers when it comes to regulating work on delivery platforms. What do you think of this new law?
The agreement introduces two amendments to Spain’s Workers’ Statute, in line with the government’s goal of protecting platform couriers. First, it recognizes platform couriers as employees who are subject to the “algorithm”, which is recognized as the main tool used by delivery platforms to organize and monitor their work. This recognition is a major step forward.
Nevertheless, the most important amendment to the Statute is the second one, partly due to its innovative nature, but also because it affects all digital platforms in Spain. This second amendment requires companies to share with trade unions the parameters, rules and instructions underlying their algorithms and artificial intelligence decision-making systems whenever these affect working conditions.
What effects has it had to date and what do you see as its future prospects?
Many of the demands made by unions and other groups fell by the wayside during negotiations with employers. More specifically, the law fell short in protecting workers from mass redundancies and outsourcing.
One tangible example of the law’s limited effect is that most people who work for Glovo are still self-employed, as the company maintains that it has removed the conditions that would make them employees. The main change to the app is the introduction of a “lowest bid” system for riders to take on deliveries: riders offer to pick up orders and bid a price, competing against each other. This means that at certain times of the day, the platform takes advantage of lower levels of demand to pay them less.
Another area in which more could have been done is to bolster the provisions of Article 22 of the General Data Protection Regulation, by defining the meaning of the right to information on the algorithm more specifically, for example.
Do you regard its scope as sufficient or do you think that it is excessively focused on the delivery sector?
Although it’s a meaningful first step, the law does not do enough to improve working conditions. Most delivery platform companies are beginning to offer badly-paid jobs via third parties, and the law as it currently stands after the employment reforms allows this. This in turn leads to a growing sense of mistrust towards the unions, because some delivery platform workers are being disconnected from their apps in a sort of constructive dismissal by the platforms, meaning that they lose their livelihood.
What’s more, in practice, and despite the valuable work done, the law doesn’t affect other groups of workers. Delivery platform workers work in a high-profile sector which can easily mobilize its members, compared with workers in other parts of the online platform economy, such as those doing micro-jobs via Upwork or domestic work through platforms like Cuideo. Thanks to the extremely high profile that delivery workers have achieved, there was an opportunity to reach out to and protect other groups who also need stronger and better regulation.
Is there a risk of associating the platform economy solely with deliveries and a failure to oversee or regulate the other workers?
The new Rider Law is a step forward provided that it doesn’t mean that the remaining sectors won’t be regulated, and that the platform economy’s definition is not restricted to the delivery and last-mile logistics sector. The repeal of Spain’s 2012 labour market reforms is a necessary step towards guaranteeing better working conditions for all.
In this regard, and given the delivery riders’ dispute, has the debate in the media accepted at least in part that the platform economy and job insecurity go hand-in-hand?
Well, first, it’s important to note that job insecurity is not an invention of the platform economy. Today’s job insecurity in Spain is not only connected with the employment law reforms of 2010 and 2012, but also with a dysfunctional migration system and other factors acting as a breeding ground for this kind of business. Platform economy delivery workers have better working conditions in other countries where there are more and better job opportunities.
Second, we need to understand that the platform economy is home to different business models, ranging from extractionist platforms to new types of platform cooperativism, each of which is associated with different impacts.
It is against this backdrop that Glovo’s new business model has introduced a “free login” system, which allows delivery workers to connect to the app and make themselves available to accept orders at any time of day. Before, they had to book the hours they wanted to work in advance. This is a double-edged sword.
As I show in my most recent article, flexible working hours cannot be reconciled with the piece-rate payment system used by these platforms, which forces delivery riders to take on lots of orders; or with the lowest bid system, which forces them to sacrifice part of their earnings in order to compete against each other and secure orders; or with the zero-hours contract system or algorithmic control.
Is flexibility always a smoke screen for deregulation?
Although the narrative around flexibility is at present rightly associated with deregulation, this need not always be the case, and it’s not necessarily part and parcel of the appearance of new technologies. What’s really important is to understand that the concept of “flexibility” means different things to employers and workers. Whilst the former look to cut costs, the latter seek a better work-life balance and a better-quality job. With all jobs, there is an inherently unequal relationship between employer and employee. That’s why we have labour law, and why it’s important to introduce measures to ensure a fair balance between the parties involved.
We recently heard that a Russian company made a third of its workforce redundant on the basis of a decision made by an algorithm. Will this become commonplace in the future?
I’d say it’s already more of a current issue than a future one. Algorithm-based decision making in Europe is regulated in the sense that you can’t take a decision based solely on an algorithmic calculation. Now, although this doesn’t guarantee that a decision is fair, it does provide the workers affected by these decisions with a higher level of protection. Additionally, as mentioned before, it’s important to give these rights some substance.
How is oversight of algorithms possible, and how can bias in them be prevented or at least minimized?
Making sure that ethical and social values are reflected in algorithms and artificial intelligence (AI) technologies is no easy task. In many cases, there are very generic suggestions that are difficult to implement without further guidelines, or which fail to take into account the many stakeholders involved or the broader social context. One of the initiatives that has gained most traction in recent years has been algorithm auditing, which is sometimes based on “design justice”.
In her latest book, Virginia Eubanks explains how AI systems were initially tested in poor neighbourhoods. Are the most disadvantaged members of the community being used as guinea pigs for new technologies?
It’s true that many AI systems were first tested in more disadvantaged areas, as shown in documentaries like The Social Dilemma. Similarly, the hardest work ends up being outsourced to countries with poorer working conditions and attracts some of the most disadvantaged people in society (although it’s a little outdated, I’d recommend watching the documentary The Cleaners).
The fact is that algorithms form a part of our daily lives in a way that we’re not even aware of. AI systems are not only used by companies like Glovo, Google and Instagram. To give an example, Waze, the app some of us use as an alternative to Google Maps, leverages users’ data to create the routes it offers by means of a complex algorithmic system. And it’s a constantly-evolving system, which means that all its users are in a way its “guinea pigs”.
Will algorithms end up benefiting the rich and harming the poor?
When the Russian company you mentioned decided to fire a third of its workers based on an algorithm’s decision, it wasn’t forced to do so by the algorithm. Behind these actions are people who took the decision to fire part of their workforce based on this algorithmic calculation. They are the same people who are also looking to cut costs in the redundancy process.
In other words, it’s not just a matter of the power of algorithms, but also of the decision-making power of the people behind those algorithmic systems. In an increasingly unequal world, inequality affects all areas of life, including the digital world. If we focus solely on dealing with this digital realm and ignore economic and social inequality, we’ll fail to solve the problems currently affecting society. The lack of a global perspective means using a stopgap solution to conceal a deeper problem.
The research reported in this paper was funded by European Union, Horizon 2020 research and innovation programme, “Platform Labour in Urban Spaces: Fairness, Welfare, Development” (https://project-plus.eu), Grant Agreement No. 822638. The views and opinions expressed in this publication are the sole responsibility of the author(s) and do not necessarily reflect the views of the European Commission/Research Executive Agency.