An Italian lesson for Deliveroo: Computer programmes do not always think of everything!


In this blog we examine a very recent Italian decision from the Bologna Labour Court – Filcams CGIL Bologna and others v Deliveroo Italia SRL – which held that Deliveroo's algorithm, called "Frank", which determined its workers' priority of access to delivery time slots, was discriminatory.  Whilst we understand that the algorithm at the heart of that case is now defunct, there is an important lesson to be learnt from the decision.  Specifically, the Deliveroo case demonstrates conclusively how unthinking reliance on algorithms, simply because they are perceived to be "useful", can lead to unintended discrimination – with the result that a business ends up in court.

The priority allocation system

Deliveroo distributed work slots using a priority system based on a score derived from a digital platform.  Once a week, its workers were able to contact Deliveroo to obtain these working slots, but they did not have equal access: workers were allocated to one of three time bands within which they could book slots –

  • At 11am on Mondays, riders who had been given the highest priority by Frank were able to book slots.  This group was equivalent to 15% of the riders.
  • From 3pm, riders with the next highest priority could book whatever slots remained.  This group was equivalent to 25% of the riders.
  • Riders with the lowest ranking were able to access the system and make bookings from 5pm.  This group was equivalent to 60% of the riders.

The difference in priority led to huge differences in the amount of work that was available.  This was because the riders given the highest scores by Frank had early access to sessions and could quickly "fill up" the available slots at the expense of the lower-scored riders, as the illustrative sketch below shows.
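To make the structure of the booking system more concrete, here is a minimal sketch of how a score-based ranking could be mapped onto the three booking windows described above.  The band sizes and opening times come from the judgment as summarised in this blog; the score itself, and the way riders are ranked, are purely our own illustrative assumptions and not Deliveroo's actual code.

```python
# Illustrative sketch only: the band sizes and booking times reflect the
# court's description of Deliveroo's system; the "score" field and the
# ranking logic are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Rider:
    name: str
    score: float  # hypothetical priority score produced by the platform

def assign_booking_bands(riders):
    """Split riders into the three booking windows by descending score."""
    ranked = sorted(riders, key=lambda r: r.score, reverse=True)
    n = len(ranked)
    top = ranked[:round(n * 0.15)]                    # top 15%: book from 11am Monday
    middle = ranked[round(n * 0.15):round(n * 0.40)]  # next 25%: book from 3pm
    bottom = ranked[round(n * 0.40):]                 # remaining 60%: book from 5pm
    return {"11am": top, "3pm": middle, "5pm": bottom}

riders = [Rider("A", 92), Rider("B", 55), Rider("C", 78), Rider("D", 30)]
for window, group in assign_booking_bands(riders).items():
    print(window, [r.name for r in group])
```

Because the highest band books first, even a small scoring difference translates into a large difference in the work actually available – which is exactly the effect described above.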

AI’s role in determining rider priority

Priority in access to the distribution system was determined by a digital platform downloaded onto smartphones, which created a personalised rider profile.  An algorithm then determined when riders could access shifts based on a "score" awarded to each one.  This algorithm – Frank – determined priority by reference to (at least) two metrics (called indices):

• The reliability index:  The number of occasions on which a rider, despite having booked a session, did not participate; "participating" meant logging in within the first 15 minutes of the beginning of the session from the relevant geographical location.

• The peak participation index:  The number of times the rider made themselves available for work between 8pm and 10pm from Friday onwards for home food delivery.  It also appears, although it is not entirely clear from the judgment, that the rider needed to be available over the weekend in order to receive the highest score.

There were also penalties imposed on riders who cancelled booked sessions with less than 24 hours’ notice.
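The judgment identifies these inputs but does not reveal how Frank actually weighted or combined them (indeed, as discussed below, Deliveroo declined to prove the "concrete mechanism").  Purely by way of illustration, a score built from the two indices plus a late-cancellation penalty might look something like the sketch below; every weight, field name and formula here is an assumption, not a description of Frank.

```python
# Purely illustrative: the weights, field names and formula are assumptions.
# The judgment identifies the inputs (reliability, peak participation and
# late-cancellation penalties) but not how Frank combined them.

def priority_score(sessions_booked, sessions_missed,
                   peak_slots_offered, peak_slots_worked,
                   late_cancellations,
                   w_reliability=0.6, w_peak=0.4, penalty_per_late_cancel=5.0):
    """Return a hypothetical 0-100 priority score for one rider."""
    reliability = 1.0 - (sessions_missed / sessions_booked) if sessions_booked else 0.0
    peak_participation = (peak_slots_worked / peak_slots_offered) if peak_slots_offered else 0.0
    score = 100 * (w_reliability * reliability + w_peak * peak_participation)
    return max(0.0, score - penalty_per_late_cancel * late_cancellations)

# Note that a rider who misses Friday-evening peak slots for childcare reasons
# is scored exactly as if they had simply not bothered to turn up.
print(priority_score(sessions_booked=20, sessions_missed=2,
                     peak_slots_offered=8, peak_slots_worked=6,
                     late_cancellations=1))  # roughly 79 on these assumptions
```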

How was the algorithm used?

It appears that the aim of the work allocation algorithm was to encourage riders to perform work through the platform at certain times.   

There was evidence that the algorithm could be altered to account for “good” reasons for non-attendance at work, for example, technical glitches.

Who started the litigation and why?

The case was brought by three unions with relevant interests in this kind of work. They alleged that the system of offering work was discriminatory because the way in which riders were given priority depended on their ability to offer and accept work, which in turn could be affected by their family commitments.

It was alleged that this was indirectly discriminatory because there could be very good reasons why a rider may wish or need to cancel or not be available in peak times e.g. a child’s illness, childcare etc. 

What did the Italian court conclude and why?

The court accepted that riders were penalised if they could not work for reasons such as family commitments or if they had to cancel at short notice.

The court held that, because the algorithm did not take into account the reason for a late cancellation or for non-participation in peak shifts, it was potentially indirectly discriminatory.  Accordingly, the burden of proof shifted to Deliveroo to justify the system.  Deliveroo's justification was simply that the commercial relationship between it and its riders (whom it characterised as self-employed contractors) entitled it to monitor and distribute shifts as it saw fit, and that all riders were treated the same.  The court rejected this justification defence essentially because the system, whilst universal in its application, could not sensibly differentiate between riders' individual reasons for non-participation in the shifts which the algorithm then used to calculate the priority scores.

What is the significance of this decision?

Transparency

It should be obvious that the Deliveroo case is yet another example of how courts may take a dim view of organisations that deploy algorithms which are alleged to be discriminatory and yet decline to provide full disclosure of their inner workings.

To explain – one important feature of algorithms is that they are often difficult for third parties, like workers, unions or even judges, to understand and monitor in the absence of full disclosure/transparency from the company which uses them. 

Lack of transparency was a feature of the Deliveroo case, since the company declined to "prove" the "concrete mechanism" at the heart of the algorithm.  Since there were factual disputes over how the algorithm worked, the court upheld the claimants' case on those factual matters in the absence of Deliveroo "proving" its alternative factual narrative (and there was also some Deliveroo-generated documentation which supported the claimants' case).  It is worth noting that a similar approach was taken in the SyRI case, where the District Court of The Hague concluded that a discriminatory algorithm was being used in the absence of a full and proper explanation from the state authority that deployed it.  Our blog addressing this case can be accessed here.

Accordingly, companies and organisations that deploy algorithms should not assume that a lack of transparency will save them from scrutiny.  We discuss the implications of a lack of transparency in discrimination cases more generally here.

Using “your head”

Secondly, it is important to note that the outcome in the Deliveroo case would have been no different if shifts had been allocated on the same basis by a human decision-maker relying on manual record-keeping of the times when riders cancelled shifts or failed to work during peak times.  This is a case – in truth – about using systems – algorithm-based or otherwise – without thinking through what they mean and how they can impact different protected groups.

The error in Deliveroo's system was, very simply, a failure to distinguish between the reasons for cancelling or not participating in a shift.  Was there a reason which might relate to gender, e.g. childcare?  Or did the rider simply feel like spending their time differently in a way which is non-gendered, e.g. going to the cinema on a Friday night?  The failure to ask these simple questions was fatal.

A system which intelligently differentiated between the reasons for cancellation or inability to work would have been fairly straightforward to justify, even if prima facie indirect discrimination arose.  After all, a company like Deliveroo is perfectly entitled to want riders who are reliable and committed and to ensure that there are enough riders available to cover peak shifts.  Provided that the system is implemented in a way which understands, and sensibly accommodates, riders who may be disadvantaged by Deliveroo's aims, indirect discrimination should not arise.  For example, an algorithm which allows a working mother early access because, whilst she cannot work every Friday evening, she is extremely reliable during school hours, would overcome any concerns relating to indirect sex discrimination.  The sketch below illustrates one way such differentiation might work.
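By way of illustration only, the flaw could be addressed with a modest change to a scoring routine such as the hypothetical one sketched earlier: absences or cancellations supported by a legitimate reason are simply not counted against the rider.  The categories and field names below are our assumptions, not a statement of what any compliant system must look like.

```python
# Illustrative only: a hypothetical tweak that ignores missed sessions and
# late cancellations which are supported by a legitimate reason.
LEGITIMATE_REASONS = {"childcare", "illness", "technical_glitch"}

def count_penalisable(events):
    """Count only the events with no legitimate reason attached.

    `events` is a list of (event_type, reason) tuples, e.g. ("missed", "childcare").
    """
    return sum(1 for _, reason in events if reason not in LEGITIMATE_REASONS)

events = [("missed", "childcare"), ("late_cancel", None), ("missed", "illness")]
print(count_penalisable(events))  # -> 1: only the unexplained late cancellation counts
```

Feeding this filtered count into the score, rather than the raw total, is the kind of intelligent differentiation the court found missing: it does not abandon the legitimate aim of rewarding reliability, it simply stops penalising riders for reasons that may be linked to protected characteristics.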

Acting proportionately

Deliveroo advanced a poor case on justification, which failed to articulate any compelling commercial reason for the algorithm or to address key issues such as proportionality.  So, perhaps the biggest takeaway from this case is that companies and organisations should not "blindly" use algorithms simply because they are perceived to be useful; careful thought is needed to ensure that there are no unintended discriminatory consequences.

Where protected groups are disadvantaged, organisations need to consider – preferably at the time – why they need the system, what it achieves and whether there are non-discriminatory means of achieving that same aim or aims. 

That’s not a big ask; good companies have been doing this for a long time. 

So, the message is…

Don’t lose your head over a computer programme; in most cases it is a tool, not a complete solution.
