“Algorithmic unfairness” & the recent ICO consultation

On 29 April 2020, we submitted a response to the ICO’s paper, “Guidance on the AI auditing framework: Draft guidance for consultation” (Draft Guidance).

We highlighted three points in our response:

  • The Draft Guidance erroneously conflates “algorithmic unfairness” with the concept of “discrimination” within UK law.
  • It likewise erroneously conflates “algorithmic unfairness” with the concept of “discrimination” within European law.
  • It provides insufficient guidance for employers, employees, workers and their advisors.

In this blog, we focus on the key theme of our response: the conflation of “algorithmic unfairness” with the concept of discrimination as used in the UK.

“Algorithmic unfairness” is different to discrimination

Whilst the Draft Guidance expressly refers to the Equality Act 2010 (for example, at page 53), it does not set out the definitions of discrimination in use in Great Britain under that Act (i.e. direct discrimination, indirect discrimination, harassment, failure to make reasonable adjustments, unfavourable treatment arising from disability, and victimisation).

Unfortunately, this omission has led the Draft Guidance into dangerous territory, in that it proposes various methods of mitigating discrimination which are not recognised in the UK.

Specifically, the concept of “algorithmic unfairness” in the Draft Guidance is entirely at odds with the meaning of discrimination within the UK, and the two should not be conflated.  The Draft Guidance wrongly gives businesses the impression that acting in a way which is “algorithmically fair”, as measured against one of its three metrics (“anti-classification”, “outcome / error parity” and “equal calibration”), may or will lead to compliance with the Equality Act 2010.  This is wrong because, by way of example, it is always direct sex discrimination contrary to section 13 of the Equality Act 2010 to treat a woman less favourably than a comparable man because of her sex, regardless of whether “mitigation” steps have been taken: removing protected characteristics from the data in so far as possible (“anti-classification”), ensuring that equal proportions of positive or negative outcomes are given to different groups (“outcome / error parity”), or calibrating the model equally between members of different protected groups (“equal calibration”).
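To make the three metrics concrete, the sketch below gives simplified, illustrative definitions of each.  The function names, variable names and data are our own assumptions for the purpose of illustration and do not come from the Draft Guidance:

```python
# Illustrative sketch only: simplified versions of the three fairness
# metrics named in the Draft Guidance. The function and variable names
# are our own assumptions, not taken from the Guidance itself.
from typing import List

def outcome_parity_gap(outcomes_a: List[int], outcomes_b: List[int]) -> float:
    """Difference in positive-outcome rates between two groups;
    "outcome / error parity" asks for this gap to be small."""
    rate_a = sum(outcomes_a) / len(outcomes_a)
    rate_b = sum(outcomes_b) / len(outcomes_b)
    return abs(rate_a - rate_b)

def calibration_gap(scores: List[float], labels: List[int]) -> float:
    """Gap between mean predicted score and observed positive rate
    within one group; "equal calibration" asks for this gap to be
    similar across protected groups."""
    return abs(sum(scores) / len(scores) - sum(labels) / len(labels))

# "Anti-classification" is simpler still: the model is denied access to
# protected characteristics (e.g. a sex column is dropped before
# training).

# Hypothetical example: both groups receive positive outcomes at the
# same rate, so "outcome parity" holds, yet an individual woman could
# still have been treated less favourably because of her sex.
print(outcome_parity_gap([1, 0, 1, 0], [0, 1, 0, 1]))  # prints 0.0
```

As the example shows, satisfying one of these aggregate checks says nothing about whether any individual has been treated less favourably because of a protected characteristic, which is the question section 13 asks.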

Moreover, one proposed method of “mitigating discrimination”, namely the “anti-classification” approach of removing all data concerning protected characteristics, may itself lead businesses to act contrary to the Equality Act 2010.  The Equality Act 2010 places, in many important respects, a positive duty on employers to treat people with certain protected characteristics differently (e.g. disabled persons, women on maternity leave, pregnant women), and sometimes an employer must positively act so as to alleviate the disadvantages created for certain protected groups (e.g. ensuring that part-time working practices do not create particular difficulties for women, who still undertake most part-time work).  An approach which creates “blindness” to protected characteristics therefore has the potential to lead to breaches of the Equality Act 2010.
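As a toy illustration (with hypothetical data and column names of our own invention), “blindness” strips out the very information an employer would need in order to discharge a positive duty such as making reasonable adjustments:

```python
# Toy illustration (hypothetical data and column names of our own
# invention): dropping the protected characteristic also removes the
# information an employer needs to comply with a positive duty, such
# as making reasonable adjustments for a disabled candidate.
candidates = [
    {"name": "A", "test_score": 62, "disability": True},
    {"name": "B", "test_score": 70, "disability": False},
]

# "Anti-classification": strip the protected characteristic out.
blind = [{k: v for k, v in c.items() if k != "disability"} for c in candidates]

# The blinded data can no longer show that candidate A may need an
# adjusted assessment, so the duty to make adjustments cannot be
# discharged on this data alone.
for c in blind:
    print(c)  # e.g. {'name': 'A', 'test_score': 62}
```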

It should also be stressed that the definition of discrimination within the Equality Act 2010 is derived from the case law of the CJEU.  Whilst, in a post-Brexit world, it has perhaps become unfashionable to consider European law, a good many businesses operating in the UK will also operate in some or all of Europe.  These businesses will adopt a common approach towards AI systems rather than a UK-centric one.  In short, Europe will become a standard setter in much the same way as the GDPR has internationally.  Accordingly, unless business is to be stifled through uncertainty, the ICO and other UK-based regulators should ensure that guidance is appropriate in both an EU and a UK legal framework.

The full copy of our response can be found here.
