Data protection: Legal framework

In this section, we examine the following topics:

  • Overview of the legal landscape
  • General Data Protection Regulation (GDPR)
  • Article 21
  • Article 22
  • Criminal offences, criminal penalties and the protection of public security
  • Prohibition on decisions based solely on automated decision making which lead to discrimination

Each of these topics is explored in detail below.

Overview

The key legislation in Europe concerning algorithms, machine learning and data protection is:

Article 8 of the Charter of Fundamental Rights of the European Union which enshrines the right to data protection.

The General Data Protection Regulation (GDPR) which covers the protection of natural persons with regard to the processing of personal data (Article 1).

The Law Enforcement Directive which covers the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and prevention of threats to public security (Article 1). It also applies in relation to cross-border processing of personal data for law enforcement purposes.

The GDPR and the Law Enforcement Directive are intended to complement one another and to regulate entirely different spheres. Accordingly, Article 2 (2)(d) of the GDPR expressly “carves out” the matters which fall to be regulated by the Law Enforcement Directive.

As a Regulation, the GDPR applies directly in UK law; it is supplemented by Part 2 of the Data Protection Act 2018 (DPA 2018).

The position in relation to the Law Enforcement Directive is more complicated. The UK has opted out of criminal justice directives which post-dated the Lisbon Treaty. However, the DPA 2018 implements the Directive within Part 3 so as to “ensure a coherent regime” in relation to domestic and trans-national data processing (see Home Office guidance here).

The Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), which gave effect to Directive 97/66/EC (now repealed and replaced by Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)).

PECR complements but does not override the GDPR and the DPA 2018.


General Data Protection Regulation (GDPR)

Application to algorithms and machine learning

The GDPR, as supplemented by the DPA 2018, applies to the “processing” of “personal data” which is unconnected to the prevention, investigation, detection or prosecution of criminal offences, the execution of criminal penalties or the protection of public security.

The definition of “processing” in Article 4 (2) is sufficiently broad to cover the application of algorithms to personal data:

‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction

https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1532348683434&uri=CELEX:02016R0679-20160504

The GDPR also covers “profiling” in Article 4 (4), which engages directly with machine learning:

‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements

https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1532348683434&uri=CELEX:02016R0679-20160504

The UK’s Information Commissioner’s Office (ICO) guidance on automated decision-making and profiling confirms that machine learning and algorithms fall within the GDPR.


Lawful processing – ordinary data

Personal data can only be lawfully processed, including by applying an algorithm to it, if one of the six grounds contained in Article 6 is met, as follows (a short illustrative sketch appears after the list):

Consent: The Data Subject has consented to the specific processing (Article 6 (1)(a)).

Contract: The processing is necessary for the performance of a contract to which the Data Subject is a party or to take steps at the request of the Data Subject prior to entering into a contract (Article 6 (1)(b)).

Legal Obligation: The processing is necessary for compliance with a legal obligation to which the controller is subject (Article 6 (1)(c)).

Vital interests: the processing is necessary in order to protect the vital interests of the data subject or of another natural person (Article 6 (1)(d)).

Public interest/official authority: The processing is necessary for the performance of a task carried out in the public interest or in the exercise of the controller’s official authority (Article 6 (1)(e)). This is further elaborated on by s.8 of the DPA 2018, which states that such processing includes processing necessary for the administration of justice, the exercise of a function of either House of Parliament, the exercise of a function conferred on a person by an enactment or rule of law, the exercise of a function of the Crown, a Minister of the Crown or a government department, or an activity that supports or promotes democratic engagement.

Legitimate interests: The processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the Data Subject which require protection of personal data, in particular where the Data Subject is a child (Article 6 (1)(f)). This ground does not apply to processing carried out by public authorities in the performance of their tasks.
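
For readers approaching this from a data science perspective, the Article 6 test can be pictured as a gate that must be passed before an algorithm touches personal data. The following Python sketch is purely illustrative; the ground names and the function are our own shorthand for Article 6 (1)(a)–(f), not a statutory or library API:

# Illustrative sketch only: a pre-processing gate modelled on Article 6 GDPR.
# The ground names are our own shorthand for Article 6 (1)(a)-(f).

ARTICLE_6_GROUNDS = {
    "consent",              # Article 6 (1)(a)
    "contract",             # Article 6 (1)(b)
    "legal_obligation",     # Article 6 (1)(c)
    "vital_interests",      # Article 6 (1)(d)
    "public_interest",      # Article 6 (1)(e)
    "legitimate_interests", # Article 6 (1)(f)
}

def may_process(grounds: set[str], is_public_authority_task: bool = False) -> bool:
    """Return True if at least one valid Article 6 ground is claimed.

    Note: Article 6 (1)(f) is unavailable to public authorities acting
    in the performance of their tasks.
    """
    claimed = grounds & ARTICLE_6_GROUNDS
    if is_public_authority_task:
        claimed.discard("legitimate_interests")
    return bool(claimed)

# Example: a public authority cannot rely on legitimate interests alone.
assert not may_process({"legitimate_interests"}, is_public_authority_task=True)
assert may_process({"public_interest"})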

Article 10 adds that personal data relating to criminal convictions and offences can only be processed (i) under the control of an official authority or (ii) where authorised by law, but only where there are appropriate safeguards. Article 10 is further supplemented by s.10 (5) DPA 2018, which specifies various restrictions on this entitlement to process data.


Lawful processing – special data

Importantly, the GDPR imposes far more stringent obligations in relation to special personal data, namely personal data which reveals racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, genetic data, biometric data processed for the purpose of uniquely identifying a natural person, data concerning health, and data concerning a natural person’s sex life or sexual orientation (Article 9).

In relation to this special data there is a general prohibition on processing (including the application of algorithms) unless certain conditions are met as follows:

Explicit consent: The Data Subject has given explicit consent (Article 9 (2)(a)).

Obligations/rights: The processing is necessary for the carrying out of the obligations and exercising specific rights of the controller or of the Data Subject in the field of employment, social security and social protection law and subject to safety mechanisms (Article 9 (2)(b)).

Vital interests: The processing is necessary to protect the vital interests of the Data Subject or of another natural person where the Data Subject is physically or legally incapable of giving consent (Article 9 (2)(c)).

Not-for-profit bodies: The processing is carried out in the course of its legitimate activities by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim and subject to safety mechanisms (Article 9 (2)(d)).

Manifestly public data: The processing relates to personal data which are manifestly made public by the data subject (Article 9 (2)(e)).

Legal claims: The processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity (Article 9 (2)(f)).

Public interest: The processing is necessary for reasons of substantial public interest, is permitted by law and subject to safety mechanisms (Article 9 (2)(g)).

Health and social care: The processing is necessary for the purposes of preventative or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems and services, on the basis of the law or pursuant to a contract and subject to safety mechanisms (Article 9 (2)(h)).

Public health: The processing is necessary for reasons of public interest in public health (Article 9 (2)(i)).

Archiving: The processing is necessary for archiving purposes in the public interest, or for scientific or historical research or statistical purposes, on the basis of the law and subject to safety mechanisms (Article 9 (2)(j)).

Articles 9 (2)(b), (g), (h), (i) and (j) are supplemented by additional conditions as set out in Parts 1 and 2 of Schedule 1 to the DPA 2018 (s.10 DPA 2018) and s.11 DPA 2018. There is also additional information concerning the safety mechanisms, when required, within the DPA 2018.

In short, the types of personal data which could allow an organisation to discriminate against a person on the grounds of race, religion or belief or sexual orientation contrary to the Equality Act 2010 are subject to special protection under the GDPR and the DPA 2018. That said, there is a wide range of circumstances in which organisations are entitled to process this data without consent, provided (for the most part) that safety mechanisms are put in place.
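
The structural difference from Article 6 is that Article 9 reverses the default: processing of special data is prohibited unless one of the listed conditions applies. A minimal illustrative sketch (again using our own shorthand names, not any statutory terminology) might look like this:

# Illustrative sketch only: Article 9 GDPR reverses the default for special
# category data - processing is prohibited unless a listed condition applies.
# Condition names are our own shorthand for Article 9 (2)(a)-(j).

SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin", "political_opinions", "religious_beliefs",
    "trade_union_membership", "genetic_data", "biometric_data",
    "health", "sex_life_or_orientation",
}

ARTICLE_9_CONDITIONS = {
    "explicit_consent", "employment_rights", "vital_interests",
    "not_for_profit_members", "manifestly_public", "legal_claims",
    "substantial_public_interest", "health_and_social_care",
    "public_health", "archiving_and_research",
}

def may_process_special(data_categories: set[str], conditions: set[str]) -> bool:
    """Default deny: special data may be processed only if an
    Article 9 (2) condition is met (safety mechanisms still required)."""
    if data_categories & SPECIAL_CATEGORIES:
        return bool(conditions & ARTICLE_9_CONDITIONS)
    return True  # not special data; the ordinary Article 6 test still applies

assert not may_process_special({"health"}, set())
assert may_process_special({"health"}, {"explicit_consent"})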


Article 21: Objecting to partially automated decisions

Data Subjects also have a right to object to the use of algorithms and machine learning under Article 21 of the GDPR, even if the processing would otherwise be lawful, but only where both of the following hold:

The processing is based on Article 6 (1)(e) (“the processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”) or Article 6 (1)(f) (“processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party”); and

The objection is derived from the Data Subject’s particular situation.

The precise language of Article 21 (1) reads as follows:

1. The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her which is based on point (e) or (f) of Article 6(1), including profiling based on those provisions.

https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1532348683434&uri=CELEX:02016R0679-20160504

Finally, even if a Data Subject validly objects to processing (including profiling) under Article 21, the objection can be overridden by a Data Controller where, “… the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.” (Article 21 (1)).

There are also a series of “carve outs” for Article 21 within the DPA 2018.

In light of the various limitations on Article 21, we suspect that the right to object to profiling via algorithm or machine learning under Article 21 will not always be readily available. Where discrimination is at play, a better route of challenge may be the Equality Act 2010 (as explained under “UK’s existing equality and human rights framework”).
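
The conditions above can be expressed as a short decision rule. The following sketch is illustrative only and necessarily simplifies the legal tests; the parameter names are our own:

# Illustrative sketch only: when does the Article 21 right to object bite?
# It applies where processing rests on Article 6 (1)(e) or (f), the objection
# arises from the data subject's particular situation, and the controller
# cannot show compelling legitimate grounds (or a legal-claims need).

def objection_succeeds(lawful_basis: str,
                       particular_situation: bool,
                       compelling_grounds: bool,
                       needed_for_legal_claims: bool) -> bool:
    if lawful_basis not in {"public_interest", "legitimate_interests"}:
        return False  # Article 21 (1) only reaches (e)- and (f)-based processing
    if not particular_situation:
        return False
    # The controller may override the objection (Article 21 (1), second limb).
    return not (compelling_grounds or needed_for_legal_claims)

# A consent-based profile is untouched by Article 21 (the remedy there is
# to withdraw consent instead):
assert not objection_succeeds("consent", True, False, False)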


Article 22: Right not to be subject to fully automated decisions including discriminatory decisions

Under Article 22 of the GDPR, a data subject has the right not to be subject to decisions made in consequence of the pure application of an algorithm (whether or not underpinned by machine learning) where there are legal consequences for him or her or similarly significant repercussions. This applies to both ordinary personal data and special personal data.

The precise language of Article 22 of the GDPR is as follows:

1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

https://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1532348683434&uri=CELEX:02016R0679-20160504

The ICO guidance states that a decision which discriminates against a data subject would have a similarly significant effect and so fall within Article 22:

What types of decision have a legal or similarly significant effect?

A decision producing a legal effect is something that affects a person’s legal status or their legal rights. For example when a person, in view of their profile, is entitled to a particular social benefit conferred by law, such as housing benefit.

A decision that has a similarly significant effect is something that has an equivalent impact on an individual’s circumstances, behaviour or choices.

In extreme cases, it might exclude or discriminate against individuals. Decisions that might have little impact generally could have a significant effect for more vulnerable individuals, such as children.

https://ico.org.uk/media/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling-1-1.pdf

This means that individuals have the right not to be subject to fully automated decision making which leads to discrimination against them as defined by the Equality Act 2010.

Limitations to Article 22

However, the prohibition in Article 22 is heavily qualified.

First of all, it does not apply to decisions which involve a meaningful degree of human intervention (Article 22 (1)).

For example, an employer who deployed an algorithm to monitor attendance would not fall under Article 22 if a manager still made the final decision on whether or not to initiate capability proceedings. As the ICO (guidance and April 2019 blog) and the European Data Protection Board have explained, a human simply “rubber stamping” a decision would not bring the activity outside the remit of Article 22. Whilst this makes sense, we query the extent to which a human actor would feel able, or indeed competent, to overrule the decision of a sophisticated form of technology.

Secondly, Article 22 does not apply where the processing is necessary for entering into, or the performance of, a contract between the Data Subject and a Data Controller (Article 22 (2)(a)).

This represents a significant inroad into the protection offered by Article 22. It means that a whole raft of situations will fall outside Article 22 where they are based on a contract, for example, the employment relationship.

It is also important to note that the Data Controller does not need to be a party, or a potential party, to the contract with the Data Subject for this exception to apply. One example of how this might work in practice is provided by the ICO in its guidance. Where a financial organisation relies on an automatically generated credit score produced by a third party credit reference agency to decide whether or not to enter into a contract with the Data Subject, the processing conducted by the third party Data Controller (here the credit reference agency) would also fall outside Article 22. This is a very significant limitation on its scope.

Finally, Article 22 does not apply where the processing is authorised by European or national law provided that there are suitable safeguarding measures in place (Article 22 (2)(b)).

This is again a very wide exception. This is illustrated by another example outlined in the ICO guidance. It explains that a private organisation in the financial services sector using automated decision making to detect fraud could escape Article 22 by relying on the high level regulatory requirement to detect and prevent crime.

The DPA 2018 also provides additional detail of the safety mechanisms which must be in place for the exception in Article 22 (2)(b) to apply. Specifically, the Data Controller must inform the Data Subject as soon as reasonably practicable that a decision has been made solely on the basis of automated processing and provide a review mechanism (s.14 (3) – (5) DPA 2018). Importantly, this means that the safety mechanism need only operate after the potentially damaging decision has been taken.

In light of the limitations on Article 22, we suspect that it will not apply to a significant number of algorithms. Again, where discrimination is at play, a better route of challenge may be the Equality Act 2010 (as explained under “UK’s existing equality and human rights framework”) rather than Article 22.
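
Drawing the threads together, the scope of Article 22 can be expressed as a decision rule. The sketch below is illustrative only and simplifies the statutory tests; the parameter names are ours, and for completeness it includes the third exception in Article 22 (2)(c) (explicit consent), which sits in the Regulation alongside the two exceptions discussed above:

# Illustrative sketch only: the Article 22 prohibition and its exceptions.
# "Meaningful human review" reflects the ICO/EDPB view that rubber-stamping
# does not count as human intervention.

def article_22_prohibits(solely_automated: bool,
                         meaningful_human_review: bool,
                         legal_or_similar_effect: bool,
                         necessary_for_contract: bool,
                         authorised_by_law_with_safeguards: bool,
                         explicit_consent: bool) -> bool:
    if not solely_automated or meaningful_human_review:
        return False  # outside Article 22 (1) altogether
    if not legal_or_similar_effect:
        return False
    # Article 22 (2)(a)-(c) exceptions:
    if necessary_for_contract or authorised_by_law_with_safeguards or explicit_consent:
        return False
    return True

# The credit-scoring example: processing necessary for a contract escapes
# the prohibition even though the decision is fully automated.
assert not article_22_prohibits(True, False, True, True, False, False)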


Data protection – criminal offences, criminal penalties and the protection of public security

The sister directive to the GDPR is the Law Enforcement Directive. The DPA 2018 implements the Law Enforcement Directive within Part 3, which applies to law enforcement processing.

Application to algorithms and machine learning

Part 3 of the DPA 2018 applies to algorithms and machine learning in so far as they are utilised by a “competent authority” as part of a law enforcement purpose (s.29 DPA 2018).

A “competent authority” is:

Any UK government department other than a non-ministerial government department, Scottish Ministers, any Northern Ireland department, the Welsh Ministers, the police, HMRC, the Welsh Revenue Authority, Revenue Scotland, the National Crime Agency, the Serious Fraud Office, Border Revenue, the FCA, the HSE, the CMA, the GEMA, the FSA, the Land Registry, the CCRC, the SCCRC, the Probation Service and the DPP, among others (s.30 (1) DPA 2018 & Schedule 7); or

Any other person exercising statutory functions for law enforcement purposes i.e. the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security (s.30 (1)(b) & s.31 DPA 2018).

An intelligence service is not a competent authority (s.30 (2) DPA 2018). Intelligence services are regulated separately under Part 4 of the DPA 2018, including in relation to algorithms.


Lawful processing – ordinary processing

Under the DPA 2018, personal data can only be processed if:

  • The processing is for a law enforcement purpose and is based on law (s.35 (2)); and either
  • The Data Subject has given consent to the processing for that purpose (s.35 (2)(a)); or
  • The processing is necessary for the performance of a task carried out for that purpose by a competent authority (s.35 (2)(b)).
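
Expressed as a rule of thumb, the s.35 (2) gate might be sketched as follows (illustrative only; the parameter names are ours and the statutory tests are more nuanced):

# Illustrative sketch only: the s.35 (2) DPA 2018 gate for ordinary
# law enforcement processing - based on law, and consent OR necessity.

def s35_ordinary_lawful(for_law_enforcement_purpose: bool,
                        based_on_law: bool,
                        consent: bool,
                        necessary_for_task: bool) -> bool:
    return (for_law_enforcement_purpose
            and based_on_law
            and (consent or necessary_for_task))

# Neither consent nor necessity: the processing is unlawful.
assert not s35_ordinary_lawful(True, True, consent=False, necessary_for_task=False)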

Lawful processing – sensitive processing

Sensitive processing occurs where the processing reveals racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership; where the processing is of genetic data or of biometric data for the purpose of uniquely identifying an individual; where the processing is of data concerning health; or where the processing is of data concerning an individual’s sex life or sexual orientation (s.35 (8) DPA 2018).

In addition to the requirements in relation to ordinary processing, under the DPA 2018, sensitive processing is only permitted in two further circumstances.

The first situation is where:

  • The Data Subject has given consent to the processing (as above) (s.35 (4)); and
  • At the time of the processing, the Controller has an appropriate policy document in place (s.35 (4)).

The second situation is where:

  • The processing is strictly necessary for the law enforcement purpose (as above) (s.35 (5)(a));
  • One of the conditions in Schedule 8 is met (s.35 (5)(b)), namely that:
      • it is necessary for a statutory purpose and in the substantial public interest, for the administration of justice, to protect an individual’s vital interests or to safeguard children and other individuals at risk;
      • the personal data has already manifestly been made public by the Data Subject;
      • it is necessary in connection with legal proceedings, to obtain legal advice or to establish, exercise or defend legal rights;
      • a court or other judicial authority is acting in its judicial capacity;
      • it is necessary to prevent fraud; or
      • it is necessary for archiving, scientific or historical research or statistical purposes; and
  • At the time of the processing, the Controller has an appropriate policy document in place (s.35 (5)(c)).


An appropriate policy document is defined by s.42 (2) DPA 2018 to mean a document which:

  • Explains the Controller’s procedures for securing compliance with the data protection principles, namely that processing must be lawful and fair (s.35); the law enforcement purpose must be specified, explicit and legitimate (s.36); the processing must be adequate, relevant and not excessive (s.37); the personal data must be accurate and kept up to date where necessary, with reasonable steps taken to correct errors (s.38); the personal data must not be kept for longer than is necessary (s.39); and the personal data must be processed in a manner that ensures appropriate security (s.40); and
  • Explains the Controller’s procedures as regards the retention and erasure of personal data, giving an indication of how long such personal data is likely to be retained.
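
The two gateways, together with the policy document requirement, can be sketched as follows (illustrative only; the Schedule 8 condition names are our own shorthand for the statutory conditions):

# Illustrative sketch only: the two s.35 DPA 2018 gateways for sensitive
# processing by a competent authority.

SCHEDULE_8_CONDITIONS = {
    "statutory_and_substantial_public_interest", "administration_of_justice",
    "vital_interests", "safeguarding", "manifestly_public",
    "legal_proceedings", "judicial_acts", "fraud_prevention",
    "archiving_and_research",
}

def sensitive_processing_lawful(consent: bool,
                                strictly_necessary: bool,
                                schedule_8_conditions: set[str],
                                policy_document_in_place: bool) -> bool:
    if not policy_document_in_place:
        return False  # required by both s.35 (4) and s.35 (5)(c)
    gateway_one = consent                         # s.35 (4)
    gateway_two = (strictly_necessary             # s.35 (5)(a)
                   and bool(schedule_8_conditions & SCHEDULE_8_CONDITIONS))  # s.35 (5)(b)
    return gateway_one or gateway_two

# Fraud prevention can justify sensitive processing without consent,
# but only with an appropriate policy document in place.
assert sensitive_processing_lawful(False, True, {"fraud_prevention"}, True)
assert not sensitive_processing_lawful(False, True, {"fraud_prevention"}, False)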

Prohibition on decisions based solely on automated decision making which lead to discrimination

The DPA 2018 prohibits “a significant decision” being made solely on the basis of automated processing unless that decision is required or authorised by law (s.49 (1)).

A decision is a significant decision if it produces an adverse legal effect concerning the data subject or significantly affects the data subject (s.49 (2)). This will almost certainly include decisions which lead to discrimination. Indeed, this appears to be the ICO’s position (see the “Guide to Law Enforcement Processing”, which addresses the Law Enforcement Directive and is available here).

Since the Equality Act 2010 prohibits discrimination arising from the application of algorithms and machine learning as explained here, it follows that there is a complete prohibition on any solely automated decision making which leads to discrimination as defined by the Equality Act 2010.

This is consistent with the constraints on discriminatory profiling in Article 11 (3) of the Law Enforcement Directive, which states that:

Profiling that results in discrimination against natural persons on the basis of special categories of personal data referred to in Article 10 shall be prohibited, in accordance with Union law.

https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32016L0680&from=EN

Moreover, if the Government does enact legislation which allows solely automated decision making in relation to law enforcement so that s.49 DPA 2018 does not apply, there is a requirement that safeguards are put in place as follows:

  • The Controller must, as soon as reasonably practicable, notify the Data Subject in writing that a decision has been taken (s.50 (2)(a)); and
  • The Data Subject must have the right to request that the Controller, within a certain period of time, reconsider the decision or take a new decision that is not based solely on automated processing (s.50 (2)(b)).

It is important to note that the prohibition on automated decision making does not extend to partially automated decision making. In so far as this type of processing was discriminatory, it would be contrary to the Equality Act 2010 as explored here.

PECR: The use of artificial intelligence for direct marketing

AI can very easily be used to generate emails for direct marketing purposes; however, this activity is heavily regulated by PECR. Regulation 22 sets out the main constraints:

22.—(1) This regulation applies to the transmission of unsolicited communications by means of electronic mail to individual subscribers.

(2) Except in the circumstances referred to in paragraph (3), a person shall neither transmit, nor instigate the transmission of, unsolicited communications for the purposes of direct marketing by means of electronic mail unless the recipient of the electronic mail has previously notified the sender that he consents for the time being to such communications being sent by, or at the instigation of, the sender.

(3) A person may send or instigate the sending of electronic mail for the purposes of direct marketing where—

(a)that person has obtained the contact details of the recipient of that electronic mail in the course of the sale or negotiations for the sale of a product or service to that recipient;

(b)the direct marketing is in respect of that person’s similar products and services only; and

(c)the recipient has been given a simple means of refusing (free of charge except for the costs of the transmission of the refusal) the use of his contact details for the purposes of such direct marketing, at the time that the details were initially collected, and, where he did not initially refuse the use of the details, at the time of each subsequent communication.

(4) A subscriber shall not permit his line to be used in contravention of paragraph (2).
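
Regulation 22 thus reduces to a consent-or-soft-opt-in rule, which the following illustrative sketch captures (the parameter names are our own, and the real legal tests are more nuanced):

# Illustrative sketch only: the Regulation 22 PECR test for sending
# direct marketing email, including the "soft opt-in" in paragraph (3).

def may_send_marketing_email(prior_consent: bool,
                             details_obtained_in_sale_or_negotiation: bool,
                             similar_products_only: bool,
                             opt_out_offered_at_collection_and_each_message: bool) -> bool:
    if prior_consent:                                   # Regulation 22 (2)
        return True
    return (details_obtained_in_sale_or_negotiation     # Regulation 22 (3)(a)
            and similar_products_only                   # Regulation 22 (3)(b)
            and opt_out_offered_at_collection_and_each_message)  # Regulation 22 (3)(c)

# AI-generated copy changes nothing: without consent or the soft opt-in,
# the email cannot lawfully be sent.
assert not may_send_marketing_email(False, False, True, True)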

Breach of this Regulation can lead to enforcement by the ICO relying on sections 40 and 146 of the DPA 2018: see Leave.EU Group Limited and Eldon Insurance Services Limited v The Information Commissioner [2020] 3 WLUK 122.