Never knowingly oversold? Tell me who you are, and I will tell you how much you need to pay!


The blog has been co-authored with Alexandru Cîrciumaru. More information about Alex is available at the end of this blog.


Sooner or later, if you shop online, “you” will be offered a “discount” or “special price” to induce a first or subsequent purchase.  “You” may be offered no reason, or one of myriad explanations for this “special opportunity”: because you are a “student”, “pensioner”, “loyal”, “new to us”, “getting married”, “getting divorced”, “saving for Christmas”, “having a baby”, “on furlough”, “buying in bulk” or have “been in hospital”, etc.[1]

This is all driven by Algorithmic Price Discrimination or “APD”, and in this blog we ask two questions:

(1) What are the legal implications of APD?

(2) Should we be worried that business knows enough to pitch to us online in this way?

As the exam fiasco has developed over the summer, the whole of the UK has woken up to the fact that algorithms can be biased and lead to unacceptable results, yet few really know much about the way in which APD is affecting them every day.  Even fewer understand what it might ultimately achieve. 

Nonetheless APD is beginning to come under greater scrutiny, and it is expected that the Centre for Data Ethics and Innovation (CDEI) will look even more closely during its current work programme. So we too think it’s time to start to understand more about how APD is affecting our choices and what laws apply to limit such discrimination.  In this blog we shall look a little deeper into how such personalised offers are designed, what they mean for us, and what else is going on as retailers work to make us buy and spend more online. 

We shall argue, in response to the questions above, that –

(1) Yes, behind APD is a process that has significant legal consequences; and

(2) Yes – we need to know what controls there are now on APD, and to think about what is needed for the future.  

So, tell me more about APD?

Price discrimination occurs when a seller charges customers different prices for the same product or service. There are two main ways it can happen:

  • Discounts:  Different prices may be offered depending on the quantity purchased, or on the group to which the buyer belongs (e.g. being a student); or,
  • Perfect Personalised Pricing:  For retailers, the Holy Grail is “perfect personalised pricing” (“PPP”).  This occurs when each customer is charged, as exactly as possible, the maximum price they are willing to pay (the “reservation price”).
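
The contrast between the two approaches can be sketched in a few lines of code. This is purely illustrative: the list price, discount rates and figures are invented for the example, not taken from any real retailer.

```python
from typing import Optional

# Illustrative sketch (hypothetical figures): a group discount applies one
# rule to everyone in a category, while perfect personalised pricing charges
# each buyer their individually predicted reservation price.

LIST_PRICE = 100.0
GROUP_DISCOUNTS = {"student": 0.10, "pensioner": 0.15}  # assumed rates

def group_price(group: Optional[str]) -> float:
    """Classic discount: the same price for everyone in the same group."""
    return LIST_PRICE * (1 - GROUP_DISCOUNTS.get(group, 0.0))

def personalised_price(reservation_price: float) -> float:
    """PPP: charge each customer their predicted maximum, capped at list price."""
    return min(reservation_price, LIST_PRICE)

print(group_price("student"))     # 90.0
print(group_price(None))          # 100.0
print(personalised_price(87.50))  # 87.5
```

Note how the first function needs only a category label, while the second needs a per-customer prediction: that prediction is exactly what the data-hungry ML described below is for.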

It is not that easy to create PPP;[4] you need a lot of personal data to predict a person’s reservation price accurately.  However, in the age of AI, where cheap Machine Learning (ML) can be deployed to crunch huge data sets, historic obstacles to PPP are disappearing fast.  This is the developing reality of APD.  Collectively, consumers’ digital footprints are scraped from the internet, and using ML they are processed to predict, on an individual basis, what price individual consumers will be willing to pay.[5] With its potential to maximise profits, and as the ML gets better and better with more and more data, it is predicted that APD could be used to secure near-perfect price discrimination.[6] 
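
To make the mechanism concrete, here is a deliberately toy sketch of the prediction step: a one-variable regression from some behavioural signal to observed spending. The data and the single “signal” feature are fabricated for illustration; real APD systems would apply far richer ML models to many features of a digital footprint.

```python
# Toy sketch (synthetic data, hypothetical feature): regress observed maximum
# spend against one behavioural signal, then use the fitted line to predict a
# new visitor's reservation price.

# (signal, observed max spend) pairs — fabricated for illustration only
history = [(1.0, 55.0), (2.0, 62.0), (3.0, 71.0), (4.0, 80.0), (5.0, 86.0)]

n = len(history)
mean_x = sum(x for x, _ in history) / n
mean_y = sum(y for _, y in history) / n

# Ordinary least squares, closed form for one feature
slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
         / sum((x - mean_x) ** 2 for x, _ in history))
intercept = mean_y - slope * mean_x

def predict_reservation_price(signal: float) -> float:
    """Predict what a visitor with this signal value is willing to pay."""
    return intercept + slope * signal

print(round(predict_reservation_price(3.5), 2))  # 74.8
```

The point is not the statistics but the pipeline: harvested data in, an individual price prediction out, repeated for every visitor.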

That looks good for the retailer, but what about the consumer?

The full extent to which APD is used, and by whom, is not fully known,[7] and that is a concern.  Yet there are already numerous examples of overtly discriminatory advertising and consumer offers online.[8] Some have been considered by the CDEI, though transparency remains a major issue.  We know enough to appreciate that discussion about the benefits and risks of this practice is not purely academic; we think it is essential now to raise awareness and to help frame the developing debate about regulation. In short, we think a deeper review is long overdue.

So, how worried, generally speaking, should we be about APD?

Like most AI applications, APD is neither inherently good nor bad. Just like ordinary price discrimination, APD can have positive welfare effects from an economic perspective through better distribution, lower prices and even increased competition – particularly in oligopolistic markets.[9] There are also claims that certain savvy consumers will devise strategies to secure lower prices – by deleting cookies, delaying purchases or changing their IP address, for example[10] – practices which some consumers already adopt when shopping online. We suspect, though, that algorithms will quickly learn not to fall for such tricks.

On the other hand, as well as the usual risks associated with price discrimination, such as decreasing output and distribution, APD raises numerous other concerns that normal price discrimination does not. We explore these below.

Discrimination through social sorting

If ML is deployed without controls for bias, APD could discriminate quite clearly on the basis of gender, race, or any of the other protected characteristics such as sexual orientation, religion, age or disability, or even on geographic location (which is often a proxy for race).   It is well known that ML has the potential to make decisions based on protected characteristics in this way. In an APD context, this would lead to some groups or individuals either not being supplied at all or being charged excessive prices.

Exploiting cognitive biases

Using APD, firms could harvest data to identify which emotion (or cognitive bias) will prompt us to buy a certain product. In his book “Thinking, Fast and Slow”, Daniel Kahneman presents a number of heuristics and cognitive biases that affect our judgement and have a significant impact on our decision-making. It is a short step for ML to learn what makes us tick and how we think, and for firms to use that information to maximise profits.

We think that some of this is already happening. One such practice is price-steering, which, in the online environment means that a website alters the search results depending on the information it has about a consumer, so as to steer those with a higher reservation price towards more expensive products. For instance, a travel website is known to have steered Mac OS X users towards more expensive hotels in select locations by placing them at higher ranks in search results.[11]
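
A minimal sketch of price-steering might look as follows. The hotel names, prices and profiling flag are all invented for illustration; the point is only that the same catalogue can be re-ranked per visitor, so that a profile predicted to have a high reservation price sees the dearer options first.

```python
# Hypothetical sketch of price-steering: identical inventory, different
# ranking depending on how the visitor has been profiled.

hotels = [("Budget Inn", 60), ("Midtown Hotel", 120), ("Grand Palace", 250)]

def rank_results(hotels, high_reservation_price: bool):
    """Steer visitors profiled as price-insensitive towards expensive options."""
    return sorted(hotels, key=lambda h: h[1], reverse=high_reservation_price)

# A visitor profiled as willing to pay more (e.g. via device fingerprint)
print([name for name, _ in rank_results(hotels, high_reservation_price=True)])
# ['Grand Palace', 'Midtown Hotel', 'Budget Inn']
```

Note that no price changes at all: steering manipulates salience, which is precisely why it is so hard for a consumer to detect.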

There is an even more worrying side to this involving a different level of manipulation.  Business often claims a consensus view that the consumer will always benefit from being given the choice,[12] and we agree real choice for the consumer is definitely in the public interest.  Yet it is obvious that as APD develops the complexity of decisions for consumers will also increase. Consumers will need to be savvy about what they are being offered and why.  It has been suggested[13] that some companies might use this added complexity intentionally to create ‘cognitive overload’ through making choices too numerous and various.  The aim would be to make the act of choosing too tiresome and to increase customer susceptibility.  We think that if the decisions offered are not comprehensible or too overwhelming there is no real choice at all.

Lack of transparency

There are also real concerns that an expanded use of APD could be (1) abusive, because some people (known as “sleepers”) will not really be aware of what is going on and what is happening to them; and (2) destructive of trust in the market, because of the lack of transparency.[14] As the CDEI recognised in its recent AI Barometer, a lack of trust will in the long term stifle socially good innovations.[15]

Are all markets equally affected?

No, there are some so-called “essential markets” where APD raises specific issues.  Economists talk of electricity, water, telecoms and postal services as “essential markets”, but others might sensibly be added to this list.

So far, it is thought that no APD is deployed in these markets in the UK.[16]  This is probably because these markets are already heavily regulated, but that is no reason to ignore them.  With Brexit, it seems likely that increasing pressure will be brought on government to ease this regulation once the pandemic is over.

Beyond the essential markets, it is obvious that, were APD to be deployed, there could be serious consequences.  For example, one market where price regulation is not always so strict is the pharmaceutical industry, and if pharmaceutical companies started using APD for essential medicines the outcomes could be troubling. It is concerning, for example, to imagine the price of medicine being linked to personal data.

So how do you see the case for regulation?

We think that if fully informed, the public at large and politicians in general would think that there is a strong case to be made for regulating APD, over and above our current legal framework.

There are some regulatory systems that seem most apt for this task – 

Competition law is concerned with many of the effects that APD can cause, such as (1) increasing barriers to entry and expansion, or (2) enabling firms to exclude or eliminate competitors or to engage in predatory pricing.[17]  So far the law remains largely as it was pre-Brexit.   We think that APD could be caught under Article 102 TFEU, which prohibits a dominant undertaking from abusing its position on the market,[18] particularly since discriminatory prices, at business-to-business level, have been found to be in breach of this provision.[19] The European Commission has expressed the view that Article 102(c) should apply to these practices.[20]

We think it significant that the EU Commissioner for Competition, Margrethe Vestager, is also in charge of the portfolio for “A Europe Fit for the Digital Age”.   She has already demonstrated her determination to get to grips with overweening power in the digital market.[21]

While commentators say that the current case law of the Court of Justice of the European Union (CJEU) on Article 102 could theoretically be extended to include APD,[22] this is not without difficulty. The CJEU would need to consider a quite specific case where this seemed the right way to go.  Above all, even where the use of APD could qualify as an abuse, the relevant undertaking would also need to have a dominant position in the relevant market for a breach of Article 102 TFEU to occur.   So, we think that Article 102 is not the whole solution: APD could certainly hurt consumers even if put into practice by firms not dominant in a particular market.

Given the need for large amounts of data for APD to be effective, data protection legislation is another relevant avenue which could be used to regulate this practice. Here, the core piece of legislation is the General Data Protection Regulation (GDPR), which came into force in May 2018 and which aims to give individuals control over their data and to make data processing fair and transparent.  The implications of the GDPR for AI of all kinds are discussed on the AI Law Hub.[23]  There can be little doubt that the profiling involved in APD amounts to “processing” of “personal data” within the definitions in Article 4 GDPR, and so a lawful basis for that processing would be necessary.  The lawful bases under Article 6 are limited: consent, performance of a contract or a legal obligation, protection of vital interests, public interest or official authority, or legitimate interests.  We do not see that PPP would fall within any of these bases other than consent.  The Information Commissioner’s Office guidance on data processing will be very important in assessing what is lawful and how consent works in this context.[24]

Consumer Protection is another avenue.  On 27 November 2019, the EU adopted Directive (EU) 2019/2161 – the so-called Omnibus Directive – aiming to better protect consumers, especially in the online marketplace.[25]  Recital 45 of the Omnibus Directive recognises that traders may engage in APD, and the Directive establishes that consumers have a right to be clearly informed when that happens; it amends the Consumer Rights Directive accordingly.[26] This is an important amendment because it recognises not only the existence of APD but also the need to take positive steps to protect consumers. It is also an important first step towards regulating this practice at EU level.  We doubt that retailers realise they are under this obligation.

EU and UK anti-discrimination law is well established and far-reaching. It is clear that it will apply to APD insofar as it affects individuals’ access to goods and services, should that access be prohibited (or altered) on the basis of racial or ethnic origin or gender. In the UK, should APD be based on protected characteristics, it would certainly fall within the ambit of the Equality Act 2010.  An explanation of how this Act applies to all kinds of AI is on the AI Law Hub.[27]

Which avenue would be the best? How should I proceed?

Having considered all these avenues, the best way forward seems to be to bring each of these strands together in a coherent, consistent and complementary way.  While the fact that current legislation can mitigate some of the negative impacts of APD is encouraging, there is still more work to be done to fully address all risks. This work could be done by amending existing legislation, adopting new regulation or, ideally, a combination of the two. It is expected that the EU consultation on future regulation will address APD as a specific issue and we expect that in the UK the CDEI will look at best practice in greater detail both in its forthcoming Bias Review and also when it returns to this issue specifically.


In the meantime, when shopping online, check the price of any large purchase that is offered using different IP addresses or after having deleted your cookies.  You could be in for a surprise.
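
For anyone who wants to record the result of such a check, the comparison itself is trivial; the sketch below uses invented figures and simply quantifies the gap between the price you were quoted in a normal session and the price shown to a “clean” session (cookies deleted or a different IP address).

```python
# Offline sketch of the suggested check — the two quotes are recorded by
# hand from the two browsing sessions; the figures here are hypothetical.

def personalisation_gap(quoted_price: float, clean_session_price: float) -> float:
    """Positive result = you were quoted more than an anonymous visitor."""
    return round(quoted_price - clean_session_price, 2)

print(personalisation_gap(104.99, 99.99))  # 5.0
```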

Alexandru obtained an LLM in European Union Law from the College of Europe (Bruges) and has been called to the Bar of England and Wales after receiving a Major Scholarship from the Inner Temple. Alexandru works as a legal intern for the Competition and Markets Authority, and as a Senior Fellow on AI for The Good Lobby, a Brussels-based NGO. In autumn 2020, Alexandru will start an MPhil in Law at the University of Oxford, researching the role of the CJEU in regulating AI.

The opinions in this blog are those of the AI Law Hub and Alexandru and are not those of any of the organisations above.

Alexandru can be reached at

[1] Some of the many ways discounts can be devised and tailored, and their respective merits, can be seen at

[2] See for instance the interesting blog at  and see also the Competition and Market Authority’s concerns at

[3] It published an “Interim report: Review into online targeting” in July 2019; see

[4] It is debated how possible it is to put it widely into practice: see, for example, Spiekermann, S. “Individual Price Discrimination – An impossibility?”, International Conference for Human-Computer Interaction (CHI’2006), Workshop on Privacy and Personalization, Montreal, Canada, May 2006

[5] C. Townley, E. Morrison, K. Yeung, “Big Data and Personalised Price Discrimination in EU Competition Law” (October 6, 2017). King’s College London Law School Research Paper No. 2017-38. Available at SSRN: or

[6] For a more general discussion, see A. Ezrachi, M. Stucke, “Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy”, Harvard University Press, 2016.

[7] Ibid.

[8] See e.g. the survey in R. Allen, D. Masters, “Regulating for an Equal AI: A New Role for Equality Bodies”, Equinet, 2020, and C. Orwat, “Risks of Discrimination through the Use of Algorithms”, FADA, 2019.

[9] P. Papandropoulos, “How Should Price Discrimination Be Dealt With By Competition Authorities?”, Concurrences, 2007, available here:

[10] See C. Townley, E. Morrison, K. Yeung, op. cit., p. 701.

[11] A. Hannak et al, “Measuring Price Discrimination and Steering on E-Commerce Websites”, available at:

[12] See C. Townley, E. Morrison, K. Yeung, op. cit., p. 701.

[13] See A. Ezrachi, M. Stucke, op. cit. p. 109.

[14] A. Priester, T. Robbert, S. Roth, “A special price just for you: effects of personalized dynamic pricing on consumer fairness perceptions”, Journal of Revenue and Pricing Management, 2020.

[15] See e.g. p. 15: see

[16] M. Wild, M. Thorne, “A Price of One’s Own: An Investigation into Personalised Pricing in Essential Markets”, Citizen Advice, 2018, available at:

[17] See A. Ezrachi, M. Stucke, op. cit. p. 109.

[18] See

[19] See, for example, Case 27/76 United Brands v Commission [1978] ECR 207.

[20] Commission decision, Deutsche Post AG, OJ 2001 L331/40 (not appealed), para. 133. See also L. Gormsen, A Principled Approach to Abuse of Dominance in European Competition Law (Cambridge: Cambridge University Press, 2010), 107–10.

[21] Commissioner Vestager is creating her own fairness-based school of competition policy; see, e.g., M. Volmar, K. Helmdach, “Protecting consumers and their data through competition law? Rethinking abuse of dominance in light of the Federal Cartel Office’s Facebook investigation”, European Competition Journal, Issue 2-3, Volume 14, 2018.

[22] See C. Townley, E. Morrison, K. Yeung, op. cit., p. 730.

[23] See

[24] See

[25] Directive (EU) 2019/2161 of the European Parliament and of the Council amending Council Directive 93/13/EEC and Directives 98/6/EC, 2005/29/EC and 2011/83/EU of the European Parliament and of the Council as regards the better enforcement and modernisation of Union consumer protection rules, with the aim of better protecting consumers, especially in the online marketplace.

[26] The Recital says “Traders may personalise the price of their offers for specific consumers or specific categories of consumer based on automated decision-making and profiling of consumer behaviour allowing traders to assess the consumer’s purchasing power. Consumers should therefore be clearly informed when the price presented to them is personalised on the basis of automated decision-making, so that they can take into account the potential risks in their purchasing decision. Consequently, a specific information requirement should be added to Directive 2011/83/EU to inform the consumer when the price is personalised, on the basis of automated decision-making. This information requirement should not apply to techniques such as ‘dynamic’ or ‘real-time’ pricing that involve changing the price in a highly flexible and quick manner in response to market demands when those techniques do not involve personalisation based on automated decision-making. This information requirement is without prejudice to [GDPR], which provides, inter alia, for the right of the individual not to be subjected to automated individual decision-making, including profiling.”

[27] See
