New research shows that AI can silently replace one public price with many private offers for the same product.
This discovery turns digital pricing into a question of fairness, because buyers never know when they will be chosen to pay more.
Hidden checkout calculation
In today’s online checkouts, the same product can appear at different prices to different people at the same time.
Dr Miroslava Marinova from the University of East London (UEL) argues that platforms can go beyond broader market signals and push pricing towards each buyer’s personal limits.
These differences are invisible to the people facing them, so a single product can split into many invisible price versions.
That is the line this article walks as the discussion moves from hidden prices to how the law should treat them.
AI estimates willingness to pay
Behind this hidden spread is algorithmically personalized pricing: software sets a price for one person, rather than posting a public price that everyone sees.
Rather than simply following demand, the system estimates the shopper’s willingness to pay, the highest price he or she would accept before walking away.
Clicks, location, purchase history, and even hesitation can sharpen that estimate for individuals rather than groups.
The shift from market prices to private prices transforms familiar retail strategies into more difficult legal issues.
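To make the idea concrete, here is a minimal sketch of how such an engine might score behavioral signals into a per-shopper price. This is an illustration only, not the method described in the research: every field name, weight, and threshold below is a hypothetical assumption.

```python
# Hypothetical sketch: turning behavioral signals into a private price.
# All weights, field names, and thresholds are illustrative assumptions,
# not taken from the research discussed in the article.

BASE_PRICE = 50.0  # the would-be public shelf price

def estimate_willingness_to_pay(shopper: dict) -> float:
    """Score signals (clicks, location, history, hesitation) into a
    multiplier on the base price, i.e. a private price for one person."""
    multiplier = 1.0
    # Repeated product views suggest strong interest.
    multiplier += 0.02 * min(shopper.get("product_page_views", 0), 10)
    # Location can proxy for ability to pay.
    multiplier += 0.10 if shopper.get("high_income_postcode") else 0.0
    # A history of paying full price suggests low price sensitivity.
    multiplier += 0.05 * min(shopper.get("past_full_price_purchases", 0), 4)
    # A quick, decisive checkout suggests the shopper won't walk away.
    if shopper.get("seconds_hesitating_at_checkout", 60) < 10:
        multiplier += 0.05
    return round(BASE_PRICE * multiplier, 2)

# Two shoppers, one product, two different private prices.
eager = {"product_page_views": 8, "high_income_postcode": True,
         "past_full_price_purchases": 3, "seconds_hesitating_at_checkout": 5}
casual = {"product_page_views": 1}

print(estimate_willingness_to_pay(eager))   # prints 73.0
print(estimate_willingness_to_pay(casual))  # prints 51.0
```

Note what the sketch makes visible: neither shopper ever sees the other's number, or the base price, which is exactly the transparency problem the article describes.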
Personal pricing feels unfair
Consumer experiments have found that people rate individual prices as more unfair than segment prices, even when both are based on data.
Social comparison drives that response, as buyers judge prices by comparing them to what they think others are getting.
“When pricing becomes invisible and individualized, equity becomes a central issue,” says Dr. Marinova.
When shoppers suspect private penalties, trust quickly declines, and even technically efficient systems begin to look exploitative.
When control changes everything
Under Article 102 of the Treaty on the Functioning of the European Union, dominant companies cannot impose unfair selling prices.
This is important because the paper treats hidden private pricing as an exploitative abuse, a practice that uses market power directly against buyers.
Unlike storewide sales, the concern here is that consumers will be treated unequally without a clear reason that they can verify.
The case is strongest when the firm faces weak competitive pressure, which is why dominance is central.
Old law, new code
The team’s argument relies on old competition law rather than waiting for new AI-specific legislation.
Because the software can update prices instantly and silently, regulators may struggle to spot patterns without a record of how the system behaves.
“The next step is for regulators to move from theory to action,” Marinova said.
As systems become harder to read, the discussion shifts from concerns about abstract AI to auditing, explaining, and proving objective reasons.
Why is the UK paying attention?
UK competition law already prohibits companies from abusing a dominant position, including using unreasonable selling prices.
This wording leaves room for the same concerns that Marinova raises under EU law to arise even after Brexit changes the system.
The 2026 government consultation also proposes increasing the powers of the Competition and Markets Authority (CMA) to investigate algorithms across competition and consumer protection.
New powers, rather than new principles, may prove the more urgent need for UK regulators in practice.
Lack of transparency
Price transparency disappears when every shopper sees a slightly different offer and there are no open shelf prices left.
Without a common reference point, people cannot know whether they have found a bargain or have been chosen to pay more.
Search tools and comparison sites are only useful if sellers publish comparable prices, but hidden personalization is built to avoid this.
In such situations, competitive discipline is weakened, especially when platforms control search, data, payments, and final checkout.
Legitimate gaps and hidden gaps
Not all personalized pricing is automatically abusive, as companies often charge different amounts for genuine cost or loyalty reasons.
Student discounts, clearance sales, and local shipping charges are all based on reasons that are typically recognizable to the buyer.
When pricing is hidden and personalized, buyers have little power to resist.
At that point, the pricing system stops looking like smart merchandising and starts looking like personal extraction from the buyer.
What regulators can do now
Effective monitoring starts with records that show what data shaped prices, when software was changed, and why.
Auditors must examine the code’s stated purpose as well as its inputs, rule overrides, and group-wide results.
Regulators may also need the power to test the actual system, rather than simply requiring paperwork. If pricing logic can be reconstructed and replayed, it becomes harder for companies to hide discrimination inside automation.
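One concrete shape such records could take is sketched below. The schema and field names are assumptions for illustration, not a regulatory standard or anything proposed in the research: the point is simply that each price decision logs its inputs, the software version, and the stated reason.

```python
# Illustrative sketch of a price-decision audit record: which data
# shaped the price, which model version produced it, and why.
# The schema and all field names are hypothetical assumptions.
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class PriceDecisionRecord:
    product_id: str
    shopper_id: str          # pseudonymised identifier for the audit trail
    quoted_price: float
    public_base_price: float
    model_version: str       # ties the price to a specific software change
    input_signals: dict      # the data that shaped this price
    stated_reason: str       # the objective justification, if any
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

record = PriceDecisionRecord(
    product_id="SKU-1042",
    shopper_id="anon-7f3c",
    quoted_price=73.0,
    public_base_price=50.0,
    model_version="pricing-engine v2.3.1",
    input_signals={"page_views": 8, "checkout_hesitation_s": 5},
    stated_reason="none recorded",
)

# An auditor replaying the log can flag unexplained personal markups.
markup = record.quoted_price / record.public_base_price - 1
print(json.dumps(asdict(record), indent=2))
print(f"personal markup: {markup:.0%}")  # prints "personal markup: 46%"
```

A record like this is what turns the article's "auditing, explaining, and proving objective reasons" from a slogan into a checkable artifact: a 46% markup with "none recorded" as the stated reason is exactly the pattern a regulator would question.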
AI pricing is most problematic when AI converts personal data into private prices and hides the reason from buyers.
Clearer disclosures, stronger investigative powers, and better audit trails do not prohibit personalization, but they make it easier to prove unwarranted targeting.
The research is published in the Journal of Competition Law and Economics.
