A few weeks ago, on a flight back from London, I fell into a deep hole I haven't been able to climb out of since. I knew how much I'd paid for my seat, and how many miles I'd spent on an upgrade. But I didn't know whether the woman sitting across the aisle from me had used only a few points, like me, or paid the $10,000 or more the airline could charge for the same trip. Booking a flight has long meant playing a game whose rules only the airlines know, with countless booking codes, loyalty programs, and fare changes weaponizing your data against your wallet. But after I landed, I kept seeing the same rigged game everywhere: every time I got into an Uber, every purchase I made online, every trip to the supermarket. All of these businesses know me so well now that I can almost see the numbers flashing above my head: the exact price I'm willing to pay at any given moment. Your own numbers are flashing above your head, too.
In the age of algorithms, dynamic pricing is becoming ever more prevalent in digital commerce, with prices rising and falling in real time.
Even more alarming is the rise of personalized pricing, a technique in which digital retailers leverage users' own data to charge them exactly the price they're willing to pay, which may differ from the price their neighbors pay. Personalized pricing not only threatens to entrench bias and drive inflation; it also creates a world where we never know when an app is ripping us off.
Now, whenever I try to pay for something on my phone or laptop, I wonder if I could pay less if I used someone else's account.
I still remember the mild shock I felt a decade ago when I learned that price discrimination is often perfectly legal in the United States. In law school, my antitrust professor introduced me to the little-known Robinson-Patman Anti-Discrimination Act, a relic of the Great Depression, and was quick to point out that the law is wholly unworthy of its name. Under this long-standing statute, a business faces devastating penalties for price discrimination only when it discriminates against other businesses. If a wholesaler overcharged a store, the store could sue, but there was nothing then (and there is nothing now) to stop a store from doing the same to its customers. In other words, store owners have more price protection than their own shoppers. If a store charges some customers more than others because of their gender, race, or another legally protected characteristic, that is flatly illegal. But if a business squeezes each individual customer for the maximum price that person is willing to pay, it is free to pick our pockets.
I say "mild shock" because personalized price discrimination was less widespread, and less harmful, back then than it is today. Sure, coupon culture let companies sell the same product at different prices at the same time in the same store, but it left customers with discretion: price-sensitive shoppers took the time to hunt for a discount, while less thrifty shoppers paid full price. Much of traditional price discrimination, like coupons, loyalty cards, and seasonal discounts, lets shoppers choose which price group they fall into.
But algorithmic price discrimination takes away that choice. And the ways data is extracted to sort people into price groups are more intrusive than you might think. Consider your most recent Uber ride. When you ordered that car, you probably knew that the distance of the trip and the time of day were factored into the price. We've become grudgingly accustomed to the ruthless, exploitative efficiency of surge pricing. But did you ever think to charge your phone before ordering the ride? If you had, you might have saved a few dollars. That's because Uber has allegedly used battery level as one of the factors in pricing a ride, a charge the company vigorously denies. If the allegations are true, it's easy to see the appeal: people with low batteries are more desperate than others, and someone whose phone will die in minutes will likely pay almost any price to get a car before being stranded.
As The American Prospect recently detailed, this kind of individualized pricing has proliferated in nearly every sector of the economy (streaming, fast food, even dating apps), and it can be surprising which variables drive higher prices. In the 2010s, retailers relied on relatively crude data to hone their pricing: customers might have paid more for a flight booked on a Mac (but not on a PC), or more for test prep in zip codes with large Asian communities. But in recent years, companies have moved beyond that kind of group-level price discrimination toward truly individualized pricing.
Retailers know a staggering amount about what you buy, both on and off their platforms. And you have no way of knowing when your choices will change how much you pay. For retailers like Walmart, mining our shopping history isn't enough. In February, the retail giant agreed to acquire the smart-TV maker Vizio for more than $2 billion, a deal that could hand Walmart a windfall of personal consumer data. Smart TVs can not only monitor what we watch with Orwellian precision, but also track other nearby devices with ultrasonic beacons and even eavesdrop on what we say in the privacy of our homes. Vizio itself was fined millions of dollars for allegedly spying on its customers illegally.
Retailers can know not only what customers have bought and how much they earn, but also where they are, how their day is going, even what mood they're in. All of it can be fed into an AI model that estimates how much you, specifically, will pay for a particular item at a particular moment.
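To make that mechanic concrete, here is a minimal, purely hypothetical sketch in Python of how behavioral signals could be folded into a single "willingness to pay" markup. Every feature name and weight below is invented for illustration; none of it is drawn from Uber's, Walmart's, or any other company's actual system.

```python
# Purely illustrative: a toy "willingness to pay" model.
# Every feature name and weight here is invented; this is not
# drawn from any company's actual pricing system.

BASE_PRICE = 20.00  # hypothetical base fare for a ride

# Invented weights: how strongly each surveillance signal nudges
# the price upward, as a fraction of the base fare.
WEIGHTS = {
    "battery_low": 0.25,       # phone about to die -> more desperate
    "few_alternatives": 0.20,  # underserved area, no competing options
    "late_night": 0.15,        # fewer outside options at 2 a.m.
    "rarely_comparison_shops": 0.10,  # history suggests price-insensitivity
}

def personalized_price(signals: dict) -> float:
    """Scale the base fare by a markup built from behavioral signals."""
    markup = sum(weight for name, weight in WEIGHTS.items() if signals.get(name))
    return round(BASE_PRICE * (1 + markup), 2)

# Two riders requesting the identical trip at the same moment:
print(personalized_price({}))                                               # 20.0
print(personalized_price({"battery_low": True, "few_alternatives": True}))  # 29.0
```

The unsettling part isn't the arithmetic, which is trivial; it's the surveillance feeding it.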
No corner of commerce is too personal to be off limits. Dating apps harvest data from our love lives, and some openly brag about using it to turn a profit. Many apps that don't publicize their personalized pricing engage in it all the same. Tinder rarely discusses its pricing techniques, but Mozilla and Consumers International recently found that the dating app uses dozens of variables to dramatically adjust prices for individual users. Your age, gender, and sexual orientation may determine how much an AI decides you should pay for love.
If left unchecked, personalized pricing will harm society as a whole. "Hidden algorithmic price discrimination can undermine public trust in pricing mechanisms and thereby undermine markets," says Nicholas Guggenberger, an assistant professor at the University of Houston Law Center. AI pricing also means that the most desperate and vulnerable end up paying the highest prices. Worse yet, it can disadvantage people because of their race, age, or class. Take the allegations about phone batteries. Older people are more than twice as likely as younger users to have phones that are at least three years old. Because older smartphones tend to have weaker batteries, seniors may end up paying more than younger riders for the same Uber trip.
“Algorithmic price discrimination can essentially automate usury,” Guggenberger said. “If you're in a rural area and your battery is running low, a ride-sharing app could dramatically increase your 'personalized price.'”
Much of AI pricing acts as a regressive tax, taking the least from the most advantaged and the most from those who can least afford it. People who live in underserved areas, with fewer stores and fewer options, often have no choice but to click "buy now," even when it hurts. As the law professor and consumer advocate Zephyr Teachout told The American Prospect, we shouldn't think of this practice as something as innocuous-sounding as personalized pricing; she calls it "surveillance pricing."
We know how to prove discrimination by humans. If stores in majority-Black neighborhoods charge higher prices than stores in majority-white neighborhoods, inspectors can visit each store, record the prices, and bring suit. That kind of inspection has been the core of consumer protection for the better part of a century. But how do we prove that an algorithm discriminates? There are no stores to visit and no price tags to compare, just millions of siloed screens in people's pockets. The result is a catch-22: plaintiffs often can't get enough data to prove discrimination until they sue a company, but they can't sue without the data in the first place. We could see the rise of a strange, twisted body of law in which companies that use bias-prone AI to secretly adjust prices face less legal scrutiny than brick-and-mortar stores.
My hope is that this situation is so dire, and the potential for abuse so obvious, that even our dysfunctional democracy will refuse to tolerate it. Our lawmakers have been dismayingly slow to curb the harms of new technology, even when it is clearly undermining our democracy. But even in these polarized times, AI pickpocketing may be one of the few issues that can unite us in outrage.
Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (STOP), a New York-based civil-rights and privacy organization.
Correction 7/10/24 — A reference to Amazon has been removed from this article. The company says it doesn't change prices based on consumer demographics or purchasing behavior.