The Roundtable
Welcome to the Roundtable, a forum for incisive commentary and analysis
on cases and developments in law and the legal system.
By: Izzy Chapman
As of this November, New York has become the first state in the nation with a law explicitly targeting personalized pricing. The law–General Business Law 349-a–requires that all companies which use algorithms to set prices to say so in a public disclosure: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” The expansion and development of AI has led companies to use it to alter prices based on user behavior. For example, the algorithm can artificially raise prices for Internet users with a history of expensive clothing purchases while maintaining reasonable options for those who do not. It allows companies to optimize profits by raising prices for only those customers whose purchasing histories indicate they would be amenable to a more expensive version of a product. Although increasing in popularity and efficacy with the current advent of AI, the use of personal data to adjust prices and strategies is not new. In 2012, for instance, The Wall Street Journal reported on the travel site Orbitz, which had displayed pricier hotel ads for Mac users than PC users. However, price fixing can be challenging to document. Last January’s Federal Trade Commission report indicated that businesses can track behaviors like Internet users’ mousepad movements and unpurchased items left in an online cart to adjust prices and display. But few companies will admit to this kind of surveillance. Chad Yoes, who used to oversee Walmart’s prices, maintained that retailers restricted such a level of personal data analysis to rewards programs, and that in any case, the New York law would only serve to breed fear and mistrust between customer and retailer. The law has accrued significant backlash from the retail sector. 
Ryan Thornton of Uber denied that the company uses customers' data to set individualized prices at all and dismissed the new law as "ambiguous," while the National Retail Federation similarly called it "misleading and ominous" and echoed Walmart in insisting that personal data-based adjustments are used only to lower prices for members of companies' voluntary rewards programs. Even as the companies the law targets insist it stokes unnecessary fear, users offer anecdotes as evidence. Consumer Watchdog researcher Justin Kloczko recounts searching for an Uber, from the very company that explicitly denied the allegations, at the same time and place as his wife, who was shown markedly cheaper options. Delta Air Lines is unusual in openly promoting a personalized pricing plan. And the Justice Department recently settled a lawsuit with the real estate software company RealPage, which came under fire for using landlords' data to power an algorithm that set artificially high rents.

The law has received attention as one of the first tangible steps toward addressing the use of AI in pricing. Lina Khan, formerly of the Federal Trade Commission and now part of incoming New York mayor Zohran Mamdani's transition team, called it "vital" but emphasized the need to do more; AI lawyer Goli Mahdavi said the bill reflects the next "big battleground." Forcing companies to be transparent about personalized pricing is a crucial step toward both publicizing and discouraging the practice. At the same time, the disclaimer the law requires bears a striking resemblance to the cookie agreements that Internet users tend to click "agree" on without thinking when they first visit a website. Although well-intentioned and seemingly inescapable, constant cookie pop-ups have had a numbing effect on their audience, and some companies have learned to sidestep the question of consent by tucking the "opt out" option away from the front page.
Yet such a law at least requires companies to be upfront and transparent, and perhaps that requirement will discourage some surveillance behavior in the first place by forcing it out of the shadows. New York's law may be the first of its kind, but a long line appears behind it: ten other states have similar laws currently under review, some of which would ban the practice outright rather than merely require disclosure.

Works Cited

Balk, Tim. "Personalized Surveillance Pricing, AI and What It Means for New York." The New York Times, November 29, 2025. https://www.nytimes.com/2025/11/29/nyregion/personalized-surveillance-pricing-ai-new-york.html.

Federal Trade Commission. "FTC Surveillance Pricing Study Indicates Wide Range of Personal Data Used to Set Individualized Consumer Prices." Press release, January 17, 2025. https://www.ftc.gov/news-events/news/press-releases/2025/01/ftc-surveillance-pricing-study-indicates-wide-range-personal-data-used-set-individualized-consumer-prices.

Kaye, Danielle. "Personalized Pricing." The New York Times, July 26, 2025. https://www.nytimes.com/2025/07/26/business/dealbook/personalized-pricing.html.

Mattioli, Dana. "On Orbitz, Mac Users Steered to Pricier Hotels." The Wall Street Journal, August 23, 2012. https://www.wsj.com/articles/SB10001424052702304458604577488822667325882.

McCabe, David. "RealPage, DOJ Settle Rent-Pricing Suit." The New York Times, November 24, 2025. https://www.nytimes.com/2025/11/24/technology/realpage-doj-settlement.html.

Robeson, Nate. "AI Creates a New Antitrust Puzzle." Politico, October 20, 2025. https://www.politico.com/newsletters/digital-future-daily/2025/10/20/ai-creates-a-new-antitrust-puzzle-00615415.

The State of New York, Office of the Governor. "Protecting New Yorkers From Secret Online Price Hikes: Governor Hochul Announces Nation-Leading Surveillance Pricing Law Now in Effect." November 24, 2025. https://www.governor.ny.gov/news/protecting-new-yorkers-secret-online-price-hikes-governor-hochul-announces-nation-leading.