AI and machine learning are giving companies an edge they’ve never enjoyed before. Businesses can pinpoint exactly which customers (and which kinds of customers) are most likely to buy their products, and a whole lot more.
Such as when those customers are most likely to shop and buy, and how high a price they’ll tolerate. Tech advances are making it easier than ever to track consumers, and some of the richest, most influential companies appear to be all-in on “surveillance pricing.”
Fine Line Between Dynamic & Surveillance Pricing
Dynamic pricing, aka flexible pricing, is nothing new. Retailers and airlines, for example, monitor data constantly and adjust prices accordingly. Many companies now use a combination of AI, machine learning algorithms and other technologies to set optimal prices for all customers.
The line between dynamic and surveillance pricing is a bit murky. So what exactly is the cardinal sin? Spying on the buying public through ever-present listeners like Siri and Alexa? Analyzing credit card or loyalty card purchases to create sales offers? Using cookies to track web searches? All of these practices, after all, are widely used, legal ways to set prices.
Where companies must draw the line, and never cross it, is setting different prices for different customers. That’s clear-cut price discrimination, in which companies “try to maximize profits by offering prices that are acceptable to different groups of consumers, defined by attributes such as age, socio-economic status and their affinity for certain goods and services,” according to the Wall Street Journal.
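To make that distinction concrete, here is a minimal, purely hypothetical Python sketch. The first function is dynamic pricing: one price for everyone, nudged up or down by market demand. The second is the individualized, surveillance-style pricing regulators are scrutinizing: a different quote per shopper keyed to personal attributes. The function names, customer fields and multipliers are invented for illustration and don’t come from any real pricing system.

```python
# Illustrative only: the thresholds, multipliers and customer fields are made up.

def dynamic_price(base_price: float, current_demand: float, typical_demand: float) -> float:
    """Dynamic (flexible) pricing: one price for everyone, adjusted to market conditions."""
    demand_ratio = current_demand / typical_demand
    # Raise or lower the shelf price with demand, capped to a +/-20% band.
    multiplier = max(0.8, min(1.2, demand_ratio))
    return round(base_price * multiplier, 2)

def personalized_price(base_price: float, customer: dict) -> float:
    """Surveillance-style pricing: a different quote per shopper, based on personal data.

    This is the pattern critics call price discrimination when it keys off
    attributes like income bracket or browsing behavior.
    """
    multiplier = 1.0
    if customer.get("income_bracket") == "high":
        multiplier += 0.15   # charge affluent shoppers more
    if customer.get("searched_item_repeatedly"):
        multiplier += 0.10   # urgency inferred from browsing history
    return round(base_price * multiplier, 2)

# Everyone sees the same 120.00 quote when demand runs hot...
print(dynamic_price(100.0, current_demand=450, typical_demand=300))
# ...but this particular shopper gets quoted 125.00 based on who they are.
print(personalized_price(100.0, {"income_bracket": "high", "searched_item_repeatedly": True}))
```

The point of the sketch is the input, not the math: the moment personal characteristics (income, browsing history, search behavior) feed the multiplier, flexible pricing becomes personalized pricing.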
Consumers are rebelling against dynamic pricing techniques like burger chain Wendy’s charging higher prices at different times of day, the Journal notes. That’s why the Federal Trade Commission (FTC) “believes that consumers don’t understand how their personal information may be used to tailor prices to them.” And now it’s demanding answers.
Feds Want the Dirt on Surveillance Pricing
The FTC has ordered eight companies that offer surveillance pricing products and services, which use consumer data, to turn over information to the agency and (ultimately) the public. Specifically, the FTC wants to know about the impact on “privacy, competition and consumer protection … [and] when pricing is based on surveillance of an individual’s personal characteristics and behavior.”
The eight companies being called on the carpet are Accenture, Bloomreach, JPMorganChase, Mastercard, McKinsey, PROS, Revionics and Task Software. FTC chair Lina Khan believes these firms “could be exploiting this vast trove of personal information to charge people higher prices. Americans deserve to know whether businesses are using detailed consumer data to deploy surveillance pricing, and the FTC’s inquiry will shed light on this shadowy ecosystem of pricing middlemen.”
Spying on customer data to fix prices, or wages, is a much riskier proposition after changes made last year: the Department of Justice’s antitrust division and the FTC withdrew two “safe harbors” that had allowed hospitals to share wage data. It turns out plenty of corporations in fields far beyond healthcare were sharing data to help set wages and prices, as covered by Matt Stoller of the BIG Substack.
The FTC is most likely very interested in hearing from McKinsey, whose “main strength is that it’s trusted by corporate America to gather large troves of sensitive business data, and then share insights about that data,” Stoller says. McKinsey sets pay rates for “a significant portion of the oil industry,” Stoller notes. “Is that cross-firm price-setting legal? Probably not.”
Bottom Line: Onus Falls on the Seller
Many small- to mid-sized companies are taking a wait-and-see approach to AI. CFOs and controllers hope AI can take over the menial tasks finance staffers grind away on. A majority of CFOs say they don’t yet know enough about AI and want to learn. And a majority of workers flat-out mistrust AI because they fear it’s coming for their jobs.
A good piece of advice, from the B2B Rocket blog, is to put “clear rules and standards in place to avoid unfair pricing and [to] treat all customers fairly. Businesses must openly disclose how their AI algorithms work and provide explanations for the pricing decisions they make. Be cautious of biases in data or algorithms to avoid unfair sales tactics.”
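One way to act on that advice is a simple, recurring audit of what the pricing algorithm actually quotes: compare average offers across customer groups and flag big gaps for human review. The sketch below is a hypothetical illustration using pandas; the column names, segments and 5% tolerance are assumptions, not an established standard.

```python
# Hypothetical pricing audit: flag groups whose average quote drifts from the overall mean.
import pandas as pd

# Assumed log format: one row per quote, with the price offered and a customer segment label.
quotes = pd.DataFrame({
    "segment": ["A", "A", "B", "B", "C", "C"],
    "quoted_price": [99.0, 101.0, 108.0, 112.0, 100.0, 98.0],
})

TOLERANCE = 0.05  # flag any segment more than 5% above or below the overall average (arbitrary threshold)

overall_avg = quotes["quoted_price"].mean()
by_segment = quotes.groupby("segment")["quoted_price"].mean()

for segment, avg_price in by_segment.items():
    gap = (avg_price - overall_avg) / overall_avg
    if abs(gap) > TOLERANCE:
        print(f"Review segment {segment}: average quote {avg_price:.2f} is {gap:+.1%} vs. overall {overall_avg:.2f}")
```

A report like this won’t prove fairness on its own, but it gives a seller something concrete to disclose and defend if customers or regulators ask how prices were set.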
Bottom line: Any company that can’t stand behind its pricing practices should be wary of leaning too heavily on AI. The potential damage to customer relationships and the liability risks aren’t worth it.