
No, Pricing Bots Won’t Collude Against Customers

An article in the Economist warns that “a cabal of AI-enhanced price-bots might plausibly hatch a method of colluding that even their handlers could not understand, let alone be held responsible for.” The article describes a case on Martha’s Vineyard, where a lawsuit was filed against four of the island’s gas stations, alleging price fixing. While the judges found no evidence of a conspiracy, they noted that the market was conducive to “tacit collusion.”

The article posits scenarios in which algorithmic pricing could lead to collusion and points to a paper in support of the idea. The paper cites, as examples, three cases of non-US gasoline retailers where the use of online technology to speed the transmission of pricing information led to tacit collusion.

Having built an algorithmic pricing and optimization system for a US gasoline retailer, we do not see a scenario where the bots take over and collusion reigns.

First, while people like Elon Musk warn that general artificial intelligence could eventually wake up and endanger the public, it is hard to see a world where, as the Economist puts it, “handlers could not understand” how their algorithms are working. Yes, such algorithms have a self-learning aspect, but they are designed and controlled by humans, and they typically learn a single, very simple task and learn it well. Human handlers can understand how the bot accomplishes that task.

And they are given constraints: bounds within which they can work. They don’t simply evolve a mysterious personality and begin eliminating those standing in the way of their mission, as the HAL computer did. Properly designed process automation systems, such as pricing bots, should always have a mathematical optimization component built in, which provides safeguards through boundaries and constraints.
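To make the point concrete, here is a minimal, hypothetical sketch of such a guardrail; the function, bounds, and numbers are illustrative, not a description of any production system. A learned price recommendation is clamped to operator-set limits before it is ever published:

```python
# Hypothetical guardrail sketch; names and numbers are illustrative only.
def apply_guardrails(recommended_price, current_price, floor, ceiling, max_step):
    """Clamp a model's recommended price to operator-set constraints."""
    # Limit how far the price may move in a single update.
    bounded = max(current_price - max_step,
                  min(current_price + max_step, recommended_price))
    # Enforce the absolute floor and ceiling set by the business.
    return max(floor, min(ceiling, bounded))

# The model suggests an aggressive jump; the guardrails rein it in.
print(apply_guardrails(recommended_price=3.89, current_price=2.99,
                       floor=2.50, ceiling=3.25, max_step=0.10))  # prints 3.09
```

However the learning component evolves, its recommendation never reaches the pump without passing through limits that a human wrote and can audit.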

Second, we don’t see a scenario where “the handlers” would NOT be held responsible for their algorithms. The Sherman Antitrust Act and the laws derived from it are still in force. Pricing bots can be made to follow pricing policies that are monitored and managed by the people who are ultimately held accountable for their actions.
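One way to make that accountability concrete is an audit trail. The sketch below is hypothetical, with illustrative field names: every published price is logged along with the inputs that produced it and the person who owns the pricing policy.

```python
# Hypothetical audit-trail sketch; field names and values are illustrative.
import json
from datetime import datetime, timezone

def log_price_decision(site_id, published_price, policy_owner, inputs,
                       path="price_audit.log"):
    """Append a pricing decision and its context to an audit log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "site_id": site_id,
        "published_price": published_price,
        "policy_owner": policy_owner,  # the accountable human
        "inputs": inputs,              # costs, bounds, competitor data used
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

log_price_decision("site-042", 3.09, "regional_pricing_manager",
                   {"cost": 2.71, "floor": 2.50, "ceiling": 3.25})
```

A record like this gives managers and regulators something a person can be held to, which is the opposite of an unaccountable black box.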

To be sure, as machine learning systems play a larger role in augmenting human decision-making, it is prudent to stay vigilant about issues like these. But both hype (in the form of rosy expectations) and hysteria need to be kept in check.
