Uber CEO admits pricing algorithm uses ‘behavioural patterns’

Ride-hailing app Uber has admitted to using workers’ “behavioural patterns” to determine their pay amid long-standing conflict with drivers over algorithmic transparency.  

The comment, made in a financial call by CEO Dara Khosrowshahi, has prompted renewed concern among drivers about how Uber uses their data to boost profits while pushing down the amount they receive for each fare.

Since its initial roll-out in Boston in 2012, Uber has been slowly expanding the use of its “dynamic pricing” algorithm to set variable pay and pricing levels. The company previously said the algorithm uses real-time data on market conditions such as time and distance, predicted route, estimated traffic, and the number of users requesting or providing services.
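As a very rough illustration of how inputs of this kind could feed into a fare, the sketch below combines only the input categories Uber has publicly named. The function shape, rates and weights are invented for illustration; Uber's actual model is undisclosed.

```python
# Toy sketch only: combines the input categories Uber has publicly named
# (time, distance, predicted route/traffic, and rider vs driver numbers).
# All rates and weights here are invented; Uber's real model is undisclosed.
def dynamic_fare(base, minutes, miles, traffic_factor, riders, drivers):
    per_minute, per_mile = 0.20, 1.10          # illustrative rates, not Uber's
    surge = max(1.0, riders / max(drivers, 1)) # more demand than supply -> higher price
    return (base + minutes * per_minute + miles * per_mile) * traffic_factor * surge

# e.g. a 20-minute, 8-mile trip in moderate traffic, with demand outstripping supply
print(round(dynamic_fare(2.50, 20, 8.0, 1.2, 120, 100), 2))  # -> 22.03
```

The point of the sketch is simply that every input is a market condition; what drivers dispute is whether personal data and profiles also enter the calculation.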

At the same time, its “upfront pricing” policy means the algorithm’s data inputs are completely hidden from riders and drivers, who are only shown a fixed fare for a trip.

During this time, drivers and their unions have accused the company of using its dynamic algorithm in combination with upfront pricing to gradually reduce their pay, arguing that more transparency is needed to protect them from the negative effects of algorithmic surveillance and automated decision-making.

In February 2023, for example, when dynamic pricing was initially rolled out in London, the App Drivers and Couriers Union (ADCU) said the algorithm would likely use personal data and profiles of drivers and passengers to make decisions, which could push down working conditions by targeting drivers based on their willingness and ability to accept lower fares.

Uber previously told Computer Weekly that it was “categorically false” to claim the company’s algorithm – which is currently limited to the US and select markets like London – uses either personal data or profiling to set fares. It added that it had worked closely with the GMB union (with which it signed an agreement in May 2021) to consult drivers on the algorithm and incorporate their feedback ahead of the London launch.

While Uber has never disclosed exactly what data its algorithm uses to set pay and prices, Khosrowshahi told investors on 7 February of plans to expand its use of drivers’ preferences and behavioural patterns to algorithmically target them with work they find amenable.

“Drivers are quite idiosyncratic in terms of their desire – there are some drivers who want long trips, some who want short trips, some who want to go to the airport, some who don’t want to go to the suburbs, etc,” he said during an earnings call.

“I think that what we can do better is targeting different trips to different drivers based on their preferences or based on behavioural patterns. That is really the focus going forward: offering the right trip, at the right price, to the right driver.”

Khosrowshahi added that Uber has gone from using flat time and distance metrics to determine fares, to now using “point estimates for every single trip based on the driver… We’re making these point estimates both in mobility and delivery. We’re doing it globally”.

He concluded that because Uber has more “point estimates” than anyone else, its artificial intelligence (AI) “algorithms are going to be able to learn more and are going to be more accurate than anyone else’s, which is an advantage that over a period of time is absolutely going to accrue to us”.

Khosrowshahi’s comments follow Uber reporting an operating profit of $1.1bn in 2023, compared with a $1.8bn loss in 2022, and a net income of $1.9bn, after losing $9.1bn the previous year.

Computer Weekly contacted Uber about the specific types of data used by the algorithm and whether it was still “categorically false” to claim the company’s algorithm uses personal data to help set fares, but received no response.

Demanding transparency

Drivers have long been pushing for greater algorithmic transparency so they can understand how data about them is being used, how their performance is managed, and on what basis work has either been allocated or withheld.

Commenting on Khosrowshahi’s admission that Uber’s algorithm uses driver preferences and behavioural patterns, driver and ADCU’s London vice-chairman, Zamir Dreni, told Computer Weekly it was “absolutely not surprising”.

He added that while there is no oversight of the information the algorithm uses to set pay, drivers “noticed straightaway” that their pay started to decrease once the algorithm and upfront pricing were introduced.

“When Uber went to the fixed pricing, that’s when we see dramatic changes in the profiling [of drivers],” he said, adding that a big part of the problem is the lack of transparency over what information the algorithm uses.

Giving the example of dealing with customer complaints, Dreni said he’d noticed that if he challenges Uber over issues raised by his passengers, he is assigned less work. “It’s held against the driver, always,” he said.

Similar sentiments were shared by Worker Info Exchange (WIE), a UK advocacy organisation for workers’ data rights, which said drivers saw an almost immediate deterioration in their pay after “upfront pricing” was introduced.

“After years of flat denials, Uber has finally admitted to the deployment of automated decision-making which discriminates between workers to personalise pay and task allocation,” it said in a blog post.  

“In WIE’s successful legal action against Uber at the Court of Appeal in Amsterdam, the court ruled that Uber, in breach of the GDPR [General Data Protection Regulation], has not been sufficiently transparent about how upfront pricing algorithms function given the obvious serious impact such AI decision-making has on workers.

“Despite today’s volte-face, we believe Uber has never met its legal obligation to inform workers [about how it uses their data]… Nor has Uber properly consulted workers about the risks of such systems, something they are required to do for the completion of regulatory required risk and impact analysis,” it wrote.

In one of the rulings from April 2023, the Amsterdam Court of Appeal found that Uber had “wrongly” rejected a driver’s subject access request for data it holds on them and how this information is used in the context of dynamic and upfront pricing.

It also noted that while it is reasonable for the company to withhold trade secrets about its algorithms, “what matters is that Uber at least explains on the basis of what factors and what weighting of those factors Uber arrives at the ride-sharing decisions, fare decisions and average ratings, respectively, and also provides [appellant sub 1] et al with other information necessary to understand the reasons for those decisions”.

According to Dreni, upfront pricing also divorces the driver from the rider, in the sense that they cannot see the same information about the journey and how it was calculated in the app.

Dreni noted an example where one of his passengers was charged £67 for a trip, while he was only paid £31, as the lack of transparency in the app meant it did not relay the same fare information to both parties, or any information about how those different amounts were determined. Dreni only realised the difference because the passenger told him in conversation.

“Drivers don’t see fares that customers accept, which allows Uber to take more of the commission,” he said, adding this also creates tension between customers and drivers because of, for example, different expectations about the quality of service.
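The arithmetic behind Dreni's example is simple to lay out. The sketch below only restates the two figures he cites; the “take” it computes is inferred from those numbers alone, not from any disclosed Uber commission formula.

```python
# Figures from the single trip Dreni describes; illustrative only.
# Uber does not disclose how rider fares and driver pay are related.
rider_fare = 67.00   # what the passenger was charged (GBP)
driver_pay = 31.00   # what the driver received (GBP)

platform_take = rider_fare - driver_pay
take_rate = platform_take / rider_fare
print(f"Platform kept £{platform_take:.2f}, {take_rate:.0%} of the fare")
```

On this one trip the effective take works out at roughly 54%, though a single fare says nothing about averages across trips.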

He added that another part of the problem is that upfront pricing means drivers are paid per trip, rather than for time spent or mileage covered. This makes drivers reluctant to take certain trips – for example, through busy parts of central London – because “it doesn’t make business sense” for them to journey through traffic-dense areas or make detours for passengers if they will not be paid for the extra travel time.

He said this creates further tension between passengers and drivers because many drivers are unwilling to do longer trips for the same upfront price they were shown for a shorter route, something that is often out of line with customer expectations due to their lack of knowledge of how the platform works.

Computer Weekly contacted Uber about every aspect of the story but received no response.

Algorithmic wealth transfer

A number of academics and analysts have also highlighted the negative effects of algorithmically distributed work by gig economy app companies, and how Uber’s practices in particular have enabled it to move from massive losses to profit in a relatively short timeframe.

According to transport analyst Hubert Horan, Uber began keeping a larger share of gross customer payments and giving a smaller share to drivers in early 2022, moving from retaining 21% of gross revenue in the third quarter of 2021 to 28% by the first quarter of 2023.

“This was not because Uber was providing an increasing portion of what customers valued. Uber simply figured out how to transfer over $1bn in revenue per quarter from drivers to Uber shareholders,” he wrote in August 2023 as Uber was approaching break-even.

He added that Uber’s “delinking” of passenger fares and driver compensation via upfront pricing was a major driver of this capital wealth transfer. “Prior to 2022, driver payments were a function of what passengers paid, with adjustments for incentive programmes and peak period demand,” he said. “Uber has developed algorithms for tailoring customer prices based on what they believe individual customers would be willing to pay and tailoring payments to individual drivers so they are as low as possible to get them to accept trips.”
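Horan's figures can be sanity-checked with back-of-the-envelope arithmetic. In the sketch below, the quarterly gross-bookings figure is an assumption of roughly the right order of magnitude for the period, not a number from the article.

```python
# Back-of-the-envelope check on Horan's claim; the gross-bookings figure
# is an assumed order of magnitude, not a reported number.
gross_bookings = 30e9    # assumed quarterly gross customer payments ($)
take_2021_q3 = 0.21      # share retained by Uber, per Horan
take_2023_q1 = 0.28

shifted = gross_bookings * (take_2023_q1 - take_2021_q3)
print(f"Revenue shifted from drivers per quarter: ~${shifted / 1e9:.1f}bn")
```

A seven-percentage-point shift in the take rate on bookings of this scale is comfortably consistent with Horan's “over $1bn in revenue per quarter”; the exact figure depends on which bookings the rate is applied to.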

Examining Uber and others’ dynamic pricing algorithms in a draft paper published in January 2023, San Francisco law professor Veena Dubal also criticised these algorithms’ “use of granular data to produce unpredictable, variable and personalised hourly pay”, claiming it amounted to “algorithmic wage discrimination”.

“As a labour management practice, algorithmic wage discrimination allows firms to personalise and differentiate wages for workers in ways unknown to them, paying them to behave in ways that the firm desires, perhaps as little as the system determines that they may be willing to accept,” she wrote.

“Given the information asymmetry between workers and the firm, companies can calculate the exact wage rates necessary to incentivise desired behaviours, while workers can only guess as to why they make what they do.”

She added that such practices enable companies deploying the algorithms to maximise profits while exerting a higher degree of control over workers.

Responding to Dubal’s arguments in January 2023, Uber told Motherboard: “It’s a good thing that Professor Dubal’s paper is still a draft, because its central premise about how Uber presents upfront fares to drivers is simply wrong.”

It added that it “does not tailor individual fares for individual drivers ‘as little as the system determines that they may be willing to accept’”, and that factors such as a driver’s race, ethnicity, acceptance rate, total earnings or prior trip history were not considered when calculating fares.
