Uber is facing new scrutiny in the United Kingdom after a collective legal action was filed against the company over its use of an AI-driven pay and management system. The case, brought by Worker Info Exchange, alleges that Uber’s automated decision-making tools, which manage driver pay, performance, and access to work, are unfair and insufficiently transparent. The group claims the systems may violate UK data protection laws by failing to clearly inform drivers how automated assessments affect their earnings and work opportunities.
The action focuses on Uber’s use of dynamic pricing, automated performance tracking, and AI-powered risk-assessment systems that evaluate driver reliability and potential fraud. Worker Info Exchange argues that these tools amount to a form of algorithmic management, making decisions capable of significantly affecting a worker’s livelihood. The group says drivers should have the right to explanations of how the algorithms operate and how decisions affecting their earnings are made.
The legal challenge comes at a time when regulators in Europe and elsewhere are examining the growing role of artificial intelligence in employment and compensation. Over the past few years, gig workers have increasingly raised concerns that automated systems used by ride-hailing and delivery companies have altered pay structures, determined access to jobs, and enforced disciplinary actions without adequate human oversight.
Uber has maintained that its AI systems are designed to ensure fairness, safety, and efficiency for both drivers and riders. In earlier statements addressing similar concerns, the company has said that dynamic pricing helps balance supply and demand, while automated fraud detection is necessary to protect both users and the platform. Uber has also said that drivers retain full access to their performance records and can appeal certain decisions made by the system.
According to Worker Info Exchange, however, the current level of transparency is insufficient. The group alleges that Uber’s pay algorithm can produce unpredictable earnings because the system adjusts compensation based on demand patterns, regional trends, customer behavior, and broader market data. Drivers involved in the action say they often do not understand why earnings fluctuate, how trip pricing is decided, or why some trips appear to offer lower returns than similar ones.
The organisation also cites examples from drivers who claim they were locked out of the app or restricted from receiving certain types of trips after the system flagged irregular activity. They argue that automated decisions of this kind can have immediate and serious consequences, particularly for gig workers who rely on the platform as their primary source of income.
Legal experts note that the case may test the boundaries of the United Kingdom’s data protection laws in the context of AI-driven employment platforms. Under existing rules, individuals have the right to request meaningful information about automated decisions that affect them. If the court finds that Uber’s systems qualify as automated decision-making with significant impact, the company may be required to provide additional disclosures about how its algorithms function.
The action also comes as several countries are considering new regulations on the use of AI in employment. The European Union’s AI Act includes specific provisions for high-risk AI systems used in hiring and worker management. Although the UK is not part of the EU, lawmakers there have been debating similar guardrails that would require companies to document algorithmic systems and allow independent audits.
Industry analysts say the case reflects a broader shift in how gig economy workers are responding to rapid technological change. As platforms introduce more automation to manage pricing, safety, and performance, workers are seeking clearer guidelines on how those systems operate. Some analysts believe companies may need to redesign communication practices or provide greater human involvement in decisions related to pay and account status.
Uber has not provided detailed public comments on the case but has stated in the past that it complies with all regulatory requirements related to data transparency and worker rights. The company says that automated tools help maintain platform reliability and that drivers have the ability to raise concerns or challenge decisions through existing support channels.
The outcome of this legal action could have implications beyond Uber. Other gig platforms rely on similar AI-powered systems for pricing, performance tracking, and fraud detection. A court ruling that expands transparency requirements could prompt broader changes across the gig economy, particularly in the United Kingdom and Europe.
For now, the case is expected to proceed through the London courts, where representatives of Worker Info Exchange will outline their claims regarding algorithmic pay management and data governance. The organisation says it plans to continue advocating for clearer rights for workers affected by automated technologies.