The App Drivers & Couriers Union (ADCU) is taking legal action against Uber in the Netherlands on behalf of a number of former Uber drivers. The drivers are challenging their dismissals, which they believe resulted from automated algorithmic decisions by Uber concerning alleged fraudulent activity, leading to the deactivation of their accounts.
This is a significant case that is likely to pressure-test the safeguards afforded to individuals by Article 22 of the European Union (EU) General Data Protection Regulation (GDPR), which gives data subjects certain rights in respect of automated individual decision-making, including profiling.
Article 22 of the GDPR provides that individuals have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them.
This does not apply if the decision: (i) is necessary for entering into or performing a contract between an individual and a data controller; (ii) is authorised by EU or EU member state law to which the data controller is subject, which also provides for suitable measures to safeguard individuals’ rights, freedoms and legitimate interests; or (iii) is based on the individual’s explicit consent.
Regarding points (i) and (iii), data controllers must implement suitable measures to safeguard individuals’ rights, freedoms and legitimate interests, including at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision.
It is reported that the ADCU has handled over 1,000 cases since 2018 in which drivers allege that they were wrongly accused of fraudulent activity and had their accounts terminated summarily, without any recourse. The ADCU notes that private hire operators in London are obliged to notify Transport for London when they dismiss drivers, but alleges that in some cases Uber has failed to explain to the drivers concerned what issues led to their termination.
Uber has reportedly confirmed to the BBC that drivers’ accounts were only deactivated after manual reviews by a specialist human team. It will therefore be interesting to see the extent (if any) to which this case is held to involve solely automated decision-making. Hopefully, useful guidance will emerge on the nature and extent of the safeguards that data controllers must implement when processing personal data in the context of automated decision-making and profiling in order to comply with Article 22 of the GDPR.
Former Uber drivers have accused the taxi app firm of using automated "robo-firing" algorithms to dismiss them. The British drivers want courts in the Netherlands, where Uber's data is based, to overrule the algorithm that they say caused them to be fired. Experts say the legal challenge is the first of its kind to test the protections of Article 22 of the GDPR.