Apparent profiling of benefit claimants sparks privacy concerns

Viewpoints
July 30, 2021
3 minutes

The use of certain algorithms by councils to profile benefit claimants in the UK has sparked concerns amongst privacy activists.  The civil liberties campaign group Big Brother Watch has highlighted fears that such algorithms (apparently used to predict fraud and rent arrears) "treat the poor with suspicion and prejudice".

Big Brother Watch obtained data regarding the use of automated tools from just over 300 local authorities and council-owned housing associations in the UK, approximately 25% of which confirmed that they had used algorithms to assess benefit claims during the last few years (although such use appears to be decreasing).

Big Brother Watch has raised concerns over the transparency of the use of such algorithms, stressing that they can attribute unjustified weight to unimportant information, and that defective algorithms could lead to discrimination, bias and the infringement of applicable data protection requirements.

Big Brother Watch believes that such algorithms have been used to: process personal information relating to 1.6 million people in social housing in order to forecast rent arrears; covertly attribute fraud-risk scores to over half a million individuals before they can access council tax support or housing benefit; and calculate how likely over a quarter of a million people are to experience joblessness, abuse or homelessness.

Individuals who are categorised as medium or high risk are reportedly asked additional questions during consideration of their benefits claims, with reports suggesting that some individuals are unaware of how their personal data is being processed or of the risk-scoring involved.

Certain information about hundreds of thousands of housing-benefit claimants allocated the highest fraud-risk scores is then sent to the Department for Work and Pensions for further computerised scrutiny.
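
To make the reported process more concrete, the following is a minimal, purely illustrative sketch in Python of how score-based triage of this kind could work.  The thresholds, field names and triage function are assumptions made for illustration only; the actual models, scores and cut-offs used by councils have not been published.

    from dataclasses import dataclass

    # Hypothetical thresholds: the real scoring models and cut-offs used
    # by councils are not public, so these values are purely illustrative.
    MEDIUM_RISK_THRESHOLD = 0.5
    HIGH_RISK_THRESHOLD = 0.8

    @dataclass
    class Claim:
        claimant_id: str
        fraud_risk_score: float  # assumed to be a 0.0-1.0 model output

    def triage(claim: Claim) -> str:
        """Map an illustrative risk score onto a risk band."""
        if claim.fraud_risk_score >= HIGH_RISK_THRESHOLD:
            return "high"    # reportedly referred onwards for further scrutiny
        if claim.fraud_risk_score >= MEDIUM_RISK_THRESHOLD:
            return "medium"  # reportedly triggers additional questions
        return "low"

    claims = [Claim("A1", 0.92), Claim("B2", 0.61), Claim("C3", 0.12)]
    for claim in claims:
        band = triage(claim)
        if band in ("medium", "high"):
            print(f"{claim.claimant_id}: {band} risk - additional questions")
        if band == "high":
            print(f"{claim.claimant_id}: flagged for onward referral")

The point of the sketch is that a single numeric score, produced without human involvement, directly changes how a claimant is treated, which is what engages the UK GDPR provisions discussed below.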

These concerns highlight the data protection issues that can arise in respect of the use of individuals' personal data in connection with automated decision making and profiling.  The UK GDPR includes provisions regarding both automated decision making (i.e. decision making without human involvement) and profiling (the automated processing of personal data to assess certain aspects of an individual).  Controllers should consider their obligations around their lawful basis for processing, transparency, individuals' rights regarding their personal information and data minimisation in the context of automated decision-making and profiling.

The UK GDPR also includes additional protections for individuals where a controller is carrying out solely automated decision-making that has legal or similarly significant effects for individuals.  This type of decision-making can only be conducted if the decision is:

  1. Necessary for the entry into or performance of a contract.
  2. Authorised by domestic law which applies to the controller.
  3. Based on the data subject's explicit consent.

In addition, where special category personal data is used, the processing must be based on the data subject's explicit consent or be necessary for reasons of substantial public interest.

Controllers engaged in solely automated decision making that has legal or similarly significant effects for individuals must provide data subjects with information about the processing, implement simple ways for individuals to challenge decisions or request human intervention, and regularly check that their systems are operating as intended.  They should also carry out data protection impact assessments to identify and mitigate the risks involved in such processing.

In this case, the Local Government Association (which represents English councils) has noted that "... data is only ever used to inform decisions and not to make decisions for councils" and COSLA (which represents Scottish councils) has observed that "Councils are aware of and comply with data confidentiality law". 

Nevertheless, Big Brother Watch is calling on the UK Information Commissioner's Office (ICO) to ensure increased transparency and regulation in this area, including a public register of algorithms used for public-sector decision making and an obligation on authorities to carry out privacy and equality assessments before using such algorithms.  Benefit recipients are also being encouraged to request their risk scores.  It will be interesting to see whether any action is taken and/or guidance issued in respect of these concerns.