Impact case study database
Improving Fairness and Equality in the Use of Police Intelligence Records
1. Summary of the impact
To protect the public’s civil liberties, police intelligence analysis and information management processes must be well regulated and operated fairly and lawfully. Research at Sheffield Hallam University has influenced policy making relating to police intelligence regulation, criminal records sharing and predictive data analytics. The research led to the creation of a self-regulation framework on data analytics for policing in the UK - ALGO-CARE - which has guided police data projects addressing serious crime and provided better protection for the privacy rights of hundreds of thousands of people in the UK. Sheffield Hallam research on criminal records also influenced the rejection of a domestic violence disclosure scheme (DVDS) in Australia, and led to commitments by the UK government to reform its DVDS, known as 'Clare's Law'.
2. Underpinning research
The sharing and analysis of criminal records are methods used by police to protect victims and pursue offenders in crimes such as sexual offences, domestic abuse, modern slavery and serious violence (including firearms offences). These methods include the profiling of individuals based on previous records, the sharing of police records with the public - for example in sex offence cases or in relation to domestic abuse - and the use of predictive intelligence analysis software to identify likely offenders based on past behaviour. Without regulation and guidance, however, misuse of such methods can result in ‘algorithmic injustice’.
In 2015, Grace, a privacy law and criminal records specialist, examined the legalities of disclosures of criminal records information under the UK government’s Domestic Violence Disclosure Scheme (DVDS) (R1), finding inadequacies in the operation of the scheme which could precipitate legal challenges. This research led to wider collaborations with academics from the universities of Cambridge, Essex, Kent and Winchester, and with policing professionals.
Bringing together expertise from law, policing and quantitative criminology, this body of research identified inadequacies in the proportionality and fairness of the retention of police intelligence on protestors and demonstrators (R2), and in policy guidance and supporting statutory frameworks for public protection information sharing (R1, R4, R6). It illustrated how algorithm-based decision-making by the police can undermine rights protected by the European Convention on Human Rights, namely the right to a fair trial, the right to respect for private and family life, and the prohibition of discrimination (R3, R5). In addition, the research demonstrated the need for regulatory and statutory reform of the ways in which public protection-oriented disclosures of criminal records are made in the UK (R4, R6).
Findings indicated that police forces required support to ensure their methods of intelligence analysis were fit for purpose, particularly as the use of such policing tools was increasing in an ad-hoc manner, with no appropriate regulation in place. In 2016, Grace collaborated on a study of policing machine-learning analytics capabilities in the UK, using a freedom-of-information approach (R3). Following this, Grace, in collaboration with researchers at the universities of Cambridge and Winchester, and Durham Constabulary, undertook the first detailed legal examination of a police machine-learning tool in the UK (R5). The ‘Harm Assessment Risk Tool’ (HART) was one of the first algorithmic models to be deployed by a UK police force (Durham Constabulary) in an operational capacity. By adopting a multi-disciplinary approach that combined detailed legal analysis with data science, the team evaluated the legalities of HART, which separates offenders into three risk categories, with a view to enabling targeted interventions and reducing future harm and recidivism. Using HART as a case study, the research identified a number of legal and ethical risks, which could affect any machine learning-based decision-making tool in UK policing, including risks relating to bias, discrimination and individual rights (R5).
In response to the HART study’s findings, Grace and colleagues developed the self-regulation framework - ALGO-CARE - for police forces that use machine learning in their approaches to intelligence analysis. ALGO-CARE (published in R5) is a checklist of key considerations in legal, ethical and data science best practice, to be used by police forces in their innovation and adoption of capabilities around data analytics and machine learning applications. ALGO-CARE requires police forces to use predictive analytics in an Advisory (rather than determinative) way, and in a way that is: Lawful, Granular, under clear lines of Ownership, Challengeable, Accurate, Responsible, and Explainable.
3. References to the research
R1. J. Grace (2015). Clare's Law, or the National Domestic Violence Disclosure Scheme: The Contested Legalities of Criminality Information Sharing. Journal of Criminal Law. 79(1) 36-45. https://doi.org/10.1177/0022018314564732
R2. J. Grace and M. Oswald (2016). ‘Being on Our Radar Does Not Necessarily Mean Being Under Our Microscope’: The Regulation and Retention of Police Intelligence. European Journal of Current Legal Issues. 22(1)
R3. M. Oswald and J. Grace (2016). Intelligence, Policing and the Use of Algorithmic Analysis: A Freedom of Information-Based Study. Journal of Information Rights, Policy and Practice. 1(1). https://doi.org/10.21039/irpandp.v1i1.16
R4. M. Duggan and J. Grace (2018). Assessing Vulnerabilities in the Domestic Violence Disclosure Scheme. Child and Family Law Quarterly. 30(2) 145-66.
R5. M. Oswald, J. Grace, S. Urwin and G. C. Barnes (2018). Algorithmic Risk Assessment Policing Models: Lessons from the Durham HART Model and ‘Experimental’ Proportionality. Information and Communications Technology Law. 27(2) 223-50. https://doi.org/10.1080/13600834.2018.1458455
R6. K. Hadjimatheou and J. Grace (2020). ‘No Black and White Answer About How Far We Can Go’: Police Decision Making Under the Domestic Violence Disclosure Scheme. Policing and Society. https://doi.org/10.1080/10439463.2020.1795169
All articles were rigorously peer-reviewed prior to publication in leading journals in the field.
4. Details of the impact
SHU’s research has been influential in the UK and internationally in two main areas. First, it has shaped policy, practice and debate relating to the development and operation of police machine-learning tools; second, it has informed decisions and recommendations regarding criminal records disclosures.
In 2018, the National Police Chiefs' Council (NPCC) endorsed ALGO-CARE as a model of best practice for self-regulating the development of algorithmic tools by UK police forces (E3). Subsequently, through a series of CPD events delivered by Grace, the ALGO-CARE framework was introduced to approximately 30 key UK police organisations, including the National Crime Agency, the College of Policing, the NPCC and regional police forces - chief amongst them West Midlands Police (WMP). WMP has a leading role amongst UK forces in developing a programme of policing data analytics, including the Home Office-funded National Data Analytics Solution (NDAS), which aims to become a centralised advanced analytics capability for UK policing. ALGO-CARE was built into the project initiation process for NDAS and has been used to provide ethical oversight of data analytics projects, including ones which have sought to identify risk factors around vulnerability to modern slavery and the perpetration of knife crime (E1). In the summer of 2019, WMP began to incorporate the ALGO-CARE framework more widely into oversight processes in relation to new intelligence analysis tools developed by its Data Analytics Lab (E1).
Grace’s appointment in June 2019 as Vice-Chair of the independent Data Analytics Ethics Committee, established by the West Midlands Police and Crime Commissioner (WMPCC), has been instrumental in ensuring that the ALGO-CARE framework informs both WMP and NDAS’s adoption and development of algorithmic analysis tools. In this way, Sheffield Hallam’s research guided the committee’s work on the Integrated Offender Management (IOM) tool proposed by WMP (E1). The IOM tool would have drawn on the data of more than 400,000 offenders in the West Midlands, in order to profile and introduce interventions for 8,000 'high harm offenders'. By employing the ALGO-CARE framework, the ethics committee was able to demonstrate that there was insufficient clarity on issues of legality, specifically with reference to older information, in relation to the data that the tool drew upon. Consequently, piloting was delayed until a clarification of the ethics of the tool and its safeguards could be produced by WMP. The lead data scientist for the WMP Data Analytics Lab (DAL) has explained that "Jamie’s legal knowledge has also helped the DAL… in terms of providing the ALGO-CARE framework when projects are taken to the Committee”. He explained that the Committee, through Grace's input, "raised questions regarding the age of the data that could be used for the IOM model and, as a result, when we were in a position to have the relevant business and coding logic in place, the [Management of Police Information] rules were applied to the data, which had the effect of stripping a large number of individuals from the analysis” (E3). In this way, Grace's research has been used to protect personal data from unnecessary processing and to enhance human rights protections relating to both privacy and public safety.
The ALGO-CARE framework has therefore shaped the dialogue between the ethics committee, supported by the WMPCC, and the NDAS, supported by the Home Office. NDAS abandoned a pilot of an algorithmic tool designed to predict the risk of individual offenders committing 'most serious violence' (using guns or knives) in both West Yorkshire and the West Midlands. ALGO-CARE requirements around accuracy helped NDAS recognise that the reliability of the tool was very much in doubt, and ensured that NDAS foregrounded this shortcoming in a transparent and accountable way. Central to this aspect of the data ethics committee's work is Grace's embedding of the ALGO-CARE framework in police practices, which, as West Midlands Police and Crime Commissioner David Jamieson has noted, is ensuring "that ethics and human rights goes to the heart of West Midlands Police's innovative use of technology" (E3). As the strategic adviser to the WMPCC has highlighted, through both developing and helping to implement the ALGO-CARE framework, Grace "is using his scholarly expertise to meaningfully influence the national conversation on various controversial areas of technology in policing" (E3). In terms of this national conversation, while NDAS projects informed by the ALGO-CARE framework have a national scope in terms of their development, Essex Police have also utilised the ALGO-CARE framework in setting up the oversight processes for their data analytics partnership with Essex County Council. Police forces in County Durham, North Wales, West Yorkshire, Wiltshire, Lancashire, Avon & Somerset and Kent (bringing the total to nine forces) have also built the ALGO-CARE model into their machine learning/artificial intelligence projects. The NPCC national lead for data analytics has described Grace's role in establishing ALGO-CARE as an advisory standard for the UK police service as "invaluable" in the context of an "absence of bespoke legislation or regulation" (E3).
As senior representatives of the NPCC have explained, it "speaks volumes about Jamie and colleagues’ determination and effectiveness that… ALGO-CARE remains the only product to have been endorsed and disseminated by the NPCC Business Change Council" (E3). Furthermore, the Head of Criminal Justice at Durham Constabulary has promoted ALGO-CARE as a regulatory model to police forces in "Europe, Abu Dhabi, Dubai, US, Canada, New Zealand, and Australia", and in this way, research by Grace has "significantly influenced the thinking around such models and contributed to the debate nationally and internationally" (E3).
As a national regulator for data protection issues, including in the law enforcement landscape, the Information Commissioner's Office (ICO) has benefited from the adoption of ALGO-CARE by a growing number of UK police forces. The ICO has since developed a regulatory 'toolkit' for law enforcement bodies to use in self-auditing compliance with the Data Protection Act 2018. The toolkit features ALGO-CARE as part of its package of advice materials for police forces (E2). The manager for policy projects at the ICO has explained that, taken together, this toolkit and "ALGO-CARE will be complementary resources for police forces to use in order to demonstrate good practice in their application of data protection law”. She added that the ICO found that ALGO-CARE “supports data controllers in understanding the rigour of the law and their role in demonstrating accountability… borne out by police forces when we spoke to them about how it was being used in practice” (E3).
Parliamentary select committees have been informed by Grace’s research when searching for regulatory models relating to the use of machine learning in the public sector. A Lords Select Committee on Artificial Intelligence report (2018) praised Durham Constabulary for going to “considerable lengths” in helping to develop ALGO-CARE, and so ensuring that their use of algorithms in policing is "open, fair and ethical" (E8). In response to Grace’s written evidence to a 2018 inquiry on algorithmic decision-making in the UK public sector, the House of Commons Science and Technology Committee recommended that "the Government should produce, publish and maintain a list of where algorithms with significant impacts are being used" (E7).
Grace also contributed written evidence, cited in the 2020 report of the Committee on Standards in Public Life (CSPL) (advising the Cabinet Office), on issues of bias and algorithms in the public sector. The CSPL, citing Grace’s evidence, noted that on "a case-by-case basis, public sector organisations will need to justify why they are using an algorithm; consider whether the potential impact on individuals is necessary and proportionate; and demonstrate how the tool will improve the current system" (E10). The CSPL report then recommended that algorithmic tools used in public sector bodies should be assessed for risks to public standards, both prior to and after implementation. Additionally, Grace's research on equality law, which underpinned his research on criminal records, informed the House of Lords Select Committee on the Equality Act 2010 and Disability report on the nature and impact of the public sector equality duty (PSED) (E6). The report recommended a strengthening of the PSED in the Equality Act 2010, which Grace had shown applied in an integral way to the regulation of the use of police-held data.
Grace drew upon his research in an advisory role for the Law Commission for England and Wales’ 2017 report on Criminal Records Disclosure (E9). Grace had advised that a wider review should be undertaken of the 'filtering' rules, which determine whether minor or older criminal records can be excluded from disclosure on Enhanced Criminal Record Certificates (ECRCs). In line with Grace's assessment, the Law Commission report recommended that the UK government should work on reforming wider policy on ECRCs.
Grace’s research on the Domestic Violence Disclosure Scheme (DVDS) was drawn upon by the Queensland Law Reform Commission (QLRC) in October 2017, when it responded to calls for a 'Clare's Law'-model policy in its jurisdiction. These types of disclosure schemes present a risk to the rights to rehabilitation of former domestic abusers, and there is little evidence that disclosures can be made in such a way that the voice of the rehabilitated offender can be appropriately taken into account. Sheffield Hallam’s research, which was detailed in submissions to the QLRC by both the University of Queensland Pro Bono Centre and the Queensland Centre for Domestic and Family Violence Research, highlighted these privacy law issues with 'Clare's Law' as it was already being operated in England and Wales. The University of Queensland Pro Bono Centre argued, citing research by Grace, that the "DVDS directly interfere with both the perpetrator’s right to control their personal information and right to form relationships". Additionally, the QLRC report noted that "the Queensland Centre for Domestic and Family Violence Research observed that the DVDS in England and Wales has been criticised for compromising the perpetrator’s rights to be consulted before, or informed of, a disclosure: referring to Grace (2015)" (E4). The QLRC subsequently rejected a DVDS policy.
In the UK, in June 2020, Jess Phillips MP drew on Sheffield Hallam research when drafting an amendment to the Domestic Abuse Bill (2020) during its House of Commons scrutiny committee stage (E3), with the aim of securing greater transparency and clarity over the operation of the DVDS by police across England and Wales. Grace provided the text of amendments, which were moved verbatim by Jess Phillips MP. As a result of the debate over Grace's amendment, government safeguarding minister Victoria Atkins MP committed to exploring reforms to guidance on the 'pressing need’ test for disclosures under the Scheme, and to placing greater emphasis on the use of DVDS disclosures to protect children in households at risk from abusers (E5).
5. Sources to corroborate the impact
E1. WMPCC Independent Data Analytics Ethics Committee minutes (April 2019-December 2020)
E2. ICO toolkit to assist police agencies using analytics
E3. Correspondence with NPCC representatives and compiled testimonials
E4. Queensland Law Reform Commission (2017) - Domestic Violence Disclosure Scheme, Report No 75
E5. Public Bill Committee, Domestic Abuse Bill, 8th sitting minutes, PBC (Bill 96) 2019-2021
E6. House of Lords Select Committee on the Equality Act 2010 and Disability (2016) – The Equality Act 2010: The Impact on Disabled People
E7. Written evidence ALG0003 submitted to the relevant report: House of Commons Science and Technology Committee (2018) - Algorithms in Decision-Making
E8. Written evidence AIC0068 submitted to the relevant report: House of Lords Select Committee on Artificial Intelligence (2018) - AI in the UK: Ready, Willing and Able?
E9. Law Commission (2017) - Criminal Records Disclosure: Non-Filterable Offences
E10. Report by the Committee on Standards in Public Life (Cabinet Office, 2020) – Artificial Intelligence and Public Standards