Predictive policing technology used by UK police forces “violates human rights” and should be banned, according to Amnesty UK.
Its new report, Automated Racism – How police data and algorithms code discrimination into policing, looks at a number of forces, including Greater Manchester Police’s gang-profiling methods.
It stated that these “can be based on no evidence of offending” and that there is “disproportionate representation of Black and racialised people”.
Amnesty found that at least 33 police forces – including Greater Manchester Police – across the UK have used predictive profiling or risk prediction systems. Of these forces, 32 have used geographic crime prediction, profiling, or risk prediction tools, and 11 have used individual prediction, profiling, or risk prediction tools.
GMP’s Xcalibre anti-gun crime taskforce was set up in 2005, described as “the reactive police arm in order to go out and conduct enforcement around gang criminality and firearms discharges.”
However, Amnesty said that its definitions of “gang” and “active gang member” were “incredibly broad”. It added that the profiling had “disproportionately affected young people from Black and racialised communities, amounting to racial profiling.”
A 2016 study found that 89 per cent of people on Xcalibre’s database were from ‘Black and Minority Ethnic’ backgrounds. A response from Greater Manchester Police is below.
According to today’s report:
“The disproportionate representation of Black and racialised people on the ‘gang profiling’ XCalibre database is discriminatory and evidences the racial profiling that XCalibre conducts. This police tactic is also a clear infringement of these young people’s right to freedom of association.”
Amnesty said that the two main types of predictive policing systems used nationwide that raised human rights concerns were location-based prediction and profiling.
Location-based systems make predictions about the likelihood of crimes being committed in particular geographic areas in the future. “The systems in all locations specifically targeted racialised communities.” In the year ending March 2023 there were 24.5 stops and searches for every 1,000 Black people, 9.9 for every 1,000 people with mixed ethnicity, 8.5 for every 1,000 Asian people – and 5.9 for every 1,000 white people. 69 per cent of stop and searches in the UK led to no further action.
Profiling involves placing individuals “in a secret database” and marking them as at risk of committing certain crimes in the future.
It added that in areas such as Manchester, with large Black and racialised populations, people are repeatedly targeted by police and therefore over-represented in police intelligence, stop-and-search and other police records.
“No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive,” stated Sacha Deshmukh, Chief Executive at Amnesty International UK.
“The use of predictive policing tools violates human rights. The evidence that this technology keeps us safe just isn’t there, the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores.
“These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.
“These tools to ‘predict crime’ harm us all by treating entire communities as potential criminals, making society more racist and unfair.
“The UK Government must prohibit the use of these technologies across England and Wales as should the devolved governments in Scotland and Northern Ireland. Right now, they can demand transparency on how these systems are being used. People and communities subjected to these systems must have the right to know about them and have meaningful routes to challenge policing decisions made using them.
“These systems have been built with discriminatory data and only serve to supercharge racism.”
Zara Manoehoetoe, of Kids of Colour and the Northern Police Monitoring Project, added:
“The way in which these systems work is that you’re guilty until you can prove yourself innocent. Criminalisation is a justification for their existence. There is the presumption that people need to be surveilled and that they need to be policed.”
Amnesty is calling for a prohibition on predictive policing systems; transparency obligations on data-based and data-driven systems being used by authorities, including a publicly accessible register with details of systems used; and accountability obligations, including a right and a clear forum to challenge a predictive, profiling, or similar decision, or consequences flowing from such a decision.
A Greater Manchester Police spokesperson told Prolific North:
“Our priority is preventing crime and stopping people from coming to harm. We have a range of ways we proactively do this, including engaging with communities and working with partners and charities to divert people away from criminality.
“Proactive policing is a particularly vital part of how we’ve brought down violent crime which saw 1,600 fewer victims come to harm last year across GM. We focus our local and specialist resources in the areas where a combination of community information and recent reporting suggests there is risk of people being subjected to harm.
“The work of the XCalibur Task Force in the past two decades has seen us engage with areas of Manchester where violent crime has previously blighted communities. It does not work from any ‘gang profiling’ database and instead uses information from the community to inform the life-changing work it continues to achieve alongside local community groups and agencies.”
It added that:
- Xcalibur does not possess a gang database or matrix
- We have not used banning letters in the past two years and have no plans to return to this
- Any XTF attendance at music events is based on intelligence (usually threat and harm related). We do not request information based on ethnic backgrounds.