Computer technology used by police in Bristol that predicts where crimes will be committed and who might be about to commit them is ‘supercharging racism’, according to a new report from Amnesty.

The international human rights body has called on Avon and Somerset Police, and the 32 other police forces across the country, to stop using predictive computer systems, which feed police data into algorithms and make predictions about future crime that inform policing decisions across Bristol. But Avon and Somerset Police insist the new technology is used in line with “legal and ethical guidelines”.

Amnesty said its 120-page report, entitled ‘Automated Racism – How police data and algorithms code discrimination into policing’, is the first of its kind in the country to look in depth at the growing use of these AI-based algorithm systems, the impact of them on people and communities, and whether any police forces have looked into how effective they are.

Amnesty said Avon and Somerset Police started using a number of profiling algorithms through its computer system called Qlik Sense as far back as 2016. The force has said in the past that there were around 300,000 people on its internal Offender Management App, and that as many as 170,000 people have been profiled and assigned a risk score in the past six years.

“This is a substantial number of people profiled to assess their so-called risk of committing crime in future,” said an Amnesty spokesperson. “The force has said that no formal evaluation reports have been conducted on any of its Qlik apps,” he added.

Amnesty said it has two main human rights concerns with predictive police systems of the kind used by Avon and Somerset Police. The first is that the systems factor in geographic areas where crimes are seen as likely to be committed, which Amnesty said ‘specifically targets racialised communities’.

The second concern Amnesty said it has with the use of algorithm-based predictive policing systems is that individuals are ‘placed in a secret database and profiled as someone at risk of committing certain crimes in the future’.

Amnesty said that, in effect, using this kind of system becomes a self-fulfilling prophecy. If an area of Bristol with a high proportion of Black people is predicted to be where crime will happen, it is then targeted by police, who find more crime there than in a different area with the same level of offending, where crime goes undetected because it is not targeted. The algorithm then takes that skewed data and the cycle is perpetuated.
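The feedback loop Amnesty describes can be sketched with a short, purely hypothetical simulation (this is not the force’s Qlik Sense model or Amnesty’s analysis; the two areas, the starting figures and the detection rate are invented for illustration). Both areas have identical underlying offending, patrols follow the previous year’s recorded crime, and recorded crime in turn depends on where the patrols go:

```python
# Purely illustrative toy model of the feedback loop described above.
# It is NOT the force's Qlik Sense system or Amnesty's methodology:
# the areas, starting figures and detection rate are all invented.

TRUE_OFFENDING = {"Area A": 100, "Area B": 100}  # identical real levels of crime
recorded = {"Area A": 60, "Area B": 40}          # a small historical skew in the records
TOTAL_PATROLS = 100
DETECTION_PER_PATROL = 0.01  # share of an area's offending detected per patrol

for year in range(1, 6):
    total_recorded = sum(recorded.values())
    # Patrols are allocated in proportion to last year's recorded crime (the "prediction").
    patrols = {a: TOTAL_PATROLS * recorded[a] / total_recorded for a in recorded}
    # Recorded crime depends on where the patrols go, not on any real difference in offending.
    recorded = {a: TRUE_OFFENDING[a] * min(1.0, DETECTION_PER_PATROL * patrols[a])
                for a in recorded}
    summary = ", ".join(f"{a}: {patrols[a]:.0f} patrols, {recorded[a]:.0f} offences recorded"
                        for a in recorded)
    print(f"Year {year}: {summary}")
```

Run over several years, the area that starts with more recorded crime keeps receiving more patrols and keeps producing more records, so the initial skew never corrects itself even though real offending in the two areas is identical.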

“Areas with high populations of Black and racialised people are repeatedly targeted by police and therefore crop up in those same police records,” an Amnesty spokesperson said. “Black people and racialised people are also repeatedly targeted and therefore over-represented in police intelligence, stop-and-search or other police records,” he added.

The 120-page report includes the experience of a man called David, from Bristol, who was stopped by police for putting a sticker on a lamppost after a gig in 2016 and later found he had been profiled by the police’s predictive system. He said he has been stopped and searched by police as many as 50 times in the years since.

Avon & Somerset Police Chief Constable Sarah Crew at the Police & Crime Commissioners Police Question Time on Thursday, September 12, 2024 (Image: Avon & Somerset PCC/Facebook)

He tried to find out more information about what his police profile said about him, or what his ‘score’ was, but the police would not tell him. He challenged it, and asked them to change it.

“I feel my overall experience with Avon and Somerset police is…very much a negative one,” he is quoted in the Amnesty report. “I think a lot of that is knowing what I went through in my youth. That caused me to feel very wary of them as an organisation.

“I find it very difficult to put any degree of faith or trust in them. And I’m sure some of that is going to be from past trauma that I’m carrying from them. I have therapy every week about some of the stuff that I’ve been through because of the police and how they’ve treated me over the past, say, three or four years. It’s scandalous, to be honest. They made me feel like I don’t have any rights at all,” he added.

The chief executive of Amnesty International UK, Sacha Deshmukh, said this technology ‘violates our fundamental rights’. “No matter our postcode or the colour of our skin, we all want our families and communities to live safely and thrive,” he said. “The use of predictive policing tools violates human rights.

“The evidence that this technology keeps us safe just isn’t there, the evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores. These technologies have consequences. The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.

“These tools to ‘predict crime’ harm us all by treating entire communities as potential criminals, making society more racist and unfair. The UK Government must prohibit the use of these technologies across England and Wales, as should the devolved governments in Scotland and Northern Ireland. Right now, they can demand transparency on how these systems are being used.

“People and communities subjected to these systems must have the right to know about them and have meaningful routes to challenge policing decisions made using them. These systems have been built with discriminatory data and only serve to supercharge racism,” he added.

Sacha Deshmukh, the chief executive of Amnesty International UK (Image: Handout)

Bristol Copwatch’s John Pegram said the algorithm-based predictive policing system was, in effect, racist profiling.

“It doesn’t matter if you offended 13 or 14 years ago for something, you’re known to us for this, and therefore we’re going to assign a score to you,” he said. “It’s risk scoring, it’s profiling, often racist profiling. It’s very assumptive, and it’s very biased, and it’s assuming that because you’ve done something wrong in life, you’re going to repeat the same mistakes again.

“You’re going to go back to where you were, say, 15 years ago, three years ago, two years, even six months ago. And what it doesn’t do is allow for the fact that people change and people rehabilitate,” he added.

A spokesperson for Avon and Somerset Police said: “As a police service we’re expected to use our finite resources efficiently to tackle crime and anti-social behaviour (ASB) and safeguard the most vulnerable, while facing increasing demand.

“We use data science to help with that, following legal and ethical guidelines and routinely monitoring for bias.

“Our Offender Management App only scores people who have been charged with a criminal offence within the past two years. If they’re not convicted, they are removed from the scoring process.

“No protected characteristics are included in the model, nor do we include addresses, intelligence, stop and search information or location data.

“The models do not predict whether someone will become a victim or commit offences. They help to calculate a potential risk which can support a more holistic professional human judgement when making an assessment or prioritising competing demands.

“Data science can provide supporting evidence in bids for extra resources and funding to tackle specific issues identified by communities.

“This could be targeted visible patrols to deter and disrupt crime – but will not always be enforcement activity.

“For example, along with the views of the community, local business and partners, data analysis helped to evidence the issue which led to the football programme undertaken by the Robins Foundation which is reducing ASB in Hartcliffe, Bristol.

“Avon and Somerset Police accepts that there is disproportionality within the criminal justice system which adversely affects minoritised communities.

“We acknowledge the effect of over-policing and under-protecting minoritised communities on trust and confidence in policing and are actively working to become an anti-racist police service, with help from people with lived experience of these issues.

“Our use of data science is an essential part of that in terms of understanding the extent and contributing factors for that disproportionality and also in developing solutions to tackle and reduce it together with local communities.”