British police forces have signed contracts with a controversial US tech giant to use AI-powered software that provides data about an individual’s race, sex life, health and political beliefs, it can be revealed.
An internal police memo obtained by The i Paper and Liberty Investigates confirms an intention to “nationally” apply the “Nectar” intelligence system, currently deployed as a pilot by the Bedfordshire force after being developed with Silicon Valley data analysis group Palantir Technologies.
The document obtained under freedom of information rules shows how the Palantir system is designed to bring together dozens of existing law enforcement databases into a single computing platform to draw up detailed profiles of suspects as well as collate information on victims of crime, witnesses and vulnerable individuals including children.
The 34-page briefing, which deals with data protection issues related to Nectar and Bedfordshire Police, makes clear the ambition by senior officers for the system to be used across policing, including in the fight against serious organised crime by regional units.
It states: “The primary goal is to help Bedfordshire… as well as the Eastern Region Serious Organised Crime Unit and eventually apply [Nectar] nationally. This will develop tools to better protect vulnerable people by preventing, detecting and investigating crime.”
The crime fighting tool is the latest manifestation of the Government’s push to harness artificial intelligence to improve the performance of cash-strapped public services – from the health service to defence – with the help of private sector players such as Palantir.
But the use of the technology to access vast amounts of police data, including sensitive personal information, has alarmed senior MPs and campaigners, with one former home secretary calling for Nectar to be scrutinised by Parliament.
The system, which is already in use by the Bedfordshire force with a similar system understood to be under development by Leicestershire Police, could be used to target “persons suspected of having committed or being about to commit a criminal offence”.
A Home Office source acknowledged that the “experience” gathered by forces using the Nectar pilot scheme would be used to assess the effectiveness of the system in comparison with “potential alternatives”, but underlined that no decision has yet been taken to deploy the Palantir system nationally.
The police memo states that Nectar will “require and be used to access” 11 different types of “special category information” held on an unspecified number of individuals. This information includes “race”, “political opinions”, “sex life”, “religion”, “philosophical beliefs”, “trade union membership” and “health”.
It is understood that as many as 80 separate data sources, ranging from traffic camera data to intelligence files, are available to be processed by the software.
Palantir and Bedfordshire Police have insisted the AI software will only access existing information held by law enforcement agencies and that no police data will be routinely accessible to non-police staff.
Co-founded by the US libertarian billionaire Peter Thiel, Palantir already has contracts with the NHS and Ministry of Defence. The company said its system did not amount to so-called “predictive policing” – the use of AI to try to identify individuals at risk of offending – and that its policy was not to offer any such capability. It also stated that it never permits its software to be used for any form of racial profiling, and there is no suggestion UK police forces intend to use the Nectar system for this purpose.
“A single source of truth”
Sensitive data such as details of a person’s sexual activity or philosophical beliefs can be gathered by police forces for a number of reasons, including seeking intelligence on potential offences or offenders. Such data can be gleaned from so-called human sources, including informants or witnesses, as well as from online platforms such as forums and dating websites, and from financial records. Much of this information can only be obtained with permission such as a court warrant and is then held securely on classified systems.
A police source said: “We live in an age where vast amounts of information are available about almost everybody. One of the big problems we face is dealing with that body of information in a way that is both lawful and allows us to detect and prevent crime. This is not about indiscriminate snooping – access to this sort of data is really carefully controlled. But at the same time we have to be equipped to do the job the public rightly expects us to.”
Senior officers believe that the AI software – billed as a “real-time data-sharing network” – will dramatically speed up the identification of suspects and criminal networks as well as offering an enhanced ability to crack down on domestic abusers and protect minors at risk of harm.
The data protection assessment from Bedfordshire Police, which has paid Palantir £1.4m for services in the last two years, describes Nectar as offering users “a single source of truth”.
However, senior MPs and privacy campaigners have expressed concern over what they say are its far-reaching implications for the ability of police to sift vast amounts of data about individuals and the way in which such information is used and safeguarded.
David Davis, the former Conservative home secretary, said he would like to see the new system and its legal underpinnings examined by MPs, adding that it raised “multiple concerns” about issues including data deletion and the risk that systems used to plot criminal networks end up flagging individuals unconnected with any wrongdoing.
He told The i Paper: “There is a real problem with technology being applied to policing without the necessary statutory underpinning and police simply appropriating the powers they want. There are lots and lots of reasons to be concerned by this [Nectar] software and it should be scrutinised by Parliament.”
Campaigners said the sheer breadth of information available to Nectar raised serious questions about the ability to safeguard privacy.
Who are Palantir Technologies?
If proof were needed of Palantir’s status as a key mover and shaker in Britain’s attempts to grapple with the AI revolution, it was to be found in the Prime Minister’s itinerary on his trip to Washington in February.
After finishing his crucial first meeting with President Trump in the Oval Office, Keir Starmer – accompanied by his national security adviser Jonathan Powell and Lord Mandelson, the UK’s ambassador to the US – drove to the offices of the Silicon Valley data analysis giant to take tea with the company’s chief executive, Alex Karp.
Starmer, who has made clear his enthusiasm for “rewiring” the British state by harnessing the benefits of artificial intelligence, was given a briefing on Palantir’s existing UK public sector contracts, including a £330m five-year deal to provide the NHS with a huge new data platform, alongside work with the Ministry of Defence and, increasingly, police forces.
The meeting was evidence of the remarkable momentum of an unconventional company which has been no stranger to controversy since it began in 2003 as the brainchild of Peter Thiel, a co-founder of PayPal and supporter of Donald Trump in 2016, who saw potential in the emerging field of analysing large databases.
The then start-up, named after the “seeing stone” in JRR Tolkien’s Lord of the Rings, was backed to the tune of $2m by the venture capital arm of the CIA and has its roots in working closely with the US defence and intelligence sectors. One of its early projects was helping the US military analyse data on the placing of roadside bombs in Afghanistan to predict where subsequent attacks might occur.
Joe Lonsdale, another of the company’s founders, has said its software has allowed counter-terrorism experts and special forces “to neutralise thousands of adversaries (including infamous ones) and prevent dozens of attacks on the United States”.
Since then, the company has rapidly expanded, gathering a snowballing roster of public and private sector clients for its AI-enhanced software tools, which has helped boost its value to some $320bn (£235bn) – making it worth more than Disney or Coca-Cola. Its annual rate of revenue growth currently stands at nearly 40 per cent.
It is a journey which has brought criticism, including protests at Palantir’s deal with the US Immigration and Customs Enforcement (ICE) agency to develop software used to track and identify undocumented migrants – a key focus for the Trump administration as it seeks to dramatically ramp up deportations.
The company has also faced controversy over its work with police forces in the United States. Civil rights groups have raised concerns that its data tools have been used by some police departments for so-called “predictive policing” to flag individuals or neighbourhoods where offences are more likely to happen. Palantir has said its company policy is not to support or allow the use of its software for predictive policing or racial profiling.
Nonetheless, the company has developed a reputation for being unapologetic about its work and world view. Thiel and his fellow executives have said they see the company’s purpose as helping to defend the West and Western values.
Speaking earlier this year, Karp said one of the founding aims of Palantir was to create an “impactful company that could power the West to its obvious innate superiority”.
Cahal Milmo
David Nolan, an expert in algorithmic surveillance at Amnesty International, said: “The establishment and provision of data-driven law enforcement raises severe human rights concerns. The development of a ‘real-time data-sharing network’ across UK law enforcement agencies, that creates a 360 profile of individuals using sensitive personal data, violates people’s right to privacy and establishes a system of indiscriminate mass surveillance.”
Labour MP Chi Onwurah, chair of the House of Commons technology select committee, said: “For the digital transformation of government to be successful, people must be able to have confidence in public sector technology. Improving the access and use of data can make public services more effective, but this must be accompanied by the appropriate safeguards and transparency.”
The Trade Union Congress said it was concerned at the ability of Nectar to integrate information on whether individuals were trade union members, pointing to previous examples of activists being “blacklisted” by employers with the help of intelligence provided by police.
TUC assistant general secretary Kate Bell told The i Paper: “There is a long history of trade unionists being targeted simply for defending members’ interests. It is vital that any processing of trade union information by police forces and others is done in accordance with data protection law.”
The internal police memo suggests that senior officers recognise the potential for their AI intelligence platform to cause disquiet, noting that “there may be scepticism or resistance from the public regarding data sharing and privacy concerns”. It adds that “any breaches [of data security] could have significant consequences”.
However, senior officers also argue that they are in a technological race with criminals who have already proved themselves adept at harnessing digital tools, including AI, to commit offences at industrial scale, from online fraud to grooming and blackmail.
It is claimed that by adopting tools such as Nectar, vast amounts of time dedicated to sifting evidence, organising information and performing administrative tasks can be saved, allowing officers to be more nimble in catching criminals and protecting vulnerable individuals.
Bedfordshire Police said it considered the privacy of personal information to be “absolutely paramount” and that Nectar had been designed with “robust security measures”. The force said the new system was an “explorative exercise” and that its experience so far suggested it could result in faster response times and “more successful resolutions of cases”.
A spokesperson said: “As the landscape of policing evolves, it’s imperative that we evolve with it, which means taking an innovative approach to systems and procedures to allow us to be more efficient.”
Palantir said that the Nectar system had identified dozens of additional children at risk of abuse within days of being put into operation and was also being used to enact Clare’s Law – the system giving women the right to know if their partner has an abusive past.
A spokesperson said: “We’re proud our software is helping police improve how they tackle crime, including domestic violence, which is a key part of the ‘Nectar’ pilot. It is supporting powerful results such as the identification of more than 120 young people potentially at risk of abuse or exploitation in the first eight days of its use for child protection.”
The company, which has built Nectar using its own data analysis software called Foundry, said it wanted to emphasise that its system did not provide any information not already held by police. The spokesperson added: “It simply organises that data in a way that enables faster, better decision making.”
The Home Office source said it would remain a matter for “operationally independent” forces to decide how to deploy AI systems. The National Police Chiefs Council did not respond to a request to comment.
Smartphone downloads and “association charts”: How Nectar works
The precise workings of Nectar remain under wraps for fear of giving criminals insights into police capabilities. But it is understood that the system creates a “dashboard” of data for officers working on a particular incident or case by simultaneously extracting information from multiple sources.
Depending on need, investigators are provided with a live summary of evidence such as location data or messages from a seized phone, number plates, 999 calls or intelligence files with the aim of identifying suspects, building a picture of criminal associates or alerting vulnerable individuals.
Tasks that would previously have taken days, such as crunching data from a mobile phone, take hours, and building “association charts” – a task previously associated with pieces of string linking together images on investigation room whiteboards – now happens 75 per cent quicker, it is claimed.
While sensitive personal data such as political beliefs or sexual orientation is available to the system, it is understood that a “use case”, or justification, must be provided for each search, and an audit trail tracking use is built into the system.
Those involved with the project insist it is already bringing tangible benefits in terms of solving or averting crimes.