
US surveillance firm Palantir is to be granted access to a trove of highly sensitive UK financial regulation data, reports have said.
The Financial Conduct Authority (FCA) has awarded a contract to Peter Thiel’s AI firm to examine the watchdog’s internal intelligence data, the Guardian has reported.
It is understood the deal will allow Palantir access to the FCA’s data to help it tackle financial crimes including money laundering, fraud and insider trading.
The firm, co-founded by billionaire Donald Trump donor Thiel, has been appointed for a three-month trial by the FCA to analyse its “data lake”.
The deal could lead to a full procurement of an AI system by the authority, part of its drive to use digital intelligence to focus resources on rule-breaking among the 42,000 financial services firms it regulates, including major banks and crypto exchanges.
According to the Guardian, there was only one other competitor for the contract, which has not been named.
Miami-based Palantir already has more than £500 million in deals with UK public services, including the NHS, Ministry of Defence (MoD) and police forces in England.
The firm is expected to apply Foundry, its data analytics platform, to huge quantities of information held by the FCA, which will include case intelligence files that are marked highly sensitive, reports from lenders about proven and suspected frauds, information on what are described as “problem firms”, and data about the public including consumer complaints to the financial ombudsman.
This data trove will include recordings of phone calls, emails and swathes of social media posts.
The deal has raised concerns inside the FCA, an agency which aims to stop financial crimes that underpin the drugs trade and human trafficking.
A source told the Guardian: “Once Palantir understands how we detect money-laundering threats, how do we know that they are ethically reliable enough not to go to share that information?”
The Israeli military and US Immigration and Customs Enforcement (ICE) currently use Palantir’s technology, leading to concerns over its involvement in human rights violations.
In 2023 it signed a £330m deal with NHS England, and a £240m contract with the MoD in December 2025.
Professor Michael Levi, an expert in money laundering at Cardiff University, said that there was “serious under-exploitation” of data held by financial regulators, so AI could potentially be a valuable resource to tackle financial crimes.
He told the Guardian it was “a relevant question as to whether Palantir’s owners might tip off their friends about methodologies”.
Levi added: “What are the protocols agreed between the FCA and Palantir about the onward use of things that they have learned in that process?”
Under the terms of the contract, the FCA said Palantir would be a “data processor” rather than a “data controller”, and that it would only act on instruction from the regulator.
The US firm will have to destroy data after the contract is completed, and the FCA will retain exclusive control over the encryption keys for the most sensitive files.
The data will also be hosted and stored solely in the UK.
Christopher Houssemayne du Boulay, a partner and barrister at the law firm Hickman & Rose, told the Guardian that the FCA can compel firms to hand over vast quantities of data while it is carrying out an enforcement investigation.
“We could be talking about hundreds of whole email accounts and full financial records. Many innocent people will be caught up in that and the data may contain bank account details, email addresses, telephone numbers and other personal information,” said the specialist in defending complex financial crime cases.
“If you ingest that data and use it to train an AI system, there are very significant privacy concerns. There should be serious confidentiality requirements regarding what Palantir does with the data.”
The FCA said Palantir could not copy the data to train its AI products.
A spokesperson for the FCA said: “Effective use of technology is vital in the fight against financial crime and helps us identify risks to the consumers we serve and markets we oversee.
“We ran a competitive procurement process and have strict controls in place to ensure data is protected.”

