Government failure to manage AI risks exposes UK to ‘serious harm’

20 January 2026

UK consumers and the financial system are being exposed to potentially serious harm by the Government’s failure to manage the risks of artificial intelligence in the financial services sector, the Treasury Select Committee has warned.

A new report by the Committee criticised the Financial Conduct Authority, the Bank of England and the Treasury for taking a ‘wait and see’ approach to the use of AI.

According to evidence received by the Committee, more than 75% of UK financial services firms are now using AI, with the largest take-up among insurers and international banks. It is used in a variety of ways, including to automate administrative functions and to deliver core services such as processing insurance claims and credit assessments.

While MPs on the Committee acknowledge that AI and wider technological developments could bring considerable benefits to consumers, they stressed that more action is needed to ensure the technology is adopted safely.

The UK does not currently have AI-specific legislation or AI-specific financial regulation. However, the report outlined a number of risks to financial consumers, including a lack of transparency within AI-driven decision making in credit and insurance and the potential for AI search engines to mislead or misinform consumers. The report also found that the use of AI may lead to an increase in fraud.

AI also poses significant risks to financial stability, including heightened cyber security vulnerabilities. The report found that UK financial services firms are overly reliant on a small number of US technology firms for AI and cloud services, threatening the sector’s operational resilience.

To counter these risks, the Committee is recommending that the Bank of England and the FCA conduct AI-specific stress-testing to boost businesses’ readiness for any future AI-driven market shock.

It is also recommending that the FCA publish practical guidance on AI for firms by the end of this year. This should cover how consumer protection rules apply to firms’ use of AI and set out a clearer explanation of who within those organisations should be accountable for harm caused by AI.

The Critical Third Parties Regime was established to give the FCA and the Bank of England new powers of investigation and enforcement over non-financial firms which provide critical services to the UK financial services sector, including AI and cloud providers. Yet, despite the regime having been in place for more than a year, no organisations have yet been designated under it.

The Committee is urging the Government to designate AI and cloud providers deemed critical to the financial services sector in order to improve oversight and resilience.

Dame Meg Hillier, chair of the Treasury Select Committee, said firms are eager to gain an edge by embracing new technology, particularly in the financial services sector, which must compete on the global stage.

Hillier said: “The use of AI in the City has quickly become widespread and it is the responsibility of the Bank of England, the FCA and the Government to ensure the safety mechanisms within the system keep pace.

“Based on the evidence I’ve seen, I do not feel confident that our financial system is prepared if there was a major AI-related incident and that is worrying. I want to see our public financial institutions take a more proactive approach to protecting us against that risk.”

Phil Cotter, CEO of SmartSearch, said: “While AI can dramatically improve the detection of suspicious activity and reduce false positives, these benefits disappear if firms cannot explain decisions, evidence compliance to regulators or intervene when systems go wrong.

“A lack of clarity around accountability and oversight creates real risks for vulnerable consumers and opens the door to greater financial crime. Regulatory clarity, explainable AI and human oversight are essential if AI is to strengthen – rather than undermine – the UK’s defences against fraud and financial crime.”


Professional Paraplanner