In April, the Bank of England wrote to the Secretary of State for Science, Innovation and Technology and to the Economic Secretary to the Treasury and City Minister about the Bank's and the Prudential Regulation Authority's (PRA) work on delivering safe and responsible artificial intelligence (AI) and machine learning (ML) within their regulatory remit.
In this letter, the Bank of England points out: “AI/ML is already being widely adopted in many parts of the financial sector to improve firms’ operational efficiency, better detect fraud and money laundering, and enhance data and analytics capabilities.” So far, the Bank says it has been able to meet its statutory objectives while supporting the safe and responsible adoption of AI/ML in financial services, but will keep its approach under continuous review.
The letter says the Bank of England and the PRA adopt a “technology-agnostic approach to supervision and regulation of AI/ML. Our core principles, rules, and regulations therefore do not usually mandate or prohibit specific technologies.”
It goes on to emphasise that technology-agnostic does not mean technology-blind. “Risks may arise that relate to the use of specific technologies (such as AI/ML) and have an adverse impact on our statutory objectives, and we actively work to understand and address these.”
The Bank of England, PRA and Financial Conduct Authority (FCA) believe their regulatory framework has proved well equipped to capture regulated firms’ use of AI/ML, but the situation will be kept under review and engagement will continue.
The AI/ML question is a serious one
Jonathan Hall, an external member of the Bank of England's Financial Policy Committee (FPC), gave his personal views in a speech to Exeter University Business School. It was called ‘Monsters in the deep’ – perhaps the clue is in the title.
The FPC was set up in response to the 2008 financial crisis to reduce the risk of financial instability. “If we at the FPC have an enemy, it is the forces of amplification,” Hall says.
In his closing comments, he refers to two major themes: “One is the complexity and unpredictability of deep learning models. And the second is the possibility that collusive or destabilising behaviour could arise from an unconstrained profit maximising function.
“Taken together these create performance and regulatory risks for trading firms, and explain the current caution about using neural networks for trading applications. If trading algorithms engage in non-compliant, harmful behaviour then the trading manager will be held responsible.”
He says the adoption of deep trading algorithms might raise system-wide concerns: it could lead to a less resilient and highly correlated market ecosystem, or neural networks could learn the value of actively amplifying an external shock. “The FPC’s old enemy, the forces of amplification, could arise in this new form.”
This is why, he says, it makes sense for regulators, market participants and AI safety experts to work together to ensure that the behaviour of any future algorithms can be constrained and controlled.
A question of balance
Further insights can be gleaned from Randall S Kroszner’s presentation at City Week 2024. Kroszner is an external member of both the Bank of England’s FPC, which is charged with identifying, monitoring and mitigating systemic risks, and the Financial Market Infrastructure Committee (FMIC), which supervises financial market infrastructures.
He says: “Productivity growth is crucial to boosting real wage growth and sustaining economic growth, particularly when the number of hours worked in an economy may be declining as populations are ageing and growing more slowly (or are, in some countries, declining). Innovation is a fundamental driver of productivity growth, which is why it is valuable to have promotion of innovation incorporated into the FMIC’s objectives.”
Kroszner adds that AI may be the answer to some of these challenges – but it could involve fundamentally disruptive innovation and change that brings both enormous upsides and potential risks. The challenge, he says, is to develop a regulatory framework that fosters “the flowering of creativity and innovation”, but takes potential financial stability risks into account.
“I want to make sure FPC and FMIC as regulators and guardians of financial stability are properly equipped to deal with the challenges ahead – that means continuously and consciously deepening our understanding of the issues so we can take part fully in conversations about whether and how we should respond to developments.”