Why patchy regulation is a challenge for AI

Author: ICAEW Insights

Published: 20 Mar 2025

In a fascinating recent round-table event, ICAEW members explored a host of issues around how the UK regulatory landscape should best address companies’ use of AI.

What sort of framework does the UK need to drive safety in artificial intelligence (AI), while ensuring that businesses can innovate with the technology at a competitive pace?

That was a key question explored at a recent round-table discussion at Chartered Accountants’ Hall. Held in early March, the event brought together an impressive roster of ICAEW members, each of whom has witnessed the rise of AI first-hand. While some work directly in the accounting profession, others have taken their skills into enterprise.

Chaired by ICAEW President Malcolm Bacchus, the event welcomed as special guest Shadow Secretary of State for Science, Innovation and Technology Alan Mak MP, who committed to being “on listening mode” as the discussion unfolded.

The members’ thoughts on issues around regulatory design for AI will no doubt have given him much to consider.

Multiple layers

First to delve into the topic of regulation was Becky Shields, Partner and Head of the Data Analytics and AI team at Moore Kingston Smith. Shields pointed out that even as a mid-tier organisation, her firm faces multiple layers of oversight. While its primary work is regulated by the main accountancy bodies, it also has a legal services practice covered by the Solicitors Regulation Authority and financial advisers overseen by the Financial Conduct Authority.

As such, she said: “Our biggest concerns are regulatory conflict and duplication of effort. Thinking about our start-up clients, it’s quite hard now to be entirely domestic. Many of them will be dealing with multiple layers of regulation across different territories. If UK plc wants to reap the benefits of AI, regulation must not impede innovation.”

Similar concerns were expressed by Jo Muncaster, Head of Finance at AI-enablement specialists digiLab. Although the company’s platform is sector-agnostic, each industry it works with has its own regulators. “For them to assess whether our tools are compliant requires us to open up our tech stack to a level of transparency that makes it very hard to protect intellectual property,” she said. “Over-regulation of AI could make matters worse, and undermine competitive advantage.”

Glenn Fletcher, CEO of industrial sensors manufacturer Tribosonics, said that rather than bring in new and complex regulation, the government should encourage greater compliance with relevant ISO standards: ISO/IEC 42001 and the in-development ISO/IEC 27090, for example, set a number of AI benchmarks. In Fletcher’s view, ISO alignment would be much simpler for businesses to manage than statutory requirements. “If you’re dealing with large customers who’ll only work with you if you’re certified, that will provide a natural control,” he said.

However, Shields pointed out that some certifications can be onerous – particularly for small entities, for which the approval process “can take months”.

With that in mind, Pippa Goodall, Head of Finance at retail-focused AI solutions provider Edgify, asked: “Is there a middle ground, where enterprises can somehow still be safe, but keep the agility they need to move forward?”

For Fletcher, the answer is for companies to draw up a time-scaled roadmap towards certification. “If you’ve got a roadmap to show you’re on that journey, that will impress your clients and help your team.”

Rigorous guidance

PwC Partner and Global Head of AI Trust Leigh Bates highlighted respected best-practice initiatives in the cyber arena, such as the US NIST Cybersecurity Framework and the UK’s Cyber Essentials scheme. He suggested that UK stakeholders could collaborate on developing a similar suite of ‘AI essentials’ to build confidence in the fast-growing sector.

“We need to show that we’re consistently applying trust by design, and incorporating the right ethical principles into the development and deployment of AI use cases at scale,” Bates said. “What guardrails, testing and monitoring procedures are needed to secure greater confidence in AI-enabled applications?”

Forvis Mazars’ Director of Innovation and Digital Skills Robbie White noted that one major driver behind the success of Cyber Essentials is that adoption is now expected of companies that wish to work with the public sector. In White’s view, if an ‘AI Essentials’ scheme were to go ahead, stakeholders would need to consider how to set minimum standards that are not too onerous, but still provide rigorous guidance.

“We would need to give companies that guidance sooner rather than later,” he said. “That way, they can plot a path towards applying certain technologies in, say, three years’ time, and work on upskilling their teams accordingly.”

For Mark Cankett, Partner, Algorithm and AI Assurance at Deloitte, the question of interoperability between rulesets in different jurisdictions is every bit as important as how UK-specific regulation should work. “At some point, UK companies will be exposing EU residents to their AI,” he said. “Therefore, the extraterritorial nature of the EU AI Act will come into play. We must recognise that, as they grow, UK companies will go on a journey of being more exposed to overseas codes and frameworks.”

Prime opportunity

In tandem with regulation, another critical force in the safety debate is AI assurance. For Bates, the term requires greater clarity so that it – and the steps it involves – can be more broadly understood. He pointed to the Singapore Government’s AI Verify framework as one example of how a prominent territory is working to standardise AI governance, testing and validation to inform AI assurance. “It’s difficult to reach full transparency and explainability of AI models,” he said. “But with robust testing, we can get to a degree of comfort that AI use cases are operating within risk tolerance against responsible AI, trust and ethical principles.”

“There’s clearly a need for AI assurance and certification – not just in the UK, but globally,” said ICAEW Head of Tech Policy Esther Mallowah. “However, not enough people are focusing seriously on it right now. For me, this presents a prime opportunity for the UK to lead. Getting assurance right would also encourage domestic demand and adoption.”

Rounding off the discussion, Bacchus said: “Regulators need to look at risks and rewards in both the short and long term, rather than simply regulating for the situation that exists today.” He suggested looking at a type of regulation akin to the food industry’s five-star system. “It’s a light-touch system that adjusts itself without much intervention at all. In AI, we need a similar sort of graduated framework of certification that will provide assurance, but won’t stifle innovation.”

Coming up…

For further insights on these areas of the AI debate, join ICAEW’s inaugural AI Assurance Conference at Chartered Accountants’ Hall on Monday 19 May.
