
When deploying a new AI tool, organisations need to carefully consider and manage the ethical and legal risks.

Yet the current lack of clear, specific AI regulation in the UK contributes to an overall feeling of unease among dealmakers about how to implement AI effectively, especially generative AI. The lack of clarity, structure, oversight and standards could put those who use AI in an exposed position.

The Corporate Finance Faculty’s aim is to provide a high-level overview of the legal frameworks that could affect decisions around AI use in the UK and EU, while highlighting where further ICAEW guidance can be found on legal and ethical considerations. We offer suggestions to enable responsible AI adoption, but we do not set out the actions or steps needed to comply with specific legal frameworks. Tailored professional advice should be sought before taking, or refraining from taking, any action to mitigate the ethical and legal risks associated with implementing a new AI tool.

Legal and regulatory frameworks

Industries and regions around the world have varying standards and laws regulating the use of data and AI tools, and these need to be considered when introducing a new AI tool into the M&A process. They include, for example:

  • AI-specific laws (see more detail below)
  • Data protection laws
  • Intellectual property and copyright laws
  • Fairness legal frameworks 

We explain how these legal risks arise in the risk section.

ICAEW's Generative AI guide highlights the legal considerations specifically related to generative AI. Tailored legal advice should be sought to mitigate legal risks associated with implementing a new AI tool. The laws need to be considered in the context of the creation or sale of AI and the use of third-party AI tools. 

AI-specific laws

UK law and regulation

In February 2024, the Department for Science, Innovation and Technology released guidance setting out the considerations that regulators may wish to take into account when developing tools and guidance to implement the UK’s approach to AI regulation and its five voluntary regulatory principles. As of September 2024, there are no AI-specific laws in the UK, but the July 2024 King’s Speech presented to parliament contained two bills that could affect the use of AI tools and the data they use.

Faculty policy work on AI

The faculty has been involved in policy work related to AI. We outline these developments and market recommendations below. 

The Select Committee on Artificial Intelligence was appointed by the House of Lords on 29 June 2017 “to consider the economic, ethical and social implications of advances in artificial intelligence”. In its 2018 report, AI in the UK: ready, willing and able?, the committee considered the UK’s readiness for AI, as well as the ethical and legal considerations of deploying such models. Lord Clement-Jones chaired this committee and, at the time, was also on the board of the ICAEW Corporate Finance Faculty.

Since 2019, ICAEW has been working with the All-Party Parliamentary Group on Artificial Intelligence (APPG AI) to address the big public-policy, financial and practical implications of AI for corporate decision-making and investment. That same year, the faculty produced a white paper titled AI in Corporate Advisory which covered the practical potential of, and challenges for, the use of AI specifically in corporate transactions throughout the deal process.

In 2021, the Centre for Data Ethics and Innovation (CDEI) published The roadmap to an effective AI assurance ecosystem, aimed at ensuring the UK is the “most trusted and pro-innovation system for AI governance in the world”. The report sets out the belief that “strong, existing professional services firms, alongside innovative start-ups and scale ups, will provide a range of services to build justified trust in AI”. It also illustrates the other actors in the AI assurance ecosystem.

In 2022, the Corporate Finance Faculty convened an expert group to investigate how the sector can ready itself for the Age of AI. The faculty contributed expertise to Understanding UK AI R&D commercialisation and the role of standards, a report published by the Department for Digital, Culture, Media & Sport (DCMS) and the Office for AI in May 2022.
In March 2023, the government published a white paper, A pro-innovation approach to AI regulation, describing its proposed approach to regulating AI, for consultation; ICAEW’s response can be found here. In August 2023, the House of Commons Science, Innovation and Technology Committee released an interim report, The governance of artificial intelligence, with recommendations to the government.

EU law and regulation

In March 2024, the European Union passed its extensive Artificial Intelligence Act (AI Act), which aims to ensure that AI systems deployed there are “safe, transparent, traceable, non-discriminatory and environmentally friendly”. There is, however, a transitional period before full implementation, with most rules of the AI Act applying from 2 August 2026.

In June 2024, the EU launched its AI Office, which is responsible for overseeing the implementation of the AI Act and can apply sanctions where necessary.

The requirements of the AI Act are rolling out in stages. The Act entered into force on 1 August 2024, and its obligations then apply as follows:

  • From 2 February 2025, AI systems that pose unacceptable risks are banned (systems that could be a threat to individuals, such as social scoring, real-time remote biometric identification and behavioural manipulation systems)
  • By 2 May 2025, codes of practice for general-purpose AI are due to be ready
  • From 2 August 2025, transparency and other obligations on general-purpose AI systems apply

The Act applies to providers of general-purpose AI systems that are placed on the market or “put into service” within EU member states. Both providers and deployers of AI tools fall within the scope of the Act where the outputs of high-risk systems are used in the EU. Seek legal advice when considering whether your AI operations are subject to the Act.

Read more about the key obligations under the EU AI Act.

Ethical considerations

For Chartered Accountants, including those working within advisory, ethical considerations are integral to their roles. Corporate financiers who are members of other professional bodies may also have relevant ethical obligations.

The International Ethics Standards Board for Accountants (IESBA) Handbook of the International Code of Ethics for Professional Accountants defines five fundamental principles: integrity, objectivity, professional competence and due care, confidentiality, and professional behaviour. These principles also apply to the use of technology.

Responsible AI adoption by chartered accountants requires a commitment to the five fundamental ethical principles. 

Demonstrating this commitment can involve actions such as the following:

  • Transparency and explainability: Disclose AI use in deal or investment processes and ensure stakeholder understanding
  • Fairness and bias mitigation: Regularly assess and correct AI algorithms for bias (a simple illustration of such an assessment is sketched after this list)
  • Security and privacy: Establish clear policies on data use and protection in AI tools
  • Accountability: Define policies for AI-based decisions and maintain human oversight
  • External collaboration: Stay informed on AI developments, collaborate with regulators and stakeholders to develop ethical guidelines, standards, and best practices for AI in M&A
  • Supplier engagement: Work with AI tool suppliers to understand their systems, testing and data provenance used for training
  • Learning and development: Train staff on ethical decision-making, and equip them with increased AI literacy to understand the implications of AI outputs
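
To make the “regularly assess” step in the fairness and bias mitigation point more concrete, the sketch below shows one very simple check: comparing an AI screening tool’s selection rates across two groups and flagging a large gap for review. It is a minimal illustration in Python using hypothetical records, group labels and an illustrative 80% threshold; it is not a complete bias audit, and the appropriate tests and thresholds should be agreed with specialist advisers.

  # Minimal sketch (hypothetical data and threshold): compare an AI screening
  # tool's selection rates across groups and flag large disparities for review.
  from collections import defaultdict

  # Hypothetical records: (group label, whether the tool shortlisted the case)
  records = [
      ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
      ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
  ]

  def selection_rates(rows):
      """Return the share of positive (shortlisted) decisions per group."""
      totals, positives = defaultdict(int), defaultdict(int)
      for group, decision in rows:
          totals[group] += 1
          positives[group] += int(decision)
      return {group: positives[group] / totals[group] for group in totals}

  rates = selection_rates(records)
  benchmark = max(rates.values())

  # Flag any group whose selection rate is below 80% of the highest group's rate
  # (an illustrative threshold only; the right test depends on context and advice).
  for group, rate in sorted(rates.items()):
      ratio = rate / benchmark
      status = "review" if ratio < 0.8 else "ok"
      print(f"{group}: selection rate {rate:.0%}, ratio vs highest {ratio:.2f} -> {status}")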

Further resources

Disclaimer

This AI in Corporate Finance content is being provided for information purposes only. ICAEW will not be liable for any reliance you place on the information in this material. You should seek independent advice. © ICAEW 2024