
When using generative AI, it is important to consider the ethical implications.

Ethical considerations are already integral to the role of accountants. The International Ethics Standards Board for Accountants (IESBA) Handbook of the International Code of Ethics for Professional Accountants defines five fundamental principles – Integrity, Objectivity, Professional Competence and Due Care, Confidentiality, and Professional Behaviour – which apply to the use of technology.

The Code was updated in 2022 to spell out specific ethical threats relating to the use of technology, such as compromised objectivity caused by undue influence of, or reliance on, technology (section 112.1), and inadequate professional competence and due care due to limited awareness and understanding of technology (113.1 A2). Whilst these relate to technology in general, they apply to AI, and to generative AI in particular.

The ICAEW Code of Ethics references the five IESBA principles, which apply to generative AI as follows:

  1. Integrity – being transparent and honest in the use of generative AI is essential, especially if it has been used to support research or analysis that is being delivered to a client. It is also important that accountants validate the integrity of model inputs and operation before relying on them, especially where inputs or training data are generated by other parties. In addition, whilst prompt engineering can be helpful in guiding generative AI tools towards better-quality responses, it can also be used to intentionally produce a specific response that justifies a pre-determined action. Procedures such as independent reviews should be implemented to help address this.
  2. Objectivity – the role that bias plays in the use of generative AI has been covered previously, but it is necessary to recognise where the outputs of generative AI tools may have been improperly influenced by specific sources (and, indeed, to be aware of and transparent about where your own professional judgement has been influenced by generative AI). For example, it is known that ChatGPT draws very heavily on Wikipedia to source its more ‘factual’ responses. Auditors should avoid automation bias, reflected in over-reliance on generative AI, and should exercise professional judgement in performing their work. Free generative AI tools may also be subject to conflicts of interest, as their providers will ultimately be looking for other ways to commercialise their products. When using generative AI, it is important for accountants to have comfort over the sources of data used to train models and how considerations of bias have been managed, particularly when using a third-party model trained on internet data for which these details may not be readily available. Accountants must ask the right questions of suppliers and obtain the evidence needed to gain this comfort.
  3. Professional competence and due care – generative AI tools do not make up for a lack of professional knowledge and skill; they are not fully qualified accountants. It is also worth being aware that many generative AI tools are not able to draw on the latest news and legislative developments, which may limit the quality of their output. In addition, public models are largely based on what is publicly available, which may not include the best inputs and may be incorrect. Furthermore, models lack the human creativity to come up with new and innovative solutions to problems. When using such tools, accountants should ensure that they understand how the tool works, including its risks and limitations, how to manage these effectively, and how to use the tool to obtain the required output, exercising professional scepticism when reviewing that output. Accountants should also continue to develop their knowledge and skills in the use of generative AI through continuing professional development.
  4. Confidentiality – as has been touched upon, consideration must be given to the type of data that is input into Generative AI tools. Respecting the confidentiality of organisational and client information means not loading confidential information into public Generative AI tools, even if such information has become publicly available. There is limited visibility and control over who the information is shared with, how it is secured and how long it is retained. This is also important for compliance with data protection laws (covered in the Legal Section of the guide).
  5. Professional behaviour – actions that discredit the profession in the use of generative AI could include being overly reliant on the output of generative AI tools and restating as fact outputs that may, as discussed earlier, be hallucinations, especially when such output is used in work that causes significant harm to other parties. There is also a need to be aware of the legal frameworks around the use of AI more broadly, some of which already exist, for example in data protection, intellectual property and employment law, alongside the more specific regulations that many jurisdictions are starting to introduce (see the section on Legal Framework and Regulations).

Aside from these principles, the focus on human good and the growing fields of AI and data ethics also require consideration. Being aware of data sources and limitations, data protection laws, the benefits and harms that individuals might face from the use of AI, and the need for transparency and openness around decision-making are all crucial. UNESCO has established a Recommendation on the Ethics of Artificial Intelligence, which all of its Member States have adopted.

The IESBA Code also points out risks related to the provision of non-assurance technology services to audit clients, such as the design and implementation of AI systems and applications, including whether accounting firms should take on such work at all. Accountants should also be aware of the risk of assuming management responsibility, for example for making decisions and operating suitable controls around AI systems, and should ensure that this responsibility always remains with the client. Self-review of AI systems and applications that a firm was involved in designing or implementing can also be a threat, so firms should implement appropriate safeguards when deciding to undertake such work, including having different teams for different engagements. IESBA has provided guidance on Applying the Code’s Conceptual Framework to Independence when working on technology-related scenarios, which applies to generative AI.

Finally, accountants should be conscious of their duty to act in the public interest and report any use of generative AI that breaches, or is suspected to breach, laws and regulations, in line with IESBA’s “Non-Compliance with Laws and Regulations” provisions, commonly known as the NOCLAR obligations. The IESBA Code of Ethics and the ICAEW guidance on NOCLAR provide guidance on how to meet these responsibilities.
