AUASB Information

Impact of AI on Auditors

Issued July 2025

Introduction

This webpage outlines some considerations for auditors:

  1. in the use of artificial intelligence (AI) tools on audit engagements (AI audit tools);
  2. where an audited entity uses AI in producing information that is, or could be, material to financial and sustainability reports subject to audit; and
  3. in the use of AI to improve communication in internal or external documents.

This webpage does not deal with the use of AI in an audit firm’s internal quality management processes.

Background

The application of AI technology presents auditors with both opportunities and risks.

In this webpage, AI audit tool includes automated software tools that:

  1. extract information from unstructured sources;
  2. identify patterns or anomalies in information; and/or
  3. make ‘judgements’ or ‘recommendations’ based on an analysis of information for consideration by the auditor.

An AI audit tool uses criteria that the tool itself may change based on the information available, the identification of patterns and relationships, analysis used by similar applications and/or analysis of past exceptions identified.

Use of AI tools in audits

The use of AI audit tools can enhance the effectiveness and efficiency of an audit. However, AI audit tools must be used appropriately, or audit quality may be compromised.  

Subparagraph 32(f) of ASQM 1 Quality Management for Firms that Perform Audits or Reviews of Financial Reports and Other Financial Information, or Other Assurance or Related Services Engagements requires any AI audit tools obtained or developed for use in performing audits to be appropriate and to be appropriately implemented, maintained, and used.

Examples of the use of AI audit tools include:

  1. Risk assessment – To assist in identifying unusual patterns or trends, or anomalies in information that may require audit focus and a change in the nature and extent of audit procedures;
  2. Extracting information – To assist in extracting, classifying and analysing information from unstructured documents and sources internal and/or external to the audited entity for use in independently assessing the completeness and accuracy of information supporting the audited entity’s financial/sustainability report;
  3. Analysing data – To assist in identifying possible anomalies, inconsistencies or omissions in information supporting the financial/sustainability report (an illustrative sketch follows this list).
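
As a rough illustration of the risk-assessment and data-analysis uses above, the sketch below shows one way an anomaly-flagging step might be prototyped. It is a minimal sketch only, assuming a Python environment with pandas and scikit-learn; the journal-entry fields, the IsolationForest model and the contamination setting are illustrative assumptions, not an AUASB-prescribed approach.

```python
# Illustrative sketch only: flagging potentially anomalous journal entries
# for auditor follow-up. Field names, features and parameters are assumptions.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical journal-entry extract (in practice, loaded from the entity's ledger).
entries = pd.DataFrame({
    "amount": [1200.0, 950.0, 1100.0, 87000.0, 1010.0, 990.0],
    "hour_posted": [10, 11, 14, 23, 9, 15],   # posting time (24-hour clock)
    "is_weekend": [0, 0, 0, 1, 0, 0],         # posted on a weekend?
    "manual_entry": [0, 0, 0, 1, 0, 0],       # manual vs automated posting
})

# Unsupervised model that scores observations by how isolated they are.
# The contamination rate (expected share of anomalies) is an assumption
# the auditor would need to justify and revisit.
model = IsolationForest(contamination=0.2, random_state=0)
entries["flag"] = model.fit_predict(entries)  # -1 = flagged as anomalous

# Flagged entries are candidates for further audit procedures, not conclusions.
print(entries[entries["flag"] == -1])
```

Output from a step like this informs the auditor's risk assessment and the nature and extent of further procedures; it is not audit evidence or a conclusion in itself.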

Consideration on using AI tools in audits

AI audit tools may be “black-box” systems: it may be challenging or impracticable to understand how a tool arrives at its conclusions or outputs.

Firms also need to consider and manage data privacy and data security risks, as much of the data that auditors use is proprietary data of the audited entity.

Some considerations for auditors in planning, developing and implementing AI audit tools include:
  1. Assessing and approving tools – Tools should be reviewed, assessed and approved for use at firm level, whether developed internally or externally. Firm tools should be reviewed and approved for specific uses on each engagement. Engagement specific tools and modified firm tools should also be subject to appropriate review and approval by suitably qualified, experienced and senior partners/specialists;
  2. Capability and capacity – Partners and staff using AI audit tools should be properly trained to understand and use such tools appropriately and effectively, including knowing their limitations and avoiding ‘automation bias’;
  3. Cost/benefit – The costs and benefits of using an AI tool in the specific circumstances should be properly assessed having regard to the other matters listed here before work commences;
  4. Understanding the business – The auditor should have a good understanding of the business and the factors that need to be taken into account in the model. After the first year in which an AI audit tool is used for a particular aspect of the audit engagement, consideration should be given to how changes in factors such as the business, operations, contracts and data availability may have affected the application of the tool and its outputs;
  5. Planning – Sufficient time should be allowed to effectively adapt and implement a tool on an audit engagement, including time to address unanticipated issues;
  6. Alternative procedures – Sufficient time should be allowed to adopt alternative procedures if the tool proves to be ineffective, particularly in the first year of use on an engagement. Consideration should be given to using a tool in parallel with other procedures in the first year;
  7. Inputs – Considerations may include:
    1. Data used – Assessing what data is being used by the tool and what relevant data is not being taken into account;
    2. Currency – Whether data is sufficiently current;
    3. Biases – Biases in data sources that could affect the results (e.g. if the application relies on the audited entity’s own data or is for non-comparable entities);
    4. Reliability – Whether the data used is reliable and independent, including testing of data used and relevant controls;
    5. Assumptions – Whether assumptions used are reasonable and supportable;
  8. Process – AI can create challenges that go beyond ‘black boxes’ because the logic used can change and code reviews and reperformance may not be possible. Considerations may include:
    1. Logic – The data and logic used by the model should be understood and reviewed;
    2. Transparency – To enable the logic review, the tool should produce an ‘audit trail’ showing the data used and the logic applied to that data (a minimal logging sketch follows this list);
    3. Changes – Any changes in the data used, assumptions and logic should be understood and reviewed. The review should ensure that appropriate changes have been taken into account;
    4. Ability to challenge – The auditor may be unable to challenge information and judgements underlying the entity’s financial/sustainability report if the data used and logic applied by the AI tool cannot be explained;
  9. Outputs – Considerations may include:
    1. Reasonableness – The auditor should consider the reasonableness of the output;
    2. Testing – Outputs should be test checked for accuracy (e.g. accuracy of summarising information from unstructured documents);
    3. Scenarios – How the tool deals with different circumstances and conditions should be tested;
    4. Professional scepticism – Auditors should maintain professional scepticism and critically evaluate outputs, including considering the possibility of bias in the results and AI hallucination (i.e. that the tool generates inaccurate or misleading information as factual);
  10. Review – The design, use, results and conclusions should be reviewed by the audit partner and firm specialists. AI tools are not a substitute for human judgement;
  11. Cybersecurity and privacy – Ensuring that any confidential information of the audited entity is properly protected in accordance with best cybersecurity practices and relevant privacy laws;
  12. Documentation – Such that another professional can understand the matters outlined above, including the design and use of the tool, the specific data and assumptions used, the logic applied, outputs and the basis for any conclusions reached by the auditor.
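
To make the transparency and documentation considerations above (items 8.2 and 12) more concrete, the sketch below shows one way a firm might capture an ‘audit trail’ for each tool run. It is a minimal sketch in plain Python; the wrapper function, record fields and JSON-lines log file are illustrative assumptions rather than a prescribed format.

```python
# Illustrative sketch only: capturing an 'audit trail' for each AI-tool run so a
# reviewer can see what data went in, what settings were applied and what came out.
# The record structure and hashing choice are assumptions, not a prescribed format.
import hashlib
import json
from datetime import datetime, timezone

def run_with_audit_trail(tool_fn, input_records, parameters, log_path="ai_tool_runs.jsonl"):
    """Run `tool_fn` on `input_records` and append a reviewable log entry."""
    input_bytes = json.dumps(input_records, sort_keys=True, default=str).encode("utf-8")
    outputs = tool_fn(input_records, **parameters)
    log_entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_sha256": hashlib.sha256(input_bytes).hexdigest(),  # fingerprint of the data used
        "record_count": len(input_records),
        "parameters": parameters,                                 # logic/settings applied
        "outputs": outputs,
    }
    with open(log_path, "a", encoding="utf-8") as log_file:
        log_file.write(json.dumps(log_entry, default=str) + "\n")
    return outputs

# Hypothetical usage: a trivial 'tool' that flags amounts above a threshold.
def threshold_flagger(records, threshold):
    return [r for r in records if r["amount"] > threshold]

flagged = run_with_audit_trail(
    threshold_flagger,
    [{"id": 1, "amount": 500.0}, {"id": 2, "amount": 250000.0}],
    {"threshold": 100000.0},
)
```

A log of this kind supports the logic review, review by the audit partner and firm specialists, and documentation that another professional can understand.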

Impact of the use of AI by audited entities

The auditor should consider how the use of AI affects the information used in financial/sustainability reports.

Auditors must consider the risk that technology poses to entities as part of the auditor’s risk assessment process. Auditors should exercise caution not to over-rely on information generated using technology (automation bias).

Audited entities may rely on AI applications in processes that produce or support material information or judgements in financial/sustainability reports. Auditors need to understand these applications and identify and assess the related risks when determining the nature and extent of audit work.

The considerations listed above for using AI audit tools on understanding the business (item 4 above) and assessing inputs, processes and outputs (items 7, 8 and 9 above) are similarly relevant in gaining sufficient appropriate audit evidence on the completeness and accuracy of information produced by the audited entity’s AI applications.

The auditor should also consider the control environment, the audited entity’s systems and processes (including general IT controls), as well as management review of inputs, assumptions, logic and outputs.

There may be additional challenges where information is generated by third party proprietary AI applications. If there are restrictions on the auditor accessing the logic, data and assumptions used, there may be a limitation on scope if suitable alternative procedures are not possible.

Auditors should also be alert to the possibility of AI being used to fabricate documents that purport to be from third parties and support information in financial and sustainability reports. Auditors may need to place greater reliance on independent third-party data sources, such as direct confirmations, to validate the authenticity of evidence.

Improving communication

AI applications are available to assist in improving an auditor’s written communications. For example, AI applications may assist the auditor in writing for the audience, using clear and concise wording, improving grammar, and clearer presentation formats.

Consideration should be given to cybersecurity threats and privacy of information. Confidential information should not be provided to an external application.  Ideally, tools should not send information outside the firm.

Care should be taken in using AI generated summaries of documents.  The summaries may not be accurate, may omit important aspects or nuances, and may use information out of context.

Documents generated by AI from web searches and third-party sources may not be reliable.

Important note

This webpage outlines some considerations for practitioners in applying AUASB standards when using AI audit tools in an audit or assurance engagement or where an audited entity uses AI in producing material information supporting a financial report or information in a sustainability report.  It is not an authoritative publication of the AUASB and practitioners should use their own professional judgement when conducting an audit or assurance engagement.

The circumstances of the audited entity’s business, identified risks, systems, processes and controls will differ between entities. This will affect the use of any AI audit tools and the nature and extent of audit or assurance work required.

This webpage does not establish new principles or amend existing standards. It is not intended to be a substitute for compliance with relevant AUASB standards.

Guidance on the audit implications of using AI may be developed as part of the International Auditing and Assurance Standards Board (IAASB) technology project.  This webpage may be updated or replaced with regard to any future IAASB guidance.

Copyright

© 2025 Auditing and Assurance Standards Board (AUASB).  The text, graphics and layout of this webpage are protected by Australian copyright law and the comparable law of other countries.  Reproduction in unaltered form (retaining this notice) is permitted for personal and non‑commercial use, acknowledging the AUASB as the source. Otherwise, no part of this webpage may be reproduced, stored or transmitted in any form or by any means without the prior written permission of the AUASB. Requests and enquiries concerning reproduction and rights for commercial purposes should be addressed to the ‘Auditing and Assurance Standards Board, PO Box 204, Collins Street West, Melbourne, Victoria 8007’ or sent to [email protected].
