Boardroom leadership needed to manage artificial intelligence risks and drive trust, highlights ACCA

Businesses urged to take steps to maximise the opportunities of AI and lay foundations for responsible use of new technologies

Mumbai: Chief executive officers (CEOs) and chief financial officers (CFOs) need to build trust in artificial intelligence (AI) by taking steps in their organisations to manage the associated risks, underscores ACCA.

As AI plays a greater role in the accounting and financial reporting of businesses, CFOs and financial controllers will have to be confident about the adequacy of oversight and controls of AI systems.

In the first of a series of insights, AI monitor: trust, ACCA (the Association of Chartered Certified Accountants) urges finance professionals to ensure that AI governance and AI risk management are in place, beginning with:

  • Investing in AI literacy and skills development: finance professionals must invest in education and training to critically evaluate AI outputs, communicate clearly with key stakeholders, and make informed decisions.
  • Collaborating via cross-functional teams: finance professionals should actively engage with IT, data science, legal and risk management teams.
  • Developing an AI governance framework: beginning with critical uses, finance professionals should take steps within their organisation to establish clear policies, oversight and governance practices.

AI presents many opportunities to businesses, such as providing more insights from a wider array of information sources, driving greater efficiency and improving customer experiences. But it also poses a challenge to trust in accounting and financial reporting, introducing new dynamics to the traditional trust mechanisms that underpin corporate accounting.

Alistair Brisbourne, Head of Technology Research, ACCA, said: ‘Introducing AI is both about trust in the systems and trust in the people that we work with, and how we bring those two elements together.

‘CEOs and CFOs need to focus on making the changes needed to harness the many potential opportunities but also retain trust. This includes upskilling to deal with the technology and introducing new knowledge into their organisations. They also need to focus on the governance, the oversight and culture required to allow different teams to work together effectively. It’s about bringing change management and governance together.’

AI monitor: trust highlights some of the risks of AI in accounting systems, such as:

  • Influencing decision making without clearly explaining the rationale behind a forecast or recommendation;
  • Over-dependence on AI procedures in auditing and assurance, with a decline in the use of human intervention and judgement;
  • AI bias or error in fraud detection, risk assessment and compliance monitoring;
  • Over-reliance on AI-powered virtual assistants that give inaccurate or inappropriate responses.

Brisbourne said: ‘In the AI era the role of finance professionals is to focus on the outcomes driven by technology. Value lies in understanding how these outputs inform decisions and actions that drive business outcomes.’

In 2024, future issues of the AI monitor series will explore talent, risk and controls, the importance of an effective data strategy, and sustainability applications.