Covid, 'me too' and ESG are developments that call for (even) more attention to culture, from the board as well as from the auditor. Culture is broad, dynamic and complex, and does not lend itself to a single blueprint of what an organizational culture should look like. The absence of such a norm, together with the ''softness'' of the concept, that is, how difficult it is to measure, makes it challenging for the internal auditor to include culture in their work. At the same time, the internal auditor, as a relative outsider, can inform board members independently, including on the extent to which attitude and behavior support the strategy.
The report 'Cultiveer een gezonde cultuur' ('Cultivate a healthy culture'), a collaboration between IIA Nederland and KPMG, offers a large number of practical pointers for reflecting on your own ambition and approach as an internal auditor and, where needed, contributing to the further development of culture audits. The central question of the report is how internal auditors currently include culture and behavior in their audits, given the developments outlined above.
The report was produced using a combination of qualitative and quantitative research techniques: a survey, a group interview and in-depth discussions.
This final installment of The IIA's Global Knowledge Brief series on GRC addresses how GRC systems are evolving through the incorporation of new technologies, as well as what inherent risks are involved in embracing digital transformation. This brief also examines where internal audit fits into this conversation and how it might best aid organizations as they continue this critical journey.
The latest practice guide from IIA Nederland, ''Tekstanalyse, gewoon doen!'' ('Text analysis, just do it!'), describes the main possibilities and points of attention when applying text analysis in audits. Text analysis is a collection of automated techniques for extracting new information and usable insights from textual data. Audit practice, too, offers all kinds of questions for which text analysis adds value.
In early 2022, as a run-up to this guide, a survey was sent out among IIA members. It showed that they foresee both opportunities and challenges in the use of text analysis. This guide, from the IIA's Commissie Professional Practices (Professional Practices Committee), helps auditors with exactly that.
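To make this concrete, the sketch below shows a minimal form of text analysis over free-text audit evidence such as incident descriptions. It is purely illustrative and not taken from the guide; the example texts, stopword list and choice of a simple term-frequency count are assumptions made for this sketch.

```python
# Minimal sketch (not from the IIA guide): term-frequency analysis over free-text
# audit evidence, e.g. incident descriptions. Texts and stopwords are illustrative.
import re
from collections import Counter

documents = [
    "Payment approved without a second signature due to time pressure",
    "Supplier invoice approved by the same employee who created the purchase order",
    "Access rights not revoked after employee left the department",
]

STOPWORDS = {"the", "a", "to", "by", "who", "due", "of", "not", "after", "without"}

def tokenize(text: str) -> list[str]:
    """Lowercase the text and keep alphabetic tokens that are not stopwords."""
    return [t for t in re.findall(r"[a-z]+", text.lower()) if t not in STOPWORDS]

# Count how often each term appears across all documents; recurring terms such as
# "approved" or "employee" can point the auditor to themes worth a closer look.
term_counts = Counter(t for doc in documents for t in tokenize(doc))
print(term_counts.most_common(5))
```

Even this simple count can surface recurring themes across large volumes of text; more advanced techniques such as topic modelling or sentiment analysis build on the same basic idea of turning unstructured text into countable signals.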
The market context in which insurance companies operate is fundamentally changing. The use of data and Artificial Intelligence (AI) algorithms is growing significantly and is expected to be a key currency of future success. With the huge quantities of data created across the insurance value chain, AI provides tremendous opportunities for further automation of processes, development of new, more customer-centric products and the assessment of insurance risks. These new possibilities also make processes more complex and introduce risks that need to be managed. AI algorithms can have a direct impact on people, raising ethical and privacy questions, which in turn bring regulators and industry bodies into the discussion to prevent adverse effects without stifling the innovation and potential of AI.
Insurance companies must achieve the right balance between improving their operations with the new solutions that AI makes possible and managing the corresponding risks. This requires rigorous risk assessment and management of the development, implementation and use of AI. Its importance is reflected in the various pieces of legislation currently under development around the world, including the European Union's AI Act, which includes penalties of up to 6% of total worldwide annual turnover. Given these regulatory requirements and the potential reputational implications, AI risk cannot simply be diversified away or scaled down in proportion to company size. No matter the size of the insurance company, it can be catastrophic for reputation and business if customers are harmed by AI. That is why Internal Audit should play a role in providing assurance and advice on mitigating the risks arising from implementing AI.
The Internal Audit function can, in line with its mandate, help organizations with the balancing act between risk mitigation and business innovation. This could include developing assurance strategies for AI governance, data privacy and security; reviewing processes for potential bias; and assessing compliance with relevant laws and regulations. In addition, internal auditors can provide insights and advice that help companies understand and mitigate the risks associated with AI adoption and use.
Internal Audit should be involved from the start of new AI implementations to provide advice on how to implement AI securely and in accordance with policies and regulation. A top-down approach is advisable: start by auditing the AI strategy and governance, then test individual instances, algorithms and models, beginning with high-risk AI. This helps ensure that development is conducted in an efficient and effective manner and that controls are in place that are tailored to the risks of the specific AI implementation.
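As an illustration of what such a top-down, risk-based prioritization could look like, the sketch below ranks a hypothetical inventory of AI use cases by a simple risk score. The criteria, weights and use cases are assumptions for illustration only, not the approach prescribed in the paper.

```python
# Illustrative sketch: rank a hypothetical inventory of AI use cases so the
# highest-risk models are audited first. Scoring criteria and scales are assumptions.
from dataclasses import dataclass

@dataclass
class AIUseCase:
    name: str
    customer_impact: int   # 1 (low) .. 5 (direct impact on policyholders)
    autonomy: int          # 1 (human-in-the-loop) .. 5 (fully automated decisions)
    data_sensitivity: int  # 1 (public data) .. 5 (special-category personal data)

    @property
    def risk_score(self) -> int:
        # Unweighted sum for simplicity; a real methodology would be agreed
        # with risk management and calibrated to the organization.
        return self.customer_impact + self.autonomy + self.data_sensitivity

inventory = [
    AIUseCase("Claims fraud detection", customer_impact=5, autonomy=3, data_sensitivity=4),
    AIUseCase("Chatbot for FAQ", customer_impact=2, autonomy=2, data_sensitivity=2),
    AIUseCase("Motor insurance pricing model", customer_impact=5, autonomy=4, data_sensitivity=3),
]

# Audit plan: cover the highest-risk use cases first.
for uc in sorted(inventory, key=lambda u: u.risk_score, reverse=True):
    print(f"{uc.risk_score:2d}  {uc.name}")
```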
Internal Audit should not only provide assurance over the process of developing AI, but also perform risk-based deep dives to verify that AI implementations are compliant and working effectively. Auditing AI covers technical aspects, data governance and quality, ethical themes and business application. A multidisciplinary audit team should therefore be formed, with representatives from IT audit, data science and business audit, specific technical expertise such as actuaries, and ethics, so that each aspect is thoroughly assessed. Internal Audit departments should upskill their staff where needed to stay ahead of key new developments and to be able to independently assess the risks and plan and execute audits as required. Our research has shown that most Internal Audit departments are at an early stage of establishing the required skills and processes, and often struggle to keep up with the rapid growth in the use of AI in the insurance industry. For these reasons, this paper proposes an AI Audit Program that identifies the most important AI-related risks, possible root causes and testing strategies.
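As one example of what a risk-based deep dive could include, the sketch below compares automated decision outcomes across customer groups to flag a potential disparity for follow-up. The data, grouping and tolerance threshold are fabricated for illustration and are not part of the paper's AI Audit Program; a real audit would use production decisions and a fairness criterion agreed with the business and compliance.

```python
# Minimal bias-check sketch: compare automated approval rates across groups and
# flag a gap above an assumed tolerance. All data below is fabricated.
from collections import defaultdict

# (group, model_decision) pairs, e.g. automated claim approvals per age band.
decisions = [
    ("18-30", True), ("18-30", False), ("18-30", True), ("18-30", True),
    ("60+", True), ("60+", False), ("60+", False), ("60+", False),
]

approved: dict[str, int] = defaultdict(int)
total: dict[str, int] = defaultdict(int)
for group, outcome in decisions:
    total[group] += 1
    approved[group] += outcome  # True counts as 1

rates = {g: approved[g] / total[g] for g in total}
gap = max(rates.values()) - min(rates.values())

print(rates)
# Assumed tolerance of 0.2 for illustration; the threshold is a policy decision.
print("Potential disparity, investigate further" if gap > 0.2 else "Within tolerance")
```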
Management guru Peter Drucker once said, “[only] what gets measured, gets managed.” So, how are organizations quantifying non-financial risk? Internal audit can play a key role in helping organizations develop strategies that tackle this issue.
This Global Knowledge Brief, the second in a three-part series on governance, risk, and control (GRC), examines the challenges of quantifying non-financial risks and how companies are addressing them, as well as the important role that internal audit can play in advancing understanding in this area.
{"required":"Dit veld is verplicht","email":"Voer een geldig e-mailadres in","confirm":"De velden hebben niet dezelfde waarde","password":"Uw wachtwoord voldoet niet aan de criteria. Uw wachtwoord moet minimaal 8 tekens lang zijn en minimaal \u00e9\u00e9n letter en \u00e9\u00e9n cijfer bevatten.","phone":"Voer een geldig telefoonnummer in","saved":"Opgeslagen!","failed":"Kan niet opslaan","error":"Er is iets misgegaan"}