SAAF Hackathons: Building Trusted AI for Internal Audit

News 05/02/2025

Audit, risk and technology teams are exploring how AI can support their work, yet many professionals still struggle to implement it practically and responsibly. Tools evolve quickly, but questions about safety, explainability, governance and compliance with legislation such as the EU AI Act remain open.

The Shared Audit Agents Framework (SAAF) addresses this challenge. Developed collaboratively by ISACA, NOREA and IIA Netherlands, SAAF provides a shared foundation for building AI-supported audit processes that remain understandable, manageable and reproducible across organizations.

Why SAAF matters

Internal auditors increasingly face repetitive manual work and fragmented AI offerings. SAAF shifts the focus to learning by building, helping audit professionals understand how AI Agents can be designed responsibly and aligned with audit methodology, governance expectations and regulatory requirements.
This collaborative approach supports the professional judgment of auditors and strengthens the role of internal audit within the broader governance structure.

What we build together

Throughout a series of hackathons, participants co-create open-source AI Audit Agents designed around four key components:

  • Prompts: domain-specific audit prompts
  • Tools: reusable scripts and utilities
  • Regulatory alignment: mappings to standards such as SOC2 and DORA
  • Outputs: consistent, machine-readable audit deliverables

This shared structure allows organizations to implement the agents in their own environments—Copilot, Gemini, Claude or local LLMs—without vendor lock-in.
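
As a rough illustration only (SAAF, as described here, does not prescribe a specific format, and every name, file and control reference below is hypothetical), an agent built from these four components could be captured as plain, portable data:

```python
from dataclasses import dataclass, field
import json

@dataclass
class AuditAgent:
    """Hypothetical sketch of a SAAF-style agent built from the four components above."""
    name: str
    prompts: dict[str, str]                    # domain-specific audit prompts
    tools: list[str]                           # reusable scripts and utilities
    regulatory_mapping: dict[str, list[str]]   # mappings to standards such as SOC2 and DORA
    output_schema: dict = field(default_factory=dict)  # shape of the machine-readable deliverable

# Fictional example: an access-management review agent.
# All identifiers, scripts and control references are illustrative only.
agent = AuditAgent(
    name="access_review",
    prompts={"walkthrough": "Summarise the joiner/mover/leaver process and flag control gaps."},
    tools=["export_user_list.py", "compare_to_hr_roster.py"],
    regulatory_mapping={"SOC2": ["CC6.1"], "DORA": ["Art. 9"]},
    output_schema={"finding": "str", "evidence": "list[str]", "risk_rating": "str"},
)

# Because the definition is plain data, the same JSON can be handed to
# Copilot, Gemini, Claude or a local LLM wrapper without changes.
print(json.dumps(agent.__dict__, indent=2))
```

Keeping prompts, tools, regulatory mappings and output formats as plain data is what makes such an agent straightforward to adapt privately to a specific environment, rather than being rewritten for a single vendor.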

How the hackathons work

Following SAAF’s high-level co-creation process, participants:

  1. Identify shared audit needs
  2. Translate these into simple reusable design patterns
  3. Build helpful agents focused on clarity, safety and explainability
  4. Adapt the agents privately within their own environment
  5. Reflect and improve together through lessons learned

The result is a growing, validated open-source knowledge base that supports innovation across the audit profession.

Who should join

These sessions are designed for organizations that want to apply AI in audit and risk without compromising governance, control or compliance:

  • Heads of Internal Audit & Risk
  • IT auditors and technology risk professionals
  • Data, AI and innovation teams

Participation is possible at different levels:

  • User
  • Co-developer
  • Strategic partner

Upcoming sessions

The next hackathon will take place soon.

Accepting the invitation implies a commitment to attend. New participants will not be admitted after session 4.

CPE eligibility

Participation qualifies for CPE credit. The exact number of credits per session will be communicated shortly.

Stay informed

Would you like to take part in this collaborative AI initiative and work alongside peers from ISACA, NOREA and IIA Netherlands?

Registration details will be published here soon.