# AI Readiness Checklist
This checklist helps Australian businesses decide if they are ready to adopt AI safely, responsibly, and effectively.
It reflects the Voluntary AI Safety Standard (10 Guardrails) released by the Australian Government and is consistent with international frameworks such as ISO/IEC 42001:2023 and the NIST AI Risk Management Framework.
Tick the boxes that apply to your organisation. A higher score means greater readiness.
## 1) Strategy & Governance
- [ ] Clear AI vision linked to business goals
- [ ] A named senior owner for AI (accountable leader)
- [ ] An AI Use Policy covering acceptable use, privacy, and IP
- [ ] Approval process for new AI initiatives
## 2) Data & Privacy
- [ ] Up-to-date data inventory and quality checks
- [ ] Compliance with the Privacy Act 1988 (APPs)
- [ ] Protections for business IP (Copyright Act 1968)
- [ ] Ability to anonymise or pseudonymise personal data
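As one minimal sketch of the pseudonymisation item above (Python; not a complete de-identification control), a direct identifier can be replaced with a keyed hash so records remain linkable without exposing the original value. The field name `customer_email`, the environment-variable key handling, and the choice of HMAC-SHA-256 are illustrative assumptions only.

```python
# Minimal pseudonymisation sketch: swap a direct identifier for a keyed hash so
# records stay linkable without exposing the original value. This alone is not
# full de-identification; pair it with access controls and Privacy Act 1988 (APP)
# review. The field name and key handling below are illustrative assumptions.
import hashlib
import hmac
import os

# In practice the key would come from a managed secret store, not a literal default.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "replace-with-a-managed-secret").encode()

def pseudonymise(value: str) -> str:
    """Return a stable, non-reversible token for a personal identifier."""
    return hmac.new(PSEUDONYM_KEY, value.strip().lower().encode(), hashlib.sha256).hexdigest()

record = {"customer_email": "jane@example.com", "purchase_total": 125.50}
safe_record = {**record, "customer_email": pseudonymise(record["customer_email"])}
print(safe_record)
```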
## 3) Risk & Impact
- [ ] Risk and impact assessments completed (bias, safety, rights)
- [ ] High-risk use cases identified and controlled
- [ ] Sign-offs recorded before deployment
## 4) People & Skills
- [ ] Human oversight for important decisions
- [ ] Staff trained on safe AI use and escalation paths
- [ ] Clear process for reporting incidents or issues
## 5) Testing & Monitoring
- [ ] Pre-deployment testing (performance, fairness, robustness)
- [ ] Ongoing monitoring for errors, drift, and safety (see the drift-check sketch after this list)
- [ ] Records kept of models, prompts, and key decisions
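To make the monitoring item concrete, one common approach is to compare recent production data against a reference sample using the Population Stability Index (PSI). The sketch below is a hedged illustration only: PSI itself, the synthetic data, and the 0.2 alert threshold are conventions assumed here, not requirements of the Guardrails.

```python
# Illustrative drift check: compare a recent sample of one numeric feature with a
# reference sample using the Population Stability Index (PSI). PSI and the 0.2
# threshold are common conventions, not requirements of the 10 Guardrails.
import numpy as np

def psi(reference: np.ndarray, current: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between two 1-D samples of the same feature."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_pct = np.histogram(reference, bins=edges)[0] / len(reference)
    cur_pct = np.histogram(current, bins=edges)[0] / len(current)
    # Clip to avoid log(0) for empty bins.
    ref_pct = np.clip(ref_pct, 1e-6, None)
    cur_pct = np.clip(cur_pct, 1e-6, None)
    return float(np.sum((cur_pct - ref_pct) * np.log(cur_pct / ref_pct)))

rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, 5_000)  # e.g. feature values at deployment
current = rng.normal(0.3, 1.2, 5_000)    # e.g. last month's production values
value = psi(reference, current)
print(f"PSI = {value:.3f}" + ("  -> investigate drift" if value > 0.2 else "  -> looks stable"))
```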
## 6) Suppliers & Partners
- [ ] Vendors align with Australia’s 10 Guardrails
- [ ] Contracts cover privacy, IP, and security requirements
- [ ] Regular review of vendor practices and updates
## Scoring (quick read)
Score one point for each ticked box (20 in total); a small counting script is sketched after this list.
- 0–7: Early stage. Build governance and staff skills first.
- 8–14: Mid stage. Ready for pilots with strong oversight.
- 15–20: Advanced. Ready to scale with continuous improvement.
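As an illustration only, the short script below counts ticked boxes in a saved markdown copy of this checklist and reports the matching band; the filename `checklist.md` and the exact checkbox syntax are assumptions for this sketch.

```python
# Illustrative scorer: count ticked checkboxes ("- [x]") in a saved copy of this
# checklist and map the total to the bands above. The filename is an assumption.
from pathlib import Path

def readiness_band(path: str = "checklist.md") -> str:
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    score = sum(1 for line in lines if line.strip().lower().startswith("- [x]"))
    if score <= 7:
        band = "Early stage"
    elif score <= 14:
        band = "Mid stage"
    else:
        band = "Advanced"
    return f"{score}/20 boxes ticked: {band}"

if __name__ == "__main__":
    print(readiness_band())
```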
Use alongside the Voluntary AI Safety Standard (10 Guardrails) for best practice adoption in Australia.
## Template Disclaimer & Licence
### Disclaimer
The purpose of this template is to provide best practice guidance on implementing safe and responsible AI governance in Australian organisations.
SafeAI-Aus has exercised care and skill in the preparation of this material. However, SafeAI-Aus does not guarantee the accuracy, reliability, or completeness of the information it contains.
The content reflects best practice principles but is intended as a starting point only. Organisations should adapt this template to their specific context and may wish to seek advice from legal counsel, governance, risk, or compliance officers before formal adoption.
This publication does not indicate any commitment by SafeAI-Aus to a particular course of action. SafeAI-Aus accepts no responsibility or liability for any loss, damage, or costs incurred as a result of the information contained in this template.
### Licence
This template is made available under the Creative Commons Attribution 4.0 International (CC BY 4.0) licence.
You are free to:
- Share — copy and redistribute the material in any medium or format.
- Adapt — remix, transform, and build upon the material for any purpose, even commercially.
Under the following terms:
- Attribution — You must give appropriate credit, provide a link to the licence, and indicate if changes were made.
Attribution statement for reuse:
“This template was developed by SafeAI-Aus and is licensed under CC BY 4.0. Source: SafeAI-Aus.”
Full licence text: https://creativecommons.org/licenses/by/4.0/