RANZCR Chapter 9: What every practice director needs to know
The Royal Australian and New Zealand College of Radiologists published Chapter 9 of their Standards of Practice in late 2025, establishing formal expectations for how radiology practices govern AI tools used in clinical workflows. This is the first time a specialist medical college in Australia has issued binding guidance specifically addressing AI governance in diagnostic imaging.
For practice directors, Chapter 9 is not optional reading. It sets the baseline that regulators, insurers, and courts will reference when evaluating whether your practice met its standard of care. If your practice uses AI in any clinical capacity — triage, detection, measurement, or reporting — Chapter 9 applies to you.
This article breaks down the five key sections of Chapter 9, explains what compliance looks like in practice, and highlights the gaps most practices miss.
Section 1: AI tool registration and documentation
Chapter 9 requires every practice to maintain a register of all AI tools in clinical use. This is not a spreadsheet of product names. The register must document each tool's intended clinical purpose, its regulatory status (TGA classification), the vendor, the software version in use, and when it was last reviewed.
What compliance looks like: A living register, revised whenever a tool is added, removed, or upgraded. Each entry includes the tool's intended use boundaries — what it was designed to do, and critically, what it was not designed to do.
Common gap: Most practices can name their AI tools but cannot produce documentation of TGA classification, version history, or intended use boundaries. If your register exists only in someone's memory, it does not exist.
Section 2: Risk assessment and management
The standard expects practices to conduct formal risk assessments for each AI tool before it enters clinical use, and to review those assessments periodically. Risk categories include clinical safety (false negatives, false positives), workflow impact (alert fatigue, system dependencies), data security, and patient consent.
What compliance looks like: A documented risk assessment for each tool, reviewed at least annually or whenever the tool is updated. Risk controls are assigned to named individuals, not to roles or departments.
Common gap: Practices often conduct an informal assessment when they first adopt a tool, but never revisit it. When the tool receives a major update — a new model version, expanded indications, or a change in processing pipeline — the original assessment becomes stale.
Section 3: Performance monitoring
This is where most practices fall furthest behind. Chapter 9 mandates that practices track the performance of their AI tools against clinical ground truth. For radiology, this means monitoring concordance rates between AI outputs and radiologist determinations, tracking false positive and false negative rates, and documenting any cases where AI output was materially incorrect.
What compliance looks like: A structured monitoring programme with defined metrics, review frequency, and escalation thresholds. If your chest X-ray AI flags 40% of studies as abnormal but your radiologists only agree with 12% of those flags, you need to document that discordance and have a plan to address it.
Common gap: The overwhelming majority of practices have no structured performance monitoring. They rely on informal radiologist impressions — "the AI seems to work well" — which is neither measurable nor defensible.
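The core metric in the chest X-ray scenario above is simple arithmetic: of the studies the AI flagged, what fraction did a radiologist agree with? A minimal sketch, using invented toy data that reproduces the article's 40%-flagged, 12%-agreement scenario:

```python
def flag_agreement_rate(ai_flagged: list[bool], radiologist_agreed: list[bool]) -> float:
    """Fraction of AI-flagged studies where the radiologist confirmed the finding.

    ai_flagged[i]         -- the AI flagged study i as abnormal
    radiologist_agreed[i] -- a radiologist confirmed an abnormality in study i
    """
    flagged = [i for i, f in enumerate(ai_flagged) if f]
    if not flagged:
        return 0.0
    agreed = sum(1 for i in flagged if radiologist_agreed[i])
    return agreed / len(flagged)

# Toy data: 1000 studies, 400 flagged (40%), radiologists agree with 48 flags (12%).
ai = [True] * 400 + [False] * 600
rad = [True] * 48 + [False] * 952
rate = flag_agreement_rate(ai, rad)  # 0.12
```

A monitoring programme then compares this rate against a documented escalation threshold at a defined review frequency; the threshold itself is a clinical governance decision, not something the standard fixes for you.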
Section 4: Human oversight and clinical authority
Chapter 9 is unambiguous: the radiologist retains final interpretive authority over all AI-assisted findings. AI tools are decision support, not decision makers. The standard requires practices to document how human oversight is maintained, including workflows that ensure radiologists review AI outputs before they influence clinical reports.
What compliance looks like: Documented workflows showing that AI outputs are reviewed by a qualified radiologist before being incorporated into reports. Clear policies on what happens when a radiologist disagrees with an AI finding.
Common gap: Workflow documentation often does not exist. Practices assume that because radiologists review images, oversight is automatic. But Chapter 9 asks for evidence that oversight is systematic, not incidental. Can you demonstrate that every AI-flagged finding was reviewed by a radiologist? If the answer requires you to check individual cases, your process is not documented.
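Answering "was every AI-flagged finding reviewed?" systematically means your systems can cross-check flags against sign-offs without pulling individual cases. One way to express that check, with hypothetical study IDs:

```python
def unreviewed_ai_flags(ai_flags: dict[str, bool], signed_off: set[str]) -> list[str]:
    """Return study IDs the AI flagged that lack a documented radiologist sign-off.

    ai_flags   -- study ID -> whether the AI flagged it
    signed_off -- study IDs with a recorded radiologist review
    """
    return sorted(sid for sid, flagged in ai_flags.items()
                  if flagged and sid not in signed_off)

# Hypothetical study IDs for illustration.
flags = {"CR-1001": True, "CR-1002": False, "CR-1003": True}
signed = {"CR-1001"}
gaps = unreviewed_ai_flags(flags, signed)  # ["CR-1003"]
```

If producing this list requires manually opening cases, the oversight is incidental; if it can be generated from recorded data, it is systematic — which is what Chapter 9 asks you to evidence.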
Section 5: Governance documentation and accountability
The final section ties everything together. Practices must have a documented AI governance policy, assign governance responsibilities to named individuals, and maintain an audit trail of governance activities. This includes meeting minutes, policy review dates, incident reports, and any changes to the AI tool register.
What compliance looks like: A governance policy that names the responsible individual (often the practice director or a designated governance lead), specifies review frequencies, and is actually followed. An audit trail showing that governance activities happen on schedule.
Common gap: Policies exist but are not maintained. The governance lead was assigned two years ago but has not convened a review. The policy references tools that have since been replaced. The audit trail is empty because nobody logged the reviews that did happen informally.
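An audit trail only needs to be simple and consistently used. As one possible sketch (file name, field names, and the example event are all assumptions, not a Chapter 9 requirement), governance events can be appended to a newline-delimited JSON log as they happen:

```python
import json
from datetime import datetime, timezone

def log_governance_event(log_path: str, event_type: str, detail: str, actor: str) -> dict:
    """Append one governance event (review, incident, register change) to an
    append-only JSONL audit file. Illustrative only."""
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,   # e.g. "annual_risk_review", "register_update"
        "detail": detail,
        "actor": actor,             # the named responsible individual
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(event) + "\n")
    return event

# Hypothetical event -- tool name and actor are invented for the example.
event = log_governance_event(
    "ai_governance_audit.jsonl",
    "annual_risk_review",
    "Reviewed risk assessment for ChestXR-Triage v2.4.1",
    "Dr A. Example",
)
```

The mechanism matters less than the habit: the review that happened but was never logged is, from a compliance standpoint, the review that never happened.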
Self-assessment checklist
Tick each requirement your practice currently meets. The more boxes you cannot tick, the more urgent the remediation work.
What happens next
RANZCR has not yet announced an enforcement timeline, but the direction is clear. State health departments, medical indemnity insurers, and the courts will treat Chapter 9 as the benchmark. Practices that can demonstrate compliance will be in a strong position when questions arise. Those that cannot will face uncomfortable conversations.
The standard is not asking practices to stop using AI. It is asking them to govern it properly. For most practices, the work is not technically difficult — it is organisational. Having a system that tracks tools, documents risks, monitors performance, and maintains an audit trail is what separates compliant practices from those that are simply hoping for the best.
See how your practice measures up against RANZCR Ch.9
Take the free AI Governance Readiness Assessment and see where your practice stands.