Building a clinical AI governance framework from scratch
Most radiology practices that use AI tools have no formal governance framework. Not because they are negligent, but because nobody told them they needed one until recently. RANZCR Chapter 9, evolving TGA guidance, and insurer expectations have changed the equation. If your practice uses AI in any clinical capacity, you need documented governance.
This guide provides a practical, prioritised path to building an AI governance framework from scratch. It is designed for practices that currently have nothing — no tool register, no governance policy, no monitoring — and need to get to a defensible position as efficiently as possible.
The approach is simple: address the highest-risk gaps first, build the foundation in the first 30 days, expand it over 60 days, and mature it by 90 days.
Before you start: Assign ownership
Governance without ownership fails. Before you begin any of the work below, designate a single person as your AI governance lead. In most practices, this is the practice director. In larger groups, it may be a senior radiologist or practice manager with clinical oversight responsibilities.
This person does not need to do all the work. They need to be accountable for ensuring it gets done, reviewed, and maintained. Document the appointment — a simple email to the practice confirming the role is sufficient to start.
Days 1–30: Foundation
The first 30 days focus on understanding what you have and documenting the basics. These are the activities that close your most critical gaps.
Week 1: Build your AI tool register
Walk through your clinical workflows and identify every AI tool in use. Include:
- Tools embedded in your PACS or RIS
- Standalone AI applications (desktop or cloud-based)
- Any research or trial tools being used alongside clinical workflows
- AI features in reporting platforms
For each tool, document:
- Tool name and vendor
- Software version currently deployed
- TGA registration status and ARTG number (if applicable)
- Intended clinical purpose (what it is designed to detect, measure, or triage)
- Intended use boundaries (what it is not designed to do)
- Date of deployment at your practice
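If you want the register in a machine-readable form from day one, the fields above map naturally onto a simple CSV. This is an illustrative sketch only: the column names, file name, and the example tool entry are assumptions, not a prescribed format.

```python
import csv

# Columns mirror the register fields listed above; the names are illustrative.
COLUMNS = [
    "tool_name", "vendor", "software_version",
    "tga_artg_number", "intended_purpose",
    "use_boundaries", "deployment_date",
]

def create_register(path):
    """Create an empty AI tool register CSV with a single header row."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(COLUMNS)

def add_tool(path, row):
    """Append one tool entry; any missing fields are left blank."""
    with open(path, "a", newline="") as f:
        csv.DictWriter(f, fieldnames=COLUMNS).writerow(row)

create_register("ai_tool_register.csv")
add_tool("ai_tool_register.csv", {
    "tool_name": "Example chest X-ray triage tool",  # hypothetical entry
    "vendor": "Example Vendor",
    "software_version": "2.1.0",
    "tga_artg_number": "",  # record the ARTG number once confirmed
    "intended_purpose": "Triage of suspected pneumothorax",
    "use_boundaries": "Not for paediatric studies",
    "deployment_date": "2024-01-15",
})
```

A plain spreadsheet with the same columns works just as well; the point is that every tool gets the same fields.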
Week 2: Conduct initial risk assessments
For each tool in your register, complete a basic risk assessment. You do not need a formal risk matrix at this stage. Answer four questions for each tool:
- What could go wrong clinically? (false negatives, false positives, workflow disruption)
- How serious would the harm be if it went wrong?
- What controls are currently in place? (radiologist review, audit processes, system alerts)
- Are those controls documented or informal?
Write the answers down. A structured spreadsheet is fine. The goal is to move from "we have not thought about this" to "we have assessed the risks and documented our reasoning."
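One row per tool, one column per question, is enough structure. As a hedged sketch, the four questions could be recorded like this; the field names and severity labels are my own illustration, not a formal risk taxonomy.

```python
# One record per tool, matching the four questions above.
# Field names and the low/moderate/high labels are illustrative only.
ASSESSMENT_FIELDS = [
    "tool_name",
    "what_could_go_wrong",   # e.g. false negatives, false positives
    "harm_severity",         # e.g. low / moderate / high
    "current_controls",      # e.g. radiologist review, audit, alerts
    "controls_documented",   # True if documented, False if informal
]

def assess(tool_name, failure_modes, severity, controls, documented):
    """Build one risk-assessment record as a plain dict."""
    return dict(zip(ASSESSMENT_FIELDS,
                    [tool_name, failure_modes, severity, controls, documented]))

row = assess(
    "Example chest X-ray triage tool",        # hypothetical tool
    "False negative on subtle pneumothorax",
    "high",
    "All studies read by a radiologist",
    False,                                    # control exists but is informal
)
```

Note the last field: a control that exists but is undocumented is exactly the gap this exercise is meant to surface.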
Weeks 3–4: Draft your governance policy
Your AI governance policy does not need to be a 50-page document. A practical policy for a small to medium practice covers:
- Scope: Which AI tools and clinical workflows are covered
- Roles and responsibilities: Who is the governance lead? Who monitors performance? Who handles incidents?
- Tool registration requirements: What must be documented before a tool enters clinical use
- Risk assessment requirements: How and when risk assessments are conducted and reviewed
- Performance monitoring requirements: What metrics are tracked, how often, and by whom
- Human oversight requirements: How radiologist authority over AI findings is maintained
- Incident management: How AI-related incidents are logged, investigated, and resolved
- Review schedule: When the policy and its components are reviewed (at minimum annually)
Sign it, date it, and distribute it to your clinical team.
Days 31–60: Operationalise
With the foundation in place, the second month focuses on making governance operational — turning documentation into practice.
Establish performance monitoring
Start simple. For each AI tool, define one or two metrics you will track monthly:
- Concordance rate: What percentage of AI-flagged findings does the radiologist agree with?
- False positive rate: How often does the AI flag something that turns out to be clinically insignificant?
- Workflow impact: Are AI-flagged studies being reviewed within the expected timeframe?
You do not need automated dashboards on day one. A monthly review of a sample of AI-assisted cases, documented in a spreadsheet, is a legitimate starting point. The key is that the monitoring is structured, scheduled, and recorded.
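Concordance rate, for example, is just agreements divided by AI-flagged findings in the monthly sample. A minimal sketch, assuming each sampled case is recorded with two yes/no fields:

```python
def concordance_rate(cases):
    """Fraction of AI-flagged findings the radiologist agreed with.

    cases: list of dicts with boolean 'ai_flagged' and
    'radiologist_agreed' fields for the monthly sample.
    """
    flagged = [c for c in cases if c["ai_flagged"]]
    if not flagged:
        return None  # nothing flagged this month; nothing to measure
    agreed = sum(c["radiologist_agreed"] for c in flagged)
    return agreed / len(flagged)

# Hypothetical four-case sample: three flagged, two agreed.
sample = [
    {"ai_flagged": True,  "radiologist_agreed": True},
    {"ai_flagged": True,  "radiologist_agreed": False},
    {"ai_flagged": True,  "radiologist_agreed": True},
    {"ai_flagged": False, "radiologist_agreed": False},
]
print(f"Concordance: {concordance_rate(sample):.0%}")  # prints "Concordance: 67%"
```

The same pattern works for false positive rate; what matters is that the denominator (the sample) and the definition of "agreed" are written down and applied the same way every month.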
Create your incident logging process
Define what constitutes an AI-related incident worth logging:
- AI output that was materially incorrect and could have affected patient care
- AI system outage during clinical hours
- Significant discordance between AI output and clinical findings
- Patient complaint related to AI-assisted diagnosis
Create a simple log template: date, tool involved, description of incident, clinical impact, investigation findings, actions taken. Communicate the process to your clinical team so they know when and how to log an incident.
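The template above can live in a spreadsheet, but if you prefer a structured record, a sketch might look like this; the field names and the example incident are illustrative, not a required format.

```python
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class IncidentEntry:
    # Fields mirror the log template above; names are illustrative.
    incident_date: date
    tool: str
    description: str
    clinical_impact: str
    investigation_findings: str = ""  # completed after review
    actions_taken: str = ""           # completed after review

# Hypothetical example: a system outage during clinical hours.
entry = IncidentEntry(
    incident_date=date(2024, 3, 4),
    tool="Example chest X-ray triage tool",
    description="Tool unavailable for two hours during clinical hours",
    clinical_impact="No reported delay to diagnosis; manual triage used",
)
print(asdict(entry))
```

Leaving the investigation and action fields blank at logging time is deliberate: staff should be able to log an incident immediately, with the follow-up recorded later.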
Conduct team training
Brief your radiologists and clinical staff on:
- Which AI tools are in use and what they do
- The intended use boundaries of each tool
- How human oversight is maintained in your workflows
- When and how to log an AI-related incident
- Who the governance lead is and how to escalate concerns
Document that the training occurred (date, attendees, topics covered).
Days 61–90: Mature
The third month focuses on stress-testing your framework and filling remaining gaps.
Conduct your first governance review
Schedule a formal governance review meeting. Review:
- The tool register: Is it complete and current?
- Risk assessments: Have any tools been updated since the initial assessment?
- Performance monitoring: What do the first two months of data show?
- Incidents: Were any logged? Were they handled appropriately?
- The governance policy: Does anything need updating based on what you have learned?
Document the meeting: date, attendees, items reviewed, decisions made, actions assigned.
Build your evidence vault
Collect all governance documentation in a single, accessible location:
- Tool register
- Risk assessments
- Governance policy
- Performance monitoring reports
- Incident logs
- Training records
- Governance meeting minutes
This is your evidence vault. If a regulator, insurer, or legal counsel asks how you govern AI, this is what you hand them.
Set the cadence
Governance is not a project — it is an ongoing operational activity. Set the recurring schedule:
- Monthly: Performance monitoring review, incident log review
- Quarterly: Tool register review, governance meeting
- Annually: Full governance policy review, risk assessment review, team training refresh
The 90-day outcome
At the end of 90 days, your practice will have a documented AI governance framework that satisfies the core requirements of RANZCR Chapter 9, positions you well for insurer reviews, and provides the evidence trail that protects you in a medico-legal context.
It will not be perfect. Governance frameworks mature over time as you learn what works for your practice. But the difference between having a basic framework and having nothing is the difference between a defensible position and an indefensible one.
The hardest part is starting. Everything after that is maintenance.
Start with a free governance readiness score
Take the free AI Governance Readiness Assessment and see where your practice stands.