TGA SaMD compliance: what radiology practices actually need to do
The Therapeutic Goods Administration's approach to AI in healthcare centres on the classification of AI tools as Software as a Medical Device (SaMD). For radiology practices, this regulatory framework creates obligations that are split between the tool vendor and the deploying practice — and the boundary between them is less clear than most practice directors assume.
This guide breaks down what the TGA requires, what falls on the vendor, what falls on you, and the governance documentation that bridges the gap.
SaMD classification basics
The TGA classifies medical devices, including software, into risk classes based on the severity of the condition the device addresses and the significance of the information it provides to clinical decisions.
For radiology AI tools, the classification typically falls into:
- Class I (low risk): Software that provides general wellness information or supports administrative workflows. Most radiology AI tools are not Class I.
- Class IIa (medium-low risk): Software that provides information used to make clinical decisions about non-serious conditions, or that monitors patient parameters for non-critical applications.
- Class IIb (medium-high risk): Software that provides information used to make clinical decisions that could directly impact patient management for serious conditions. Many radiology AI triage and detection tools fall here.
- Class III (high risk): Software that provides information used to make clinical decisions for life-threatening or irreversible conditions. Some radiology AI tools, particularly those used in stroke detection or pulmonary embolism (PE) identification, may be classified at this level.
The higher the class, the more stringent the regulatory requirements for market entry — but the post-market obligations that affect your practice are significant across all classes above Class I.
What the vendor is responsible for
The AI tool vendor (the "sponsor" in TGA terminology) bears the primary regulatory burden for market authorisation:
Pre-market requirements:
- Clinical evidence demonstrating safety and performance
- Conformity assessment against the Essential Principles
- A quality management system (typically ISO 13485)
- Inclusion on the Australian Register of Therapeutic Goods (ARTG)
- A post-market surveillance plan
Ongoing obligations:
- Adverse event reporting to the TGA
- Maintaining the quality management system
- Updating clinical evidence as required
- Software lifecycle management per IEC 62304
- Notifying the TGA of significant changes
When a vendor tells your practice that their tool is "TGA cleared" or "ARTG listed," they are confirming they have met these sponsor obligations. This is necessary but not sufficient for your governance.
What the practice is responsible for
Here is where the gap emerges. The TGA's framework places obligations on the sponsor for the tool itself, but deploying a SaMD in a clinical environment creates governance obligations for the practice that are distinct from the vendor's regulatory compliance.
Intended use compliance
Every SaMD has a defined intended use — the specific clinical scenarios for which it has been validated and approved. The practice is responsible for ensuring the tool is used within these boundaries.
In radiology, intended use violations often happen gradually. A chest X-ray AI tool validated for adult patients begins being applied to adolescent cases. A detection algorithm intended as a triage aid begins being treated as a diagnostic tool. A mammography CAD system validated for standard screening protocols is used with non-standard imaging parameters.
What to document: For each AI tool, maintain a record of the approved intended use (from the ARTG listing and the vendor's instructions for use) alongside your actual clinical deployment scope. Any divergence should be formally risk-assessed.
Local performance verification
The TGA's pre-market assessment relies on validation data that may not reflect your clinical environment. Patient demographics, imaging equipment, scan protocols, and workflow integration all affect real-world performance.
The practice cannot rely solely on the vendor's published validation data. You need local performance evidence.
What to document: Concordance rates between AI outputs and clinical findings, tracked over time. At minimum, quarterly reviews comparing AI performance against radiologist determinations. Document any anomalies, performance shifts after software updates, and seasonal variations.
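The concordance tracking described above can be sketched in a few lines. This is an illustration only: the field names, findings, and sample data are assumptions, not TGA requirements, and a real review would segment by finding type and imaging context.

```python
from dataclasses import dataclass

@dataclass
class CaseReview:
    """One reviewed case: the AI output versus the radiologist's determination."""
    case_id: str
    ai_finding: str
    radiologist_finding: str

def concordance_rate(reviews: list[CaseReview]) -> float:
    """Fraction of reviewed cases where the AI output matched the radiologist."""
    if not reviews:
        raise ValueError("no cases to review")
    matches = sum(r.ai_finding == r.radiologist_finding for r in reviews)
    return matches / len(reviews)

# Hypothetical quarterly sample
q3_sample = [
    CaseReview("C001", "nodule", "nodule"),
    CaseReview("C002", "clear", "clear"),
    CaseReview("C003", "nodule", "clear"),
    CaseReview("C004", "clear", "clear"),
]
rate = concordance_rate(q3_sample)  # 3 of 4 cases agree -> 0.75
```

Tracking this number quarter over quarter, and re-running it after every software update, is what turns "we monitor performance" into documented evidence of it.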
Adverse event awareness
While the vendor has the primary adverse event reporting obligation, the practice has a role in recognising and escalating potential adverse events. If an AI tool produces an output that contributes to a delayed or incorrect diagnosis, the practice should:
1. Log the event in its internal incident management system
2. Notify the vendor, who may need to report to the TGA
3. Assess whether the event represents a systematic issue or an isolated case
4. Document any workflow changes made in response
What to document: An incident log with investigation notes, vendor notifications, and corrective actions. Even if the vendor determines the event does not meet TGA reporting thresholds, your documentation of the event and your response demonstrates governance.
Software update management
When a vendor releases a software update, the practice needs to assess whether the update changes the tool's intended use, performance characteristics, or risk profile. This is particularly important for updates that modify the AI model itself (as opposed to user interface or infrastructure changes).
What to document: A record of every software update applied, including the date, version change, vendor release notes, and any internal assessment of whether the update affects clinical governance arrangements.
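One lightweight way to keep such a record is a structured entry per update. The tool name, version numbers, and field layout below are hypothetical suggestions, not a mandated format.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class UpdateRecord:
    """A single entry in the software update log."""
    tool_name: str
    applied_on: date
    old_version: str
    new_version: str
    release_notes_ref: str      # where the vendor's release notes are filed
    model_changed: bool         # did the update modify the AI model itself?
    governance_assessment: str  # internal note on clinical governance impact

def needs_formal_review(record: UpdateRecord) -> bool:
    """Flag updates that modify the model itself for formal governance review."""
    return record.model_changed

# Hypothetical entry
entry = UpdateRecord(
    tool_name="ChestXR-AI",
    applied_on=date(2024, 7, 1),
    old_version="2.3.1",
    new_version="2.4.0",
    release_notes_ref="vendor-notes/2.4.0.pdf",
    model_changed=True,
    governance_assessment="Model retrained; schedule local re-verification.",
)
```

The `model_changed` flag captures the distinction the text draws: user interface or infrastructure changes can be logged and closed, while model changes should trigger local performance re-verification.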
The ARTG verification gap
A common governance failure is assuming that ARTG listing means the practice has no further regulatory obligations. The ARTG listing confirms the vendor has met their regulatory requirements. It does not confirm:
- That the tool is appropriate for your specific clinical use case
- That the tool performs adequately in your clinical environment
- That your workflows provide adequate human oversight
- That your staff are competent in using the tool within its intended use boundaries
- That you have a system for monitoring ongoing performance
Each of these is a practice-level governance obligation that exists independently of the vendor's regulatory status.
Practical steps for compliance
For each AI tool in clinical use, maintain a governance file containing:
1. ARTG listing confirmation — the tool's ARTG number, classification, and intended use statement
2. Deployment assessment — your practice's assessment of the tool's suitability for your clinical environment, including any risk mitigations
3. Clinical workflow documentation — how the tool integrates into your reporting workflow, where human oversight occurs, and how discordant findings are managed
4. Performance monitoring records — local concordance data, reviewed quarterly at minimum
5. Incident log — any AI-related incidents, investigations, and corrective actions
6. Software update log — version history with assessment of clinical governance impact
7. Staff competency records — evidence that clinicians using the tool understand its intended use, limitations, and the practice's governance arrangements
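The seven items above lend themselves to a simple completeness check per tool. A minimal sketch (the item keys mirror the list; the file contents and ARTG number are placeholders):

```python
# Required governance file items, one key per item in the list above
REQUIRED_ITEMS = [
    "artg_listing_confirmation",
    "deployment_assessment",
    "clinical_workflow_documentation",
    "performance_monitoring_records",
    "incident_log",
    "software_update_log",
    "staff_competency_records",
]

def missing_items(governance_file: dict) -> list[str]:
    """Return the required items that are absent or empty for a tool."""
    return [item for item in REQUIRED_ITEMS if not governance_file.get(item)]

# Hypothetical governance file for one tool, with two gaps
file_for_tool = {
    "artg_listing_confirmation": "ARTG 123456 (placeholder)",
    "deployment_assessment": "assessed 2024-05",
    "clinical_workflow_documentation": "workflow-v2.docx",
    "performance_monitoring_records": "Q2-2024 concordance review",
    "incident_log": "",
    "software_update_log": None,
    "staff_competency_records": "training register",
}
gaps = missing_items(file_for_tool)  # -> ["incident_log", "software_update_log"]
```

Running a check like this across every deployed tool makes drift visible before an auditor or insurer finds it.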
The convergence
The TGA's SaMD framework, RANZCR Ch.9's governance expectations, medical indemnity requirements, and the CAIOS standard all converge on the same set of practice-level obligations. They use different language and come from different regulatory directions, but they ask the same fundamental question: can your practice demonstrate that it governs its clinical AI tools responsibly?
The practices that build a unified governance infrastructure — rather than maintaining separate compliance efforts for each regulatory requirement — will find that a single evidence base satisfies all of them. A well-maintained tool register, current governance policies, documented clinical oversight, active performance monitoring, and formal incident management collectively satisfy TGA post-market obligations, RANZCR governance expectations, insurer risk assessment requirements, and CAIOS certification criteria.
The cost of maintaining these separately is duplication and drift. The cost of maintaining them as integrated governance infrastructure is minimal once established.
Start your readiness assessment
Take the free AI Governance Readiness Assessment and see where your practice stands.
Take the assessment