TGA's evolving approach to AI as a medical device
The Therapeutic Goods Administration is the Australian regulator responsible for ensuring that medical devices — including software — are safe and fit for their intended purpose. As AI-powered clinical tools proliferate in radiology, the TGA's classification framework is becoming one of the most important regulatory considerations for any practice deploying these tools.
This article explains the current TGA classification framework for AI and machine learning-based Software as a Medical Device (SaMD), what recent guidance updates mean for radiology practices, and what you should be documenting today.
What is Software as a Medical Device?
The TGA defines SaMD as software intended for one or more medical purposes that performs those purposes without being part of a hardware medical device. In radiology, this includes AI tools that detect abnormalities on imaging, triage studies by urgency, measure anatomical structures, or generate quantitative analyses.
The critical distinction is intended purpose. A general image viewer is not a medical device. But software that analyses a chest X-ray and flags potential nodules is performing a diagnostic function and falls within the TGA's regulatory scope. If the software influences clinical decisions, it is almost certainly SaMD.
The classification framework: Class I through Class III
The TGA classifies medical devices on a risk-based scale. For SaMD, the classification depends on two factors: the seriousness of the condition the software addresses, and the significance of the information the software provides to the clinical decision.
Class I: Low risk
Software that provides information to support clinical decisions about non-serious conditions. Examples include wellness apps or general health monitoring tools. Most radiology AI tools do not fall into this category because they address conditions where a missed finding could cause harm.
Class IIa: Low-moderate risk
Software that provides information to support decisions about non-serious conditions where the information is important to the clinical decision, or software addressing serious conditions where the information is not the primary basis for the decision. Some radiology workflow tools — such as study prioritisation algorithms that flag urgent cases for earlier review but do not provide diagnostic information — may fall here.
Class IIb: Moderate-high risk
Software that provides information that is the primary basis for clinical decisions about serious conditions. This is where most diagnostic radiology AI tools land. A chest X-ray AI that identifies potential pneumothorax, a mammography AI that highlights suspicious lesions, or a CT AI that measures coronary calcium scores — these all provide information that directly influences diagnosis of serious conditions.
Class III: High risk
Software intended to directly treat or diagnose life-threatening conditions. In radiology, this classification is rare but could apply to AI tools used in emergency stroke triage where the software's output directly determines treatment pathways.
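The two-factor logic described above can be sketched as a simple lookup. This is an illustrative simplification, not the TGA's official classification rules: the factor values and the mapping are assumptions chosen to mirror the class descriptions in this article, and any real classification decision rests with the manufacturer and the TGA.

```python
# Illustrative sketch of the two risk factors described above:
# seriousness of the condition, and significance of the software's
# information to the clinical decision. The mapping is a simplified
# assumption, not the TGA's official decision logic.

def indicative_class(seriousness: str, significance: str) -> str:
    """Return an indicative device class from the two risk factors.

    seriousness:  "non-serious", "serious", or "life-threatening"
    significance: "informs decision", "primary basis",
                  or "directly diagnoses/treats"
    """
    if seriousness == "life-threatening" and significance == "directly diagnoses/treats":
        return "Class III"
    if seriousness == "serious" and significance == "primary basis":
        return "Class IIb"
    if seriousness == "serious" or significance == "primary basis":
        return "Class IIa"
    return "Class I"

# A chest X-ray AI flagging potential pneumothorax: serious condition,
# output is the primary basis for the finding.
print(indicative_class("serious", "primary basis"))  # Class IIb
```

The sketch makes the article's point concrete: most diagnostic radiology AI lands in Class IIb because it hits both the "serious condition" and "primary basis" branches at once.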
Recent TGA guidance updates
The TGA has issued several guidance documents that signal where regulation is heading:
Predetermined change control plans
The TGA is moving towards allowing manufacturers to define an envelope of acceptable changes to their AI models upfront, rather than requiring a new regulatory submission for every model update. This mirrors the FDA's approach in the United States. For practices, this means you will need to verify that deployed software versions fall within the manufacturer's approved change control plan. If a vendor pushes an update outside that plan, the tool may no longer be operating under its TGA registration.
Real-world performance monitoring
The TGA is increasingly interested in post-market surveillance — how SaMD performs in real-world clinical use, not just in controlled validation studies. This aligns with RANZCR Chapter 9's requirement for practices to monitor AI tool performance. Practices that already track concordance rates and incident reports will be well-positioned.
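Concordance tracking of the kind mentioned above can start very simply. The following is a minimal sketch, assuming the practice keeps per-study records of whether the radiologist's final report concurred with the AI output; the record shape and the 90% review threshold are illustrative assumptions, not TGA or RANZCR requirements.

```python
# Minimal real-world performance tracking sketch. The record fields
# and the review threshold below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class StudyRecord:
    study_id: str
    radiologist_agreed: bool  # did the final report concur with the AI output?

def concordance_rate(records: list[StudyRecord]) -> float:
    """Fraction of studies where the radiologist concurred with the AI."""
    if not records:
        return 0.0
    return sum(r.radiologist_agreed for r in records) / len(records)

records = [
    StudyRecord("CXR-001", True),
    StudyRecord("CXR-002", False),
    StudyRecord("CXR-003", True),
    StudyRecord("CXR-004", True),
]
rate = concordance_rate(records)
print(f"Concordance: {rate:.0%}")  # Concordance: 75%
if rate < 0.90:  # illustrative review threshold
    print("Below review threshold -- escalate to governance review")
```

Even a spreadsheet-level version of this record is enough to demonstrate the post-market monitoring posture the TGA is moving towards.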
Increased enforcement activity
The TGA has indicated it will be more active in identifying and acting on unregistered SaMD. Practices using AI tools that are not TGA-registered — whether research prototypes, tools purchased from overseas vendors without Australian registration, or software that has evolved beyond its original registration scope — face increasing regulatory risk.
What this means for your practice
The regulatory landscape is shifting more responsibility onto practices, not just manufacturers. Here is what you should be doing:
Practical checklist for practices
- Confirm TGA registration status for every AI tool in clinical use. Check the Australian Register of Therapeutic Goods (ARTG) and document the ARTG number, classification, and intended purpose
- Record software versions currently deployed and compare them against the manufacturer's approved change control plan or registered version
- Document intended use boundaries for each tool — what it is registered to do and, equally importantly, what it is not registered to do
- Flag unregistered tools — if you are using AI software that is not on the ARTG, document the rationale and assess the regulatory risk. Research-use-only tools should not be influencing clinical reports
- Monitor TGA guidance updates — subscribe to the TGA's SaMD regulatory updates. The framework is evolving and what is compliant today may not be tomorrow
- Maintain version change logs — when a vendor updates their AI tool, document what changed, when, and confirm it falls within the approved change control plan
- Integrate with your governance framework — TGA compliance is not a standalone activity. It should feed into your broader AI governance documentation, risk assessments, and performance monitoring
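The checklist above implies a per-tool record. Here is one possible shape for that record, sketched as a simple registry entry; every field name, the product name, the ARTG number, and the version strings are hypothetical placeholders, not real registrations.

```python
# Sketch of a per-tool registry entry covering the checklist items:
# ARTG details, intended purpose, deployed version vs. the approved
# change control plan, and a version change log. All values below are
# hypothetical placeholders.
from dataclasses import dataclass, field

@dataclass
class AIToolRecord:
    name: str
    artg_number: str              # as documented from the ARTG entry
    device_class: str             # e.g. "Class IIb"
    intended_purpose: str         # what it is (and is not) registered to do
    deployed_version: str
    approved_versions: list[str]  # versions within the change control plan
    change_log: list[str] = field(default_factory=list)

    def version_in_plan(self) -> bool:
        """Is the deployed version covered by the approved change control plan?"""
        return self.deployed_version in self.approved_versions

tool = AIToolRecord(
    name="ChestXR-Detect",        # hypothetical product
    artg_number="000000",         # placeholder, not a real ARTG number
    device_class="Class IIb",
    intended_purpose="Flags suspected pneumothorax on adult chest X-rays",
    deployed_version="2.3.1",
    approved_versions=["2.2.0", "2.3.0", "2.3.1"],
    change_log=["2024-07: vendor update 2.3.0 -> 2.3.1, within change control plan"],
)
print(tool.version_in_plan())  # True
```

A record like this, kept current, is exactly the documentation trail the enforcement section below argues for: registration verified, versions tracked, changes logged.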
The enforcement reality
Today, TGA enforcement of SaMD in radiology is largely reactive — investigations tend to follow adverse events or complaints. But the trajectory is towards proactive oversight. The TGA is building its capability to audit SaMD in clinical use, and practices that cannot demonstrate they understood their regulatory obligations will be in a difficult position.
The defence is documentation. If you can show that you verified TGA registration, tracked versions, monitored performance, and responded to issues, you have a defensible position. If your only record is a purchase order from three years ago, you do not.
The TGA's evolving approach is not about restricting AI adoption. It is about ensuring that AI tools used in clinical practice meet the same safety and efficacy standards that apply to any other medical device. For radiology practices, the message is straightforward: know what you are using, confirm it is registered, track its performance, and document everything.
Check your AI tools against TGA requirements
Take the free AI Governance Readiness Assessment and see where your practice stands.