The rapid evolution of artificial intelligence and the growing use of software in healthcare delivery are transforming the medtech landscape, offering unprecedented opportunities for innovation in patient care.
The Therapeutic Goods Administration (TGA) recently published its Clarifying and Strengthening the Regulation of Medical Device Software including Artificial Intelligence (AI) Report, which reviews the TGA’s position on the regulation of software that meets the definition of a medical device (software as a medical device, or SAMD) and of AI. The regulation of SAMD and AI will become increasingly relevant for medtech developers, privacy professionals, and anyone navigating the intersection of technology and the health sector in Australia.
We previously covered the consultation that preceded this review in our post, It’s alive! Safe and responsible AI in therapeutic goods.
A technology-agnostic, risk-based approach
The TGA’s regulatory framework remains fundamentally technology-agnostic and risk-based. Regulation is not tied to specific technologies or types of AI, but rather to the risks posed by a device (or software) throughout its lifecycle. This approach is designed to be flexible and responsive, accommodating rapid innovation without the need for constant regulatory or legislative overhaul.
This places an onus on developers and sponsors to proactively identify, mitigate, and monitor risks.
Key legislative and regulatory findings
The 2025 review confirms that the current legislative framework is largely fit for purpose, but highlights several areas for potential refinement:

Review of software exclusions for consumer health
The TGA is re-examining the list of software products that are excluded from therapeutic goods regulation, particularly in light of the increasing use of AI. Exclusions for digital mental health tools, consumer health products, and certain laboratory information management systems are under urgent review amid concerns that, as risk profiles change, these exclusions may no longer be appropriate. The TGA is considering whether to remove or amend these exclusions, or to introduce new exemptions with specific conditions.
As AI and advanced analytics become increasingly embedded in consumer-facing technologies — such as fitness trackers, smartwatches, and mobile health apps — the distinction between general wellness products and regulated medical devices is becoming less clear.
This means that some apps and wearables previously considered low-risk may soon be subject to stricter regulatory oversight, including requirements for transparency, performance validation, and post-market surveillance. Developers and providers of these technologies will need to continually assess whether their products meet the definition of a medical device (so as to be regulated as a therapeutic good), ensure compliance with evolving TGA guidance, and be prepared for increased scrutiny around claims, data use, and user safety.
Guidance for adaptive AI, open datasets, and performance monitoring
Emerging applications of new technology, such as adaptive AI (which can change functionality post-deployment) and the use of open datasets or software of unknown provenance, are at the forefront of regulatory concern. The TGA acknowledges that current processes are based on static models, and that adaptive systems may require new approaches to change control, validation, and ongoing monitoring. Guidance is being developed to address these challenges, likely with a focus on:
- defining what constitutes a ‘significant change’ in adaptive AI, and how those changes should be managed and reported
- providing clarity on the use and validation of open datasets and software of unknown provenance, referencing applicable international standards (such as ISO/IEC 5338:2023 and IEC 62304) and
- enhancing post-market performance monitoring, including real-world data collection and mandatory adverse event reporting.
Transparency and user information
Stakeholders, including clinicians and consumers, are calling for greater transparency about medical device software and AI. This includes clear labelling, information about datasets used in training models, and in-app notifications about risks and updates.
The TGA is reviewing advertising provisions and considering modifications that could be made to the Australian Register of Therapeutic Goods (ARTG) to provide more accessible information about approved devices, their intended use, and AI components. The TGA is also considering the introduction of Unique Device Identification (UDI) systems that include software and AI-related details.
International harmonisation
Australia’s regulatory approach is closely aligned with international standards and practices (which is essential given most medical devices supplied locally are imported and certified overseas). The TGA is actively engaged with global regulators, including the International Medical Device Regulators Forum (IMDRF), to promote harmonisation, reduce regulatory burden, and facilitate timely access to innovative devices. The TGA is also monitoring international developments such as the EU AI Act and the US FDA’s approach to AI-enabled medical devices.
Implications for medtech developers and privacy professionals
For emerging medtech companies, these updates underscore the importance of:
- understanding whether software or an app meets the definition of a ‘medical device’ for the purpose of regulation by the TGA
- clarifying the roles and responsibilities of developers, deployers, and sponsors
- ensuring robust risk management, data governance, and performance monitoring processes are in place, including compliance with relevant international standards
- preparing for increased scrutiny around transparency, user information, and post-market surveillance and
- staying informed about evolving guidance, particularly for adaptive AI, the use of open datasets, and changes to software exclusions.
Looking ahead
The TGA’s 2025 review signals a proactive and consultative approach to regulating medical device software and AI. While the core framework remains stable, we expect further targeted consultations and guidance will follow, particularly in areas where technology is outpacing regulation. Medtech innovators should engage early with the TGA, seek expert advice on compliance, and prioritise transparency and user safety in their product development.
As the regulatory landscape continues to evolve, privacy, IP, and health law professionals will play a critical role in helping companies navigate these complexities — ensuring that innovation in healthcare is both safe and responsible.
