As more medical device manufacturers integrate software and artificial intelligence (AI) components into their solutions, the healthcare AI market is booming: it is projected to grow from $56.01 billion in 2026 to $1,033.27 billion by 2034, a staggering CAGR of 43.96%.
Digital and AI-enabled software offer unprecedented potential to improve diagnostics, treatment accuracy, operational efficiency and patient outcomes.
Advances in AI are reshaping healthcare, with examples including:
- precision oncology that aligns treatments with tumour genomic profiles
- intelligent glucose monitoring systems that anticipate hypoglycaemia and adjust insulin delivery
- adaptive rehabilitation technologies that tailor therapy to individual recovery.
The potential impact on patients’ lives is substantial. But so is the risk: misdiagnoses, incorrect dosages or hacked drug delivery systems pose a tangible danger to patient health.
Ruaidhrí Primrose, Director at Firefinch Software, and Jonathan Ripley, Managing Director at IMed Group, take up the story.

Although regulators are focusing on improving patient safety across the board, the application of AI to the highly regulated world of healthcare is still in its infancy; the EU AI Act came into force on 1 August 2024 and, even then, with a phased implementation.
A further piece of legislation, the EU Data Act (in force since September 2025), is similarly new.
Although the UK, US and EU all put patient safety at the top of their concerns, their pathways, classifications and evidence requirements diverge.
Manufacturers thus need to navigate diverging UK, EU and US regulations, assess their team’s skills, manage data quality and cybersecurity from the outset and much more when dealing with AI applied to healthcare.
Does the integration make this a Software as a Medical Device (SaMD)?
The first step to navigating this complexity is to determine whether the software or AI features integrated into the device make it an SaMD.
Most cases are not clear-cut, and it can be difficult to discern whether a device that tracks sleep or heartbeats is simply a lifestyle app or a medical device.

One useful set of guidelines, regardless of the geography in which the device will be launched, has been provided by the UK Medicines and Healthcare products Regulatory Agency (MHRA), which regulates medicines, medical devices and blood components for transfusion in the UK.
This guidance is a useful sounding board, offering an illustrative starting point for ascertaining whether a device is an SaMD.
A model offered by the MHRA
The MHRA’s new framework to assess whether a digital mental health technology (DMHT) qualifies as a medical device takes two key elements into consideration: intended purpose and level of functionality.
Intended purpose requires evaluating what the manufacturer claims the tool is designed to do. If the software explicitly states that it diagnoses, treats, prevents or monitors a medical condition, it is more likely to fall under medical device regulations.
A thorough analysis of labelling, instructions for use, promotional materials and technical documentation is required.
This analysis is not foolproof, however, and a digital device may have a medical purpose but still be excluded from regulation if its functional impact is low.
This means that the tool does not provide a clinical effect or influence patient care decisions.

Once it has been determined that the device is an SaMD, several additional development and regulatory requirements will need to be met.
Data is part of the SaMD
The data that an AI-powered device is trained on is a part of the device.
As new data is typically required to fine-tune a model, an additional data set could mean that the model needs to go back through validation and verification, with the technical file also requiring updates: a lengthy and complex process.
Including predetermined change control plans (PCCPs) in the initial files, however, can help to speed up the process of including new data sets.
Similarly, in conventional software, traceability focuses on who developed a feature, who reviewed it and which requirements it links back to.
With AI, teams must track where the data, rather than the feature, originated and how it was quality checked.
The larger and more diverse the data sets involved, the more complex the task.
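To make the idea of data-level traceability concrete, the sketch below shows one way a team might record the provenance of each training data set alongside its quality checks and linked requirements. This is a minimal illustration, not a prescribed format: the field names, requirement IDs and data set details are all assumptions invented for the example, and a real QMS would map such records onto its own document templates.

```python
from dataclasses import dataclass, field
import hashlib
import json

@dataclass
class DataSetRecord:
    """Illustrative provenance record for one training data set.

    All field names are assumptions for this sketch, not a standard schema.
    """
    name: str
    source: str                                     # where the data originated
    version: str                                    # data set version, distinct from model version
    qc_checks: list = field(default_factory=list)   # quality checks applied to the data
    linked_requirements: list = field(default_factory=list)

    def fingerprint(self) -> str:
        """Stable hash of the record contents, usable in an audit trail."""
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

# Hypothetical example entry
record = DataSetRecord(
    name="cgm_training_2025_q1",
    source="Hospital A export, anonymised",
    version="1.2.0",
    qc_checks=["de-identification verified", "label audit sampled at 5%"],
    linked_requirements=["REQ-014", "REQ-022"],
)
print(record.fingerprint()[:12])
```

Because the fingerprint changes whenever any field changes, such records also make it easy to detect that a model was retrained on a data set that has not yet been through the documented checks.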
SOUP as part of the device
The use of software of unknown provenance (SOUP) is common practice during development; it avoids pointlessly rebuilding something that already exists.
Given the risks entailed in the medical sector, however, every piece of SOUP must be assessed more thoroughly.
Putting an effective quality management system (QMS) in place for an SaMD also means meeting cybersecurity requirements.
Teams must maintain a clear record of every external software component they use, why it was chosen and how it links back to specific software requirements.
Keeping a copy of each SOUP element ensures manufacturers can meet obligations under IEC 62304.
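The record-keeping described above can be approximated by a simple SOUP register with a completeness check. The sketch below assumes a plain list-of-dicts register; the field names are illustrative and are not taken from IEC 62304 itself, and the component entries are hypothetical examples.

```python
# Minimal sketch of a SOUP register completeness check.
# Field names are assumptions for illustration, not IEC 62304 terminology.
REQUIRED_FIELDS = ("component", "version", "rationale", "requirement_ids", "archived_copy")

def incomplete_entries(register):
    """Return the names of SOUP entries missing any required field."""
    return [
        entry.get("component", "<unnamed>")
        for entry in register
        if any(not entry.get(f) for f in REQUIRED_FIELDS)
    ]

# Hypothetical register: one complete entry, one missing its rationale,
# requirement links and archived copy.
soup_register = [
    {"component": "numpy", "version": "1.26.4",
     "rationale": "numerical core for inference pipeline",
     "requirement_ids": ["REQ-031"], "archived_copy": "vault/numpy-1.26.4.tar.gz"},
    {"component": "fastapi", "version": "0.110.0",
     "rationale": "", "requirement_ids": [], "archived_copy": ""},
]

print(incomplete_entries(soup_register))
```

Running a check like this in continuous integration is one way to stop an undocumented dependency from drifting into a release build unnoticed.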
Where to launch an SaMD?
Given the complexities involved with SaMD compliance, choosing where to launch a new or newly integrated device can be critical to commercial success.
Lack of awareness of the complex regulations can mean delays and higher overall costs.
The US currently provides greater predictability of timelines and costs than other markets; in addition, FDA guidance is available on positioning products as general wellness devices or as clinical decision support software.
Manufacturers should be aware that enforcement discretion can be overturned by the FDA.

In the EU, the AI Act classifies AI-enabled SaMDs as “high risk” and will roll out the related rules in August 2026 and August 2027.
However, the Commission recently published COM(2025) 1023, which includes a proposal to amend classification rule 1.
If the proposal succeeds, more SaMDs will fall within Class I and not require the involvement of a Notified Body, thereby making the EU route more attractive.
In the UK, more SaMD solutions are currently classified as low risk (Class I) compared with the EU.
However, with new regulatory requirements set to take effect in 2026 and recent MHRA guidance highlighting increased oversight — particularly for technologies such as ambient scribes — the regulatory landscape is tightening.
In short, compliance should be built into the SaMD by design and not be an afterthought.
This will ensure easier access into the chosen launch market and clear pathways to update and safeguard the tool’s digital elements.
Creating documentation as part of a consistent, ongoing compliance management system is key to ensuring the business is prepared for increasing scrutiny as technology continues to evolve at a faster pace than safety guidelines.
