Software in Medical Devices, by MD101 Consulting



AIB 2025-1 - MDCG 2025-6 - Welcome to the AI Act, MD manufacturer

The Artificial Intelligence Bureau (AIB), in charge of some regulatory aspects of the AI Act, and the MDCG published a joint guidance document in June 2025. This guidance is an FAQ; it will be updated as new questions arise.

Compared with both the Team-NB position paper and the BSI white paper, we find no contradictory information on how to apply the AI Act to medical devices.

Introduction

The introduction of this guide outlines the definitions of economic operators found in the AI Act and compares them with the MDR/IVDR economic operators.
MDR/IVDR manufacturers are AI Act providers. No surprise there.

Deployers

The concept of deployer is new; there is no equivalent in the MDR/IVDR. A user isn't necessarily a deployer.
This is understandable for a healthcare center: if it makes a Medical Device with Artificial Intelligence (MDAI) available to healthcare personnel, then it is a deployer, and the healthcare personnel are users.
However, the definition of deployer covers any natural or legal person, and to the best of my knowledge a physician is a natural person.
Following the definition of deployer, if private practitioners buy an MDAI and use it in their day-to-day practice, then they are deployers and users at the same time.

It implies they shall comply with the obligations set out in Article 26. Good luck to European and national authorities getting private practitioners to follow the AI Act obligations for deployers and maintain appropriate records!

Unless a guide is produced on the subject, trickling these obligations down onto manufacturers, pardon, providers: urging them to put in place the means, in their MDAI, in the information provided to deployers, or in their QMS, to let such natural persons meet these obligations seamlessly.

Least burdensome approach

Borrowing this expression from the FDA, this guide confirms that the least burdensome approach is fostered by the AI Act. No need for a new QMS, a new technical file, a new risk-management process, or a new PMS process.
Just incorporate the additional provisions applicable to MDAI into the existing ones.
We find this approach in several questions and answers throughout the guide. No surprise: the AI Act already says so.

High-Risk MDAI and safety component

This guide rephrases the content of AI Act Article 6, shedding new light on this article with the help of the MDCG 2019-11 guide. It adds a table showing when an MDAI is a high-risk AI system according to its MDR/IVDR regulatory class. Even though it may seem obvious to some readers, it's better to have it in black and white (well, not black and white actually; there are colors in the table!).

It's also better to have it in black and white that the risk classification of the AI Act doesn't change the MDR/IVDR regulatory classification scheme, even though this too may seem obvious to some readers.
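
To fix ideas, here is a minimal sketch of the Article 6(1) logic behind that table, under its usual reading: an MDAI is high-risk when the AI is the device itself or a safety component of it, and the device requires Notified Body conformity assessment. The class sets and function name are mine, not from the guide, and corner cases (e.g. IVDR class A sterile) are deliberately left out:

```python
# Minimal sketch of the AI Act Article 6(1) rule as the guide's table
# presents it. Class labels and function names are illustrative only.

# MDR/IVDR classes whose conformity assessment involves a Notified Body.
NB_ASSESSED_CLASSES = {
    "MDR": {"Is", "Im", "Ir", "IIa", "IIb", "III"},
    "IVDR": {"B", "C", "D"},
}

def is_high_risk_mdai(regulation: str, device_class: str,
                      ai_is_the_device: bool,
                      ai_is_safety_component: bool) -> bool:
    """High-risk per Article 6(1): the AI is the device itself or a safety
    component of it, AND the device requires third-party (Notified Body)
    conformity assessment under the MDR or IVDR."""
    requires_nb = device_class in NB_ASSESSED_CLASSES[regulation]
    return requires_nb and (ai_is_the_device or ai_is_safety_component)

# A class IIa MDR software device that is itself the AI: high-risk.
print(is_high_risk_mdai("MDR", "IIa", True, False))   # True
# A self-certified class I MDR device: not high-risk under this reading.
print(is_high_risk_mdai("MDR", "I", True, False))     # False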

However, we are no further ahead in defining "safety component":

  • Can we have an MDAI in class I MDR or class A IVDR that contains a safety component, and is thus a high-risk AI system?
  • Conversely, can we have an MDAI in class IIa or above MDR, or class B or above IVDR, where the AI isn't a safety component, and is thus a low-risk AI system?

A guide like MDCG 2019-11, or an entry in the manual on borderline classification, would be welcome.

Substantial Modification / significant change

The document says that the concept of substantial modification is an autonomous concept under the AI Act, and that the Commission will develop guidelines on the practical implementation of the provisions related to substantial modification.
Will they publish a document adopting the charts we have in MDCG 2020-3? Wait and see.

Fortunately, we have a little time before MDAI manufacturers will need this kind of guide. But contrary to hardware, software can be changed every other day, so this guide should be at the top of the pile of AIB guides.

Do it like the FDA PCCP

The guide mentions the possibility of having what the FDA calls a Predetermined Change Control Plan (PCCP): high-risk MDAI that continue to learn after being placed on the market or put into service can have a pre-determined change plan checked at the time of the conformity assessment.
Changes made according to this plan won't be considered substantial modifications.
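
As an illustration only, here is a minimal sketch of how such a plan could be structured, borrowing the three elements found in the FDA PCCP guidance (description of modifications, modification protocol, impact assessment). The field names and the example change are assumptions of mine, not a template from the AI Act or the guide:

```python
from dataclasses import dataclass, field

@dataclass
class PlannedChange:
    """One pre-determined change, described and bounded in advance."""
    description: str          # what will change, e.g. periodic retraining
    rationale: str            # why the change is anticipated
    acceptance_criteria: str  # metrics and thresholds to meet before release

@dataclass
class PredeterminedChangeControlPlan:
    """Sketch of a PCCP, following the three FDA elements: description of
    modifications, modification protocol, impact assessment."""
    planned_changes: list[PlannedChange] = field(default_factory=list)
    modification_protocol: str = ""  # how changes are implemented and V&V'd
    impact_assessment: str = ""      # benefits/risks of the planned changes

# Example: a change assessed once at conformity assessment time; executing
# it later, within the stated bounds, would not be a substantial modification.
pccp = PredeterminedChangeControlPlan(
    planned_changes=[PlannedChange(
        description="quarterly retraining on newly collected, curated data",
        rationale="maintain performance as the input population drifts",
        acceptance_criteria="sensitivity >= 0.95 and specificity >= 0.90 "
                            "on the locked test set",
    )],
    modification_protocol="retrain, re-run full V&V suite, review, release",
    impact_assessment="no new hazards; risk analysis updated at each change",
)
```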

This is really good news! There is absolutely no equivalent requirement in the MDR/IVDR. The AI Act brings us on a platter something we had been expecting for years from a skittish MDCG.

Needless to say, it would be easier for manufacturers if the fundamentals of a PCCP found in the FDA guidance were reused in the EU guidelines. Is common ground on PCCP guidance too much to ask for? Maybe the IMDRF could do something.

MDAI that continues to learn

The guide contains this:
For high-risk MDAI that continues to learn after being deployed the post-market monitoring system is key to ensuring continued performance and compliance.

Woohoo, MDAI may continue to learn! Calm down. It doesn't mean it can continue to learn in production. This has never been done up to now. Even the knowledgeable FDA didn't manage to put together a policy on MDAI continuously learning in production. How would the MDCG and the AIB come up with a magic wand and authorize this?

Continuous learning is going to be limited to MDAI in pre-production for a long time. A discrete V&V phase before releasing a version to production is, and will long remain, required as well, unless a scientific breakthrough allows continuous monitoring and validation of an MDAI in production.
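
To make the distinction concrete, here is a minimal sketch of this discrete retrain-validate-release cycle, in which the deployed model stays frozen. The metric names and thresholds are illustrative assumptions:

```python
# Sketch of a discrete V&V gate: the model is retrained offline, frozen,
# validated against a locked test set, and only deployed if it passes.
# Metric names and thresholds are illustrative, not from the guide.

RELEASE_THRESHOLDS = {"sensitivity": 0.95, "specificity": 0.90}

def validate(candidate_metrics: dict[str, float]) -> bool:
    """Discrete V&V: every release metric must meet its threshold."""
    return all(candidate_metrics.get(name, 0.0) >= threshold
               for name, threshold in RELEASE_THRESHOLDS.items())

def release_cycle(train, evaluate, deploy):
    """One retrain -> validate -> release iteration. The deployed model is
    frozen: no weight updates happen in production."""
    model = train()              # offline training on curated data
    metrics = evaluate(model)    # evaluation on the locked test set
    if validate(metrics):
        deploy(model)            # a new frozen version goes to production
    else:
        print(f"Release blocked, metrics out of bounds: {metrics}")

# Example wiring with stub functions:
release_cycle(train=lambda: "model-v2",
              evaluate=lambda m: {"sensitivity": 0.97, "specificity": 0.93},
              deploy=lambda m: print(f"deployed {m}"))
```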

Data governance

The guide carries a message on data governance similar to the one in the BSI and Team-NB white papers.
It sheds light on the correspondence between the requirements on clinical/performance data in the MDR/IVDR and the requirements on training, validation, and testing data in the AI Act. A new data management process, distinct from or adapted from the existing clinical data management process, is needed to meet AI Act Article 10 on data governance. It aims to eliminate biases in these data.

Here again, we find a message similar to the data management requirements present in the FDA guidance on AI-enabled device software functions.

The AIB guide also puts the emphasis on GDPR compliance. Thus, this data management process shall also ensure compliance with the GDPR and other EU regulations on data.
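
As a trivial illustration of one check such a data management process could run for Article 10, here is a sketch comparing the subgroup distribution of a training set with the intended patient population. The subgroups, column name, and tolerance are assumptions made up for the example:

```python
import pandas as pd

# Intended patient population, e.g. from the device's intended use and
# clinical evaluation. Subgroups and proportions are illustrative.
INTENDED_POPULATION = {"female": 0.50, "male": 0.50}
TOLERANCE = 0.10  # maximum accepted deviation per subgroup

def check_representativeness(training_data: pd.DataFrame,
                             column: str = "sex") -> list[str]:
    """Flag subgroups whose share in the training data deviates from the
    intended population by more than TOLERANCE. This is one facet of the
    Article 10 bias examination; real checks cover many more variables."""
    observed = training_data[column].value_counts(normalize=True)
    findings = []
    for subgroup, expected in INTENDED_POPULATION.items():
        actual = observed.get(subgroup, 0.0)
        if abs(actual - expected) > TOLERANCE:
            findings.append(f"{subgroup}: expected ~{expected:.0%}, "
                            f"got {actual:.0%}")
    return findings

data = pd.DataFrame({"sex": ["female"] * 20 + ["male"] * 80})
print(check_representativeness(data))
# ['female: expected ~50%, got 20%', 'male: expected ~50%, got 80%']
```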

It also mentions that CEN/CENELEC Joint Technical Committee 21 is working on developing harmonised standards on data and bias.
Such standards would be welcome! However, there is a risk that they will be one-size-fits-all for any AI system in the scope of the AI Act. Thus, either the standardisation committee will add specific clauses or notes for data used in MDAI (the optimal solution), or such clauses or notes won't be present and MDAI manufacturers will have to interpret the standards for their own case.
These standards will probably be inspired by the ISO 5259-x series, referenced by the Team-NB questionnaire on AI in MD. For example:

  • The ISO 5259-x series has clauses on data requirements (the data requirement specification, mirroring the software requirement specification; a sketch follows this list),
  • These new standards will surely have similar clauses on data requirements. They may add clauses ensuring that data requirements don't break fundamental rights (something common to any AI system, and not present in ISO 5259-x),
  • But these new standards may not have clauses specific to the scientific validity of data used in MDAI, something that we find in the IMDRF/FDA document on SaMD clinical evaluation.
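
To picture what a data requirement could look like, here is a minimal sketch of one entry of a data requirement specification, in the spirit of mirroring software requirements; the fields and the example are illustrative assumptions, not clauses from ISO 5259-x:

```python
from dataclasses import dataclass

@dataclass
class DataRequirement:
    """One entry of a data requirement specification (fields assumed)."""
    identifier: str       # unique ID, traceable like a SW requirement
    statement: str        # the requirement itself, written to be verifiable
    rationale: str        # e.g. clinical or fundamental-rights justification
    verification: str     # how conformance of the dataset is checked

req = DataRequirement(
    identifier="DR-012",
    statement="each imaging device vendor represents at least 10% "
              "of the training images",
    rationale="avoid vendor-specific bias in the trained model",
    verification="automated distribution report reviewed at each "
                 "data release",
)
print(req.identifier, "-", req.statement)
```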


Transparency

Once again, the message on transparency is similar to that of the two other documents.
Who is transparency for?

  • First and foremost, for users, so that they can use the MDAI knowing its performance and limitations, its advantages and drawbacks. This comes with the classical documentation:
    • The IFU and labelling,
    • The disclosure of residual risks,
    • And the model card, as recommended by the FDA (see the sketch after this list),
  • Also for deployers, with the same aim as users, but additionally to determine whether the use of the MDAI is appropriate for their users. This is detailed in AI Act Article 13:
    • Detailed instructions for use,
    • The technical level of these IFU for deployers may be higher than what is given to end users, especially when end users are lay persons,
  • And we can add regulatory authorities, to be able to assess MDAI safety, performance, and compliance:
    • Answers to the Team-NB checklist,
    • Recommendations in the FDA guidance on AI-enabled device software functions.
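
For illustration, here is a minimal sketch of the kind of information a model card gathers in one place for users and deployers. The fields and values are hypothetical, not a mandated template:

```python
# Sketch of a model card: a short, plain-language summary of the model for
# users and deployers. All field names and values are hypothetical.
model_card = {
    "name": "ExampleNet v2.1",  # hypothetical device model
    "intended_use": "triage of chest X-rays for suspected pneumothorax",
    "intended_users": "radiologists and emergency physicians",
    "training_data": "120k studies from 14 hospitals, 2018-2023",
    "performance": {"sensitivity": 0.96, "specificity": 0.91},
    "known_limitations": ["not validated on pediatric patients",
                          "reduced performance on portable X-rays"],
    "residual_risks": "false negatives possible; not a sole diagnostic basis",
}

for key, value in model_card.items():
    print(f"{key}: {value}")
```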


Transparency goes hand in hand with trustworthiness. Manufacturers must provide information in their documentation (transparency), in particular the IFU and accompanying documentation, so that professionals can assume their responsibility when using the MDAI, and lay users can feel confident using it.

In-house MDAI

According to Article 5 of the MDR/IVDR, healthcare centers and healthcare professionals can put into service a medical device within their organization without the intervention of a Notified Body. This also applies to MDAI. Thus, according to Article 6 of the AI Act, an in-house MDAI is not a high-risk AI system.

This is really good news for healthcare professionals, who can develop MDAI without any intervention of Notified Bodies.


Conclusion

As expected, this kind of guide won't change the world, but it brings some reassuring messages to MDAI manufacturers. The AI Act will surely be an additional burden, but not so much compared to the MDR/IVDR.


(Imagine the mood in other sectors, like education, workforce management, law enforcement, ..., which have to implement the AI Act from scratch.)


