
The EU AI Act and Its Impact on Medical Devices

Artificial Intelligence (AI) is being implemented in medical devices at a fast pace. The US FDA has authorized close to 1,000 devices with AI integration, many of them for radiology and other imaging uses.

Increasingly, however, devices are integrating AI into other uses as well, such as cardiovascular or surgical applications.

These applications illustrate how AI is being integrated into medical devices to improve diagnostic precision, streamline workflows, and ultimately enhance patient care. While this is great news for clinicians and patients, it complicates the regulatory landscape for medical device manufacturers.

Device manufacturers selling in Europe are familiar with the EU Medical Device Regulation (EU MDR 2017/745) and the EU In Vitro Diagnostic Device Regulation (EU IVDR 2017/746). Both regulations have increased the technical documentation requirements and quality management system (QMS) processes needed to obtain and maintain CE marking.

Add to that the new AI Act (Regulation (EU) 2024/1689), which entered into force in August 2024. Medical device manufacturers with AI embedded in their products must now comply with both frameworks. The EU AI Act intersects significantly with the EU MDR and EU IVDR, which already impose rigorous requirements on medical devices, including those incorporating AI. The AI Act introduces additional layers of compliance, particularly for high-risk AI systems used in medical devices. Before we get ahead of ourselves, let’s review the definition of an AI system from Article 3 of the EU AI Act:

A machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.

Under EU MDR classification Rule 11, software intended to provide information used for diagnostic or therapeutic decisions is classified at least Class IIa. So AI-enabled devices are typically higher risk under the EU MDR, and those that require Notified Body conformity assessment qualify as high-risk AI systems under the AI Act. Do these regulations align or contradict each other? Let’s take a look.

High-Level Overview of the EU AI Act

The EU AI Act was enacted to ensure the safe and ethical deployment of AI technologies. The Act categorizes AI systems into four risk levels: unacceptable, high, limited, and minimal. Many medical applications are considered high-risk AI systems and are therefore subject to stringent requirements covering data quality, transparency, human oversight, and accuracy. Like the EU MDR/IVDR, the AI Act’s legal framework is written to allow adaptation over time, fostering innovation while mitigating risks and ensuring that AI systems remain under human control. The AI Act places requirements on manufacturers of AI-enabled medical devices, including:

  • Data governance and data management requirements for training and testing data sets
  • New record-keeping requirements, including the automatic recording of events (logs) over the system’s lifetime
  • Transparent design requirements so deployers (users) can interpret the output and use it appropriately
  • Human oversight design requirements
  • Accuracy and cybersecurity requirements
  • Risk management systems
  • Quality management systems
  • Labeling that provides information on the functioning of the AI system as well as its operation and maintenance
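The record-keeping requirement above is, in practice, an engineering task: the device software must automatically capture events over its lifetime. Below is a minimal illustrative sketch of such an audit log; the schema and field names are hypothetical assumptions, since the AI Act requires automatic event logging but does not prescribe a format.

```python
import json
import time
import uuid

def make_event(event_type, model_version, payload):
    """Build one audit-log entry for an AI device event.

    Field names are illustrative, not mandated by the AI Act.
    """
    return {
        "event_id": str(uuid.uuid4()),      # unique identifier for traceability
        "timestamp": time.time(),           # when the event occurred
        "event_type": event_type,           # e.g. "inference", "human_override"
        "model_version": model_version,     # ties the event to a model release
        "payload": payload,                 # inputs/outputs relevant to the event
    }

class EventLog:
    """Append-only, line-delimited JSON log kept over the system's lifetime."""

    def __init__(self, path):
        self.path = path

    def record(self, event):
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(event) + "\n")

# Example: log an AI inference and a clinician override of its output
log = EventLog("device_audit.log")
log.record(make_event("inference", "1.4.2", {"finding": "nodule", "score": 0.91}))
log.record(make_event("human_override", "1.4.2", {"accepted": False}))
```

An append-only log with versioned, timestamped entries also supports the human oversight and postmarket surveillance expectations discussed below, since overrides and anomalies become reviewable records rather than lost events.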

Integrating the EU AI Act Requirements with the EU MDR/IVDR Requirements

When we review the list of AI requirements above, we can see that many of these fit into our existing processes of medical device risk management, design and development processes, and postmarket surveillance systems. For example:

  • The transparency, human oversight, labeling, and accuracy and cybersecurity requirements become new design inputs.
  • A postmarket surveillance system can incorporate the record-keeping requirement over the system’s lifetime. The AI Act also adds a reporting requirement: serious incidents, including infringements of fundamental rights, must be communicated to the competent authorities.
  • The QMS requirements in ISO 13485 are very similar to those in ISO/IEC 42001 (AI management systems). For medical device manufacturers, ISO 13485 takes precedence, but a crosswalk of requirements between the two standards is a great idea.
  • And finally, medical device manufacturers should have a robust risk management system in place already. They just need to add the safety and compliance requirements associated with the AI Act into their risk analysis and risk control tools.
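A crosswalk like the one mentioned above can start as a simple mapping table that also doubles as a gap checklist. The sketch below pairs a few high-level ISO 13485 process areas with broadly corresponding ISO/IEC 42001 clauses; the pairings are simplified illustrations for planning purposes, not an official ISO mapping.

```python
# Illustrative QMS crosswalk; clause pairings are a simplified sketch,
# not an official mapping between the two standards.
crosswalk = [
    # (process area,           ISO 13485 reference,  ISO/IEC 42001 reference)
    ("Risk management",        "Clause 7.1",         "Clauses 6.1 / 8"),
    ("Design and development", "Clause 7.3",         "Clause 8 (operation)"),
    ("Document control",       "Clause 4.2",         "Clause 7.5"),
    ("Improvement / CAPA",     "Clause 8.5",         "Clause 10"),
]

def gaps(crosswalk, implemented_areas):
    """Return the process areas that still need ISO/IEC 42001 coverage."""
    return [area for area, _, _ in crosswalk if area not in implemented_areas]

# Example: two areas already extended for AI; two remain open
print(gaps(crosswalk, {"Risk management", "Document control"}))
# → ['Design and development', 'Improvement / CAPA']
```

Kept in a spreadsheet or a small script like this, the crosswalk gives auditors a single artifact showing where each AI Act expectation lands in the existing QMS.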

The integration of AI Act requirements with MDR and IVDR means that manufacturers will need to harmonize their compliance strategies. This could involve updating existing technical documentation to include AI-specific information and ensuring that AI components meet both sets of regulatory standards. While this may increase the regulatory burden, it also ensures that AI-integrated medical devices are safe, effective, and trustworthy.

Next Steps / Call to Action

As the EU AI Act comes into force, stakeholders in the medical device industry must take proactive steps to ensure compliance. Here are some key actions to consider:

  • Stay informed: Keep abreast of the latest developments and guidelines related to the AI Act. The European Commission and relevant industry bodies will likely issue further guidance to clarify compliance requirements.
  • Assess impact: Conduct a thorough assessment of how the AI Act affects your products and operations. Identify which AI systems fall under the high-risk category and determine the necessary steps for compliance.
  • Enhance capabilities: Invest in training and resources to build expertise in AI technologies within your organization. This includes understanding the technical and regulatory aspects of AI systems and ensuring that your team is equipped to handle the new requirements.
  • Update documentation: Ensure that all technical documentation is up to date and includes AI-specific information as required by the AI Act. This will facilitate conformity assessment and demonstrate compliance with both the AI Act and MDR/IVDR.
  • Upgrade the QMS: Review and improve the key QMS processes, such as postmarket surveillance, design and development, risk management, data governance, and human oversight mechanisms. Create a quality plan with actions, responsibilities, and timelines to demonstrate a proactive approach.
  • Review your Notified Body designation: Check on whether your Notified Body possesses expertise in AI technologies to effectively evaluate high-risk AI systems. The AI Act mandates that Notified Bodies be independent, maintain confidentiality, and have robust quality management and cybersecurity measures in place. This requirement will likely necessitate additional training and resources, potentially leading to longer assessment times and higher costs for manufacturers.

By taking these steps, manufacturers can navigate the complex regulatory landscape and ensure that their AI-integrated medical devices meet the highest standards of safety and efficacy. The EU AI Act represents a significant shift in how AI technologies are regulated, but with careful planning and proactive measures, the medical device industry can continue to innovate while ensuring patient safety and trust.

ELIQUENT Life Sciences can help you integrate the requirements of the EU AI Act into your existing QMS and technical documentation. Contact us at info@eliquent.com for more information.

