FDA Oversight of AI and Machine-Learning Medical Devices
The application of artificial intelligence (AI) and machine learning (ML) in medical device software is moving at breakneck speed. Money is flowing into AI / ML start-ups, and the technology holds huge promise for its ability to predict, diagnose, and manage patient health conditions. Still, in the race to be first to market, it is easy for start-ups to overlook the ground rules FDA has established for medical device regulation. For many, FDA’s detailed design control and traceability requirements do not align with a desire to “fail faster” through quick iteration and market testing. In this article, we break down what FDA expects from software as a medical device (SaMD) developers as of December 2021, as well as what to expect in the future.
How FDA Regulates AI / ML SaMD Now, and Its Plans for the Future
Artificial intelligence represents one of the biggest challenges to FDA’s current regulatory framework, which is built around approving medical devices that are fixed in design and change infrequently. That is certainly not the case with AI- and ML-powered medical devices. Recognizing that AI is not a fad, FDA has been working out how to adapt to this tectonic shift in technology. FDA and other regulators have not been sitting still: hundreds of AI-powered devices have successfully navigated FDA’s 510(k) or De Novo pathways. Still, those devices on the market rely on “locked” algorithms, which do not fully unleash the potential of AI to learn and adapt.
If you have read this far, we are assuming that you are already familiar with basic FDA regulations but want some additional specifics as they relate to your technology. Here are some documents you will definitely want to download and study.
Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD)
The ability to learn from new data is, of course, the real promise of AI and machine learning, so FDA has been working on how to regulate products that are in a constant state of evolution. This means moving from the current predicate device review model to a total lifecycle-based regulatory framework. This proposed framework was first outlined by FDA in April 2019. An important aspect of this document is a plan to implement a predetermined change control plan to address two types of anticipated modifications to AI / ML SaMD. Here is what FDA has to say about it:
- SaMD Pre-Specifications (SPS): An SaMD manufacturer’s anticipated modifications to “performance” or “inputs,” or changes related to the “intended use” of AI / ML-based software. These are the types of changes the manufacturer plans to achieve when the SaMD is in use. The SPS draws a “region of potential changes” around the initial specifications and labeling of the original device. This is what the manufacturer intends the algorithm to become as it learns.
- Algorithm Change Protocol (ACP): Specific methods that a manufacturer has in place to achieve and appropriately control the risks of the anticipated types of modifications delineated in the SPS. The ACP is a step-by-step delineation of the data and procedures to be followed so that the modification achieves its goals, and the device remains safe and effective after the modification. This is how the algorithm will learn and change while remaining safe and effective.
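To make the SPS / ACP pairing more concrete for developers, here is a minimal sketch of how a predetermined change control plan might be modeled in code. This is purely our own illustration, assuming a simple list-based "region of potential changes" — the class and field names are hypothetical, not an FDA-prescribed schema.

```python
from dataclasses import dataclass


@dataclass
class PreSpecifications:
    """SPS: the 'region of potential changes' the manufacturer anticipates."""
    performance_changes: list[str]   # e.g. anticipated accuracy improvements
    input_changes: list[str]         # e.g. support for a new data source
    intended_use_changes: list[str]  # changes related to intended use


@dataclass
class AlgorithmChangeProtocol:
    """ACP: how each anticipated change is achieved and risk-controlled."""
    data_management: str         # how retraining data are collected and curated
    retraining_procedure: str    # step-by-step modification procedure
    performance_evaluation: str  # acceptance criteria the updated model must meet
    update_rollout: str          # how the change is communicated and deployed


@dataclass
class PredeterminedChangeControlPlan:
    sps: PreSpecifications
    acp: AlgorithmChangeProtocol

    def change_is_anticipated(self, change: str) -> bool:
        """A change outside the SPS 'region' falls back to normal FDA review."""
        anticipated = (self.sps.performance_changes
                       + self.sps.input_changes
                       + self.sps.intended_use_changes)
        return change in anticipated
```

The useful mental model: the SPS defines *what* the algorithm is allowed to become, and the ACP defines *how* it gets there safely; anything outside that envelope is a candidate for a new submission.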
This framework document has a lot more information about how FDA plans to regulate AI / ML in the future. Most importantly, it provides real-life examples of algorithm changes that do (and do not) require additional FDA review or resubmission. Add it to your must-read list.
Speaking of changes… for devices already on the market, FDA has stated that certain software modifications may trigger a new premarket submission: if the AI / ML software modification significantly affects device performance, safety, or effectiveness; if it changes the device’s intended use; or if it introduces a major change to the SaMD algorithm. The proposed framework provides examples specific to AI devices, but FDA’s detailed 2017 guidance on when a software change requires a new 510(k) should be your definitive guide.
Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan
After receiving more than a few earfuls of feedback on the proposed regulatory framework, FDA issued an action plan in January 2021. It is a good overview of current FDA thinking on how the agency plans to implement the framework and what needs to be done to make it happen. FDA notes that an update to the proposed regulatory framework is on the way, along with a draft guidance on the predetermined change control plan.
Good Machine Learning Practice for Medical Device Development: Guiding Principles
Released in October 2021, this joint document put out by the US, Canadian, and UK regulators outlines 10 guiding principles every AI / ML developer should follow. It is brief but useful.
Content of Premarket Submissions for Device Software Functions (Draft Guidance)
Believe it or not, the last time FDA updated its guidance on software for medical devices was 2005, well before the iPhone existed! This new software draft guidance — issued in November 2021 — is definitely a must-read if you are developing AI / ML technology with the intent to seek 510(k), De Novo, or Premarket Approval (PMA).
Digital Health Software Precertification (Pre-Cert) Pilot Program
FDA smartly recognizes that it will never be able to keep pace with advancements in medical technology, and it certainly does not want to be the one to impede the adoption of new technology that could have a positive impact on patient safety or outcomes. With that in mind, FDA created this pilot program, which focuses on the software developer’s practices rather than the software itself. The pilot is expected to guide FDA’s future regulatory framework for AI / ML devices.
The “Locked” Design Conundrum
AI and ML technologies are regulated by FDA as Software as a Medical Device (SaMD). However, unlike traditional fixed-code SaMD, the great promise of AI and ML lies in their ability to learn from new data. By its very nature, AI causes heartburn for regulators because, unlike typical medical devices, (1) it is adaptive technology in which the algorithm learns from new input data rather than from a programmer improving its code; and (2) how AI arrives at a conclusion is a “black box” to physicians. This second point, the lack of transparency it implies, is a major trust hurdle for physicians and clinicians.
The core issue is that FDA’s regulatory construct revolves around approving clearly defined versions of devices. The fact that AI learns and adapts in significant ways presents a conundrum for regulators.
Given that AI can learn on the fly, a core focus for FDA is how to protect patient safety. While the AI / ML devices approved to date are based on “locked” algorithms, these devices will need to utilize adaptive learning to reach their full potential. This raises questions such as:
- Will the next generation of AI or machine learning algorithms always be safer and more effective than the previous version?
- How can inherent algorithmic data bias be measured, prevented, or diminished? Are the datasets used to develop, test, and validate the AI inclusive of diverse populations?
- Is there anything a patient or user can do that would interfere with the algorithm?
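The locked-versus-adaptive distinction at the heart of these questions can be shown in a few lines of code. This is a deliberately toy sketch of our own: the threshold "model" stands in for a real ML algorithm, and all names are hypothetical.

```python
class LockedModel:
    """'Locked' algorithm: frozen at clearance, so the same input
    always yields the same output. This is what FDA has cleared so far."""

    def __init__(self, threshold: float):
        self.threshold = threshold  # fixed when the device is cleared

    def predict(self, value: float) -> bool:
        return value > self.threshold


class AdaptiveModel:
    """Adaptive algorithm: updates its own parameter from field data.
    Overseeing this behavior is the goal of FDA's proposed
    total-lifecycle framework."""

    def __init__(self, threshold: float):
        self.threshold = threshold

    def predict(self, value: float) -> bool:
        result = value > self.threshold
        # Naive online update: drift the threshold toward observed inputs.
        # In a real device, this step is exactly what the ACP must control.
        self.threshold = 0.99 * self.threshold + 0.01 * value
        return result
```

Run a locked model a thousand times on the same input and it behaves identically; run the adaptive one and its decision boundary has quietly moved — which is precisely why a snapshot-style review cannot fully cover it.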
What You Need to Know About Change Control for SaMD
As mentioned earlier, FDA’s proposed regulatory framework provides some excellent examples of algorithm changes that may or may not require an FDA submission. But that is only what has been proposed. As you may know, the FDA Quality System Regulation (21 CFR Part 820) is what medical device manufacturers (including software companies) must follow today. Within it are a few areas you need to become very familiar with:
21 CFR Part 820.30 (Design changes)
The reason your compliance with this section matters (a lot) right from the outset is that when it comes time to compile and submit your application for a 510(k) submission, a lack of design control procedures and associated records can result in FDA issuing a Refuse to Accept (RTA) letter. Believe me, that is a letter nobody wants to bring to their boss. If you have not adequately tracked and documented all aspects of code and algorithm updates, you will be faced with the very unpleasant task of recreating the documentation trail from day one.
21 CFR Part 820.70 (Production and process changes)
The language in this section seems tailored to physical devices, but it most certainly also applies to SaMD. One noteworthy section is 820.70(b), which states, “Each manufacturer shall establish and maintain procedures for changes to a specification, method, process, or procedure. Such changes shall be verified or where appropriate validated according to 820.75, before implementation and these activities shall be documented. Changes shall be approved in accordance with 820.40.”
So, what does this mean from a practical standpoint? It means you need to document everything you do from the outset. The process generally looks like this:
1 – Identify a need for a change
2 – Justify the proposed change
3 – Review the proposed change internally
4 – Finalize the change by securing management approvals
5 – Document all steps above
6 – Communicate the change to relevant parties
7 – Train employees affected by the change
8 – Implement the change
9 – Evaluate the change and its effects
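The nine steps above can be sketched as a simple ordered state machine that refuses to let a change skip ahead. The step names and the `ChangeRecord` class are our own illustration, not a required record format.

```python
# Ordered steps of the change-control flow described above.
STEPS = [
    "identify", "justify", "review", "approve", "document",
    "communicate", "train", "implement", "evaluate",
]


class ChangeRecord:
    """Tracks one proposed change through the nine-step process, in order."""

    def __init__(self, description: str):
        self.description = description
        self.completed: list[str] = []

    def complete(self, step: str) -> None:
        expected = STEPS[len(self.completed)]
        if step != expected:
            # Skipping a step is exactly the gap an FDA auditor will flag.
            raise ValueError(f"expected step '{expected}', got '{step}'")
        self.completed.append(step)

    @property
    def closed(self) -> bool:
        return len(self.completed) == len(STEPS)
```

Whether you enforce this in software or on paper, the point is the same: every change carries a complete, ordered, documented trail from identification through post-implementation evaluation.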
Does that sound like overkill? Welcome to the world of regulated products. While the regulatory framework for managing design controls is likely to change, it will take some time for FDA to figure out how to best manage that process for adaptive AI products. For now, document, document, document! Being proactive now can help you avoid a very unpleasant written reply from FDA after you submit your application for 510(k) clearance. For more insight, read this article on medical device change control best practices.
How Europe is Planning to Regulate Medical Device AI
FDA is not the only regulator trying to figure out how to address AI. In April 2021, EU regulators released their proposal for an Artificial Intelligence Act (AIA), a regulation applicable to all AI-driven software (not just medical devices). The 53,000-word tome would apply on top of the existing requirements imposed by the Medical Device Regulation (MDR 2017/745) and the In Vitro Diagnostic Regulation (IVDR 2017/746). As proposed, the AIA would create parallel technical documentation and vigilance reporting requirements. Team-NB (the European Association of Medical Devices of Notified Bodies) is pushing back against that idea, citing the redundancy and confusion it would create. Other regulators outside the EU and US are also working on proposed regulations.
Other Sources of Valuable Information
FDA works with numerous medical-related subgroups of the Institute of Electrical and Electronics Engineers (IEEE), and you will want to check out the range of projects they are working on, including the one related to the quality management of medical device AI datasets. Of course, we would be remiss if we did not mention that the International Organization for Standardization (ISO) is working on a wide variety of new standards and related documents pertaining to AI. You can peruse current and proposed ISO AI standards here.
Building Trust Through Transparency
These are early days for AI- and ML-powered devices. Physicians have not yet built up enough trust that the AI black box will produce favorable outcomes for their patients. Until that level of transparency exists, many agree that clinical validation should be required for all AI-powered devices before they come to market. Physicians want to know how the algorithm learns, what decisions it makes, what the output means, and how to discern when the algorithm’s results are just plain wrong. Of course, building trust in AI also depends on the quality of the data used to train the algorithm: if the data were collected only from certain populations or in specific environments, outcomes can be badly skewed. AI is only as smart as the data that feeds it.
What Is Next for How the World Regulates Medical Device Artificial Intelligence?
Regulators’ mission of protecting patient safety will need to be balanced with the obvious benefits associated with letting AI reach its full potential. Figuring out how to embrace the adaptive nature of AI- and ML-powered devices while not hampering the advancement of the technology will be a challenge and will require compromise on both sides. One thing is certain: FDA and other regulatory agencies will need to fundamentally change how they approach device approvals and manage change control.
Want to Learn More?
In this article, we have just scratched the surface of regulatory requirements related to medical device software. If you want to take your understanding of FDA and other requirements to the next level, consider our training courses on medical device software development, verification, and validation and / or cybersecurity. Our team is also available to help with cybersecurity and quality management system (QMS) compliance, or we can even provide short-term staff augmentation to help you get ready for your 510(k) submission.