AI/ML-Based SaMD in 2026: What Indian Medical Device Manufacturers Must Know About FDA Guidance, the EU AI Act, and Total Product Lifecycle Compliance

Author: Regulatory Affairs Team, Satori One Click Solutions LLP | Reading time: ~14 minutes


If you are an Indian medical device manufacturer with an AI-powered software product — a diagnostic imaging tool, a clinical decision support system, a remote patient monitoring application — 2026 is not a year to wait and watch.

Two of the world's most consequential regulatory changes are happening right now, simultaneously. The US FDA has fundamentally restructured how it expects manufacturers to develop, validate, and maintain AI-enabled medical devices across their entire lifecycle. And the EU AI Act — the world's first comprehensive law governing artificial intelligence — has begun its full enforcement phase for high-risk AI systems, which includes virtually every AI product intended for medical use.

At the same time, India's own CDSCO released its first-ever draft guidance on Medical Device Software in October 2025, including explicit provisions for AI and ML-based SaMD. The regulatory world your AI product must navigate in 2026 looks nothing like it did even two years ago.

This guide explains what has changed, what it means for Indian manufacturers targeting the US and EU markets, and the specific compliance steps you need to take today.


What Is SaMD — and Why Does It Matter for Indian Manufacturers?

Software as a Medical Device (SaMD) is standalone software that performs a medical purpose independently — without being embedded in a physical hardware device. Examples include:

  • AI-powered radiology tools that detect abnormalities in CT or MRI scans
  • Clinical decision support software that recommends treatment protocols
  • Electrocardiogram (ECG) analysis applications
  • Wearable-connected patient monitoring systems
  • Ophthalmology screening tools using computer vision
  • Predictive algorithms for early sepsis or deterioration detection

India has a rapidly growing SaMD sector. The government's Production Linked Incentive (PLI) Scheme for Medical Devices, the National Medical Devices Policy, and active investment in MedTech innovation at IITs have made India a credible originator — not just an importer — of medical software. Companies from Bangalore, Hyderabad, Pune, and Mumbai are building AI diagnostic tools that they want to sell in the US, Europe, Canada, and Australia.

But here is the challenge most Indian SaMD manufacturers face when they try to go global: the regulatory requirements for AI-based medical devices in the US and EU are more demanding, more nuanced, and more rapidly evolving than for any other product category. Getting them wrong delays your market entry by 12 to 24 months. Getting them right — early — gives you a significant competitive advantage over rivals who are still figuring it out.


Part 1: FDA's New Approach to AI/ML SaMD — The Total Product Lifecycle (TPLC) Framework

The Core Problem FDA Was Solving

Traditional medical device regulation was built on a straightforward model: design a device, test it, submit it for clearance, sell it. The product you submitted is the product on the market. Changes require new submissions.

AI breaks this model completely. Machine learning models can update. Algorithms can drift. A model trained on one patient population may perform differently on another. An AI that is performing well today may be performing poorly six months from now as the real-world data it processes diverges from its training data. The FDA recognised by 2019 that its "traditional paradigm was not designed for adaptive artificial intelligence and machine learning technologies."

The answer FDA developed over several years is the Total Product Lifecycle (TPLC) approach — a framework that governs not just the moment a device is cleared, but every stage of its existence from design through deployment through ongoing real-world performance.

The Key FDA Guidance Documents (2021–2026)

The FDA has built its AI/ML regulatory framework through a series of guidance documents. Indian manufacturers targeting the US market need to understand all of them:

  • October 2021 — Good Machine Learning Practice (GMLP) for Medical Device Development: Guiding Principles. This joint guidance from FDA, Health Canada, and the UK's MHRA established 10 core principles for how AI/ML medical devices should be designed, developed, and validated. It covers multi-disciplinary teams, appropriate data management, clinical relevance of reference datasets, model performance testing, and the importance of human-AI workflow design.
  • April 2023 — Predetermined Change Control Plans (PCCPs): Guiding Principles. PCCPs are one of the most important concepts for AI SaMD manufacturers to understand. A PCCP is a plan that you submit to FDA before market authorisation, describing the kinds of changes you expect to make to your AI model after launch — and the protocol you will follow to implement those changes safely — without requiring a brand-new submission for each update. This is critical for adaptive AI systems.
  • June 2024 — Transparency for Machine Learning-Enabled Medical Devices. This guidance addresses how manufacturers should communicate to users, healthcare providers, and patients about how their AI works, what its known limitations are, and how its performance has been validated.
  • January 2025 — Artificial Intelligence-Enabled Device Software Functions: Lifecycle Management and Marketing Submission Recommendations (Draft). This is the most comprehensive FDA guidance on AI SaMD to date. It applies the TPLC framework and specifies exactly what manufacturers must include in their marketing submissions. Key requirements include:
  1. A detailed model description (architecture, training methodology, intended function)
  2. Data lineage and train/test/validation splits — where your data came from and how it was used
  3. Performance tied to specific clinical claims — not just technical accuracy but clinical relevance
  4. Bias analysis and mitigation — evidence that the model performs equitably across relevant patient subgroups (gender, age, ethnicity, comorbidities)
  5. Human-AI workflow documentation — how the device fits into clinical practice, and what the human oversight mechanisms are
  6. Post-market monitoring plan — how you will track real-world performance and detect algorithmic drift
  7. Predetermined Change Control Plan (if you intend to update the model post-approval)
  • August 2025 — Final PCCP Guidance. FDA finalised its PCCP guidance in August 2025, formalising the mechanism for pre-authorised algorithm modifications. This is now the pathway Indian manufacturers should design their AI development roadmaps around from the start.
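Requirement 2 above (data lineage and splits) is easiest to satisfy when split provenance is recorded programmatically from the start rather than reconstructed at submission time. A minimal sketch follows; the field names and values are illustrative, not an FDA-prescribed schema:

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class DatasetRecord:
    """One entry in a data-lineage manifest (field names are illustrative)."""
    record_id: str
    source_site: str        # clinical site or registry the data came from
    acquisition_date: str
    split: str              # "train", "validation", or "test"

def build_manifest(records):
    """Summarise split sizes and fingerprint the exact split assignment.

    The SHA-256 digest lets a reviewer verify that the held-out test set
    was fixed before training and never silently changed afterwards.
    """
    split_counts = {}
    for r in records:
        split_counts[r.split] = split_counts.get(r.split, 0) + 1
    payload = json.dumps([asdict(r) for r in records], sort_keys=True)
    return {
        "split_counts": split_counts,
        "manifest_sha256": hashlib.sha256(payload.encode()).hexdigest(),
    }

records = [
    DatasetRecord("img-001", "Site-A", "2024-03-01", "train"),
    DatasetRecord("img-002", "Site-A", "2024-03-02", "validation"),
    DatasetRecord("img-003", "Site-B", "2024-04-10", "test"),
]
manifest = build_manifest(records)
print(manifest["split_counts"])   # {'train': 1, 'validation': 1, 'test': 1}
```

A manifest like this, versioned alongside the model, gives reviewers a concrete answer to "where did the data come from and how was it used".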

What This Means in Practice for Indian SaMD Manufacturers

As of early 2026, FDA has authorised over 1,350 AI-enabled medical devices — roughly double the number from 2022. The market is growing fast. But the compliance bar has risen just as rapidly.

For an Indian company seeking FDA 510(k) clearance or De Novo authorisation for an AI SaMD product, the January 2025 draft guidance effectively defines the new minimum standard. Your submission must demonstrate TPLC thinking — not just that your device performs well today, but that you have a disciplined system for monitoring it, updating it responsibly, and maintaining performance over time.

Three specific gaps trip up Indian SaMD submissions most frequently:

1. Training data diversity. FDA expects bias analysis across patient subgroups. If your model was trained entirely on Indian patient data (or, conversely, entirely on Western patient data), you must document this clearly and address the implications for performance in the intended US patient population. The regulatory scrutiny on algorithmic bias has increased significantly.

2. Algorithm Change Protocol (ACP). Many Indian manufacturers still treat their AI model as a static product. FDA now expects a documented plan for how changes will be managed. Companies without an ACP or PCCP in place will find post-approval model updates require completely new submissions — causing costly delays.

3. Post-market surveillance design. FDA expects you to specify your monitoring metrics before market authorisation. What is your baseline false-positive rate? How will you detect distribution shift? What will trigger a safety review? These must be pre-specified in your submission, not figured out after your product is on the market.
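The pre-specified monitoring logic described in point 3 can be written down as executable checks before launch. The sketch below uses the population stability index (PSI), one common distribution-shift metric; all thresholds shown are illustrative assumptions, not FDA-mandated values:

```python
import math

def population_stability_index(expected, observed):
    """PSI between two binned score distributions (bin proportions).

    Rule of thumb (illustrative, not a regulatory threshold): PSI > 0.2
    suggests meaningful shift between training-time data and the
    real-world inputs the deployed model is now seeing.
    """
    eps = 1e-6
    return sum(
        (o - e) * math.log((o + eps) / (e + eps))
        for e, o in zip(expected, observed)
    )

def should_trigger_review(fp_rate, psi,
                          fp_baseline=0.05, fp_margin=0.02, psi_limit=0.2):
    """Pre-specified safety-review trigger: fires if the observed
    false-positive rate exceeds baseline + margin, or if input drift
    exceeds the PSI limit. All thresholds here are illustrative."""
    return fp_rate > fp_baseline + fp_margin or psi > psi_limit

# Training-time score distribution vs. last month's production scores
train_bins = [0.25, 0.25, 0.25, 0.25]
prod_bins = [0.10, 0.20, 0.30, 0.40]
psi = population_stability_index(train_bins, prod_bins)
print(should_trigger_review(fp_rate=0.06, psi=psi))   # True (drift alarm)
```

The point is not this particular metric but that the metric, the baseline, and the trigger are all written down before authorisation, exactly as the submission must pre-specify them.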


Part 2: The EU AI Act — What Indian Exporters Must Comply With From August 2026

The World's First Comprehensive AI Law

The EU Artificial Intelligence Act (Regulation (EU) 2024/1689) entered into force on 1 August 2024 and is being phased in progressively. For medical device manufacturers, the most important deadline is August 2026, when the high-risk AI obligations become fully applicable to new AI-powered SaMD placed on the EU market.

This is not a distant horizon. If you are planning to launch an AI medical device in Europe, or if you already have CE-marked medical software in the EU, you need to be assessing and building compliance now.

How the EU AI Act Overlaps with MDR and IVDR

Before the EU AI Act, Indian manufacturers entering the European market for medical software needed to comply with the EU Medical Device Regulation (MDR 2017/745) or the In Vitro Diagnostic Regulation (IVDR 2017/746). Those requirements remain fully in force.

The EU AI Act does not replace MDR or IVDR. It adds a second, overlapping regulatory layer on top of them. Under the AI Act's Article 6(1), an AI system that is a safety component of a medical device, or is itself a medical device, and that requires third-party conformity assessment under MDR/IVDR is automatically classified as high-risk AI. In practice this captures virtually every AI-powered SaMD on the EU market, and the classification cannot be avoided by product design.

The result is dual compliance: your AI SaMD must satisfy both MDR/IVDR technical documentation requirements and the new AI Act Annex IV requirements. In practical terms, this means your Notified Body will assess your technical file against both sets of requirements, MDR/IVDR and the AI Act, typically within a single integrated conformity assessment.

The Medical Device Coordination Group (MDCG) released MDCG 2025-6 in June 2025 — its first FAQ formally clarifying how the AI Act overlaps with MDR and IVDR. If you have a Notified Body relationship for EU market access, this document is essential reading.

The Core EU AI Act Requirements for High-Risk SaMD

Under the EU AI Act, high-risk AI systems — including all medical AI — must meet the following requirements:

  • Data governance and data quality. Your training, validation, and testing datasets must be subject to appropriate data governance practices. You must demonstrate that datasets are relevant, representative, free of errors, and complete. For Indian manufacturers training on Indian patient data, you must address how this data represents the intended EU patient population.
  • Transparency and explainability. Users and healthcare providers must be able to understand how your AI reaches its outputs. "Black box" AI that cannot provide interpretable reasoning is significantly harder to approve in the EU than in other markets.
  • Human oversight. The AI Act requires that high-risk AI systems be designed to allow effective oversight by humans. Your device must include mechanisms for clinical staff to override, correct, or reject AI outputs. This must be built into the product design and documented in your technical file.
  • Robustness, accuracy, and cybersecurity. Your AI must be resilient to errors, faults, and adversarial inputs, and must maintain consistent performance throughout its lifecycle.
  • Quality Management System aligned with AI Act requirements. Your QMS (typically aligned with ISO 13485 for medical devices) must now also demonstrate data governance, bias mitigation, transparency, and cybersecurity controls — requirements that go beyond what ISO 13485 alone covers.
  • Post-market monitoring for AI-specific drift. Similar to FDA's approach, the EU AI Act requires real-world performance monitoring and bias-drift assessment as part of your ongoing PMS obligations. Annual PDF reports are not sufficient — you need live monitoring processes.
  • Registration in the EU AI database. High-risk AI systems must be registered in a new EU AI database (currently being built). This will become a standard part of market access alongside EUDAMED registration for medical devices.

Timeline for Indian Manufacturers

  • August 2024: EU AI Act entered into force
  • August 2025: GPAI (General Purpose AI) model obligations applied — relevant if your SaMD is built on a foundation model
  • August 2026: High-risk AI obligations fully applicable to NEW AI-powered SaMD entering the EU market
  • August 2027: Full compliance required for existing high-risk AI medical devices already on the EU market

If you are launching a new AI SaMD in the EU after August 2026, you must comply from day one. If you already have an AI product on the EU market, you have until August 2027 to achieve full compliance — but given the documentation work involved, that timeline is already tight.


Part 3: India's CDSCO — The New Draft Guidance on Medical Device Software

While the focus for exporters is on FDA and EU compliance, Indian manufacturers must also understand what is changing at home. In October 2025, CDSCO released its first comprehensive Draft Guidance on Medical Device Software — a 76-page document that explicitly covers AI/ML-based SaMD for the first time.

What the CDSCO Draft Guidance Changes

Prior to this guidance, India's Medical Device Rules 2017 applied to SaMD in principle, but lacked detailed software-specific direction. Developers struggled with threshold questions — does my software qualify as a medical device? How is it classified? What documentation is required? The new draft guidance answers these questions directly.

Key provisions relevant to AI/ML SaMD:

The guidance distinguishes clearly between Software in a Medical Device (SiMD) — software embedded in physical hardware like insulin pumps — and Software as a Medical Device (SaMD), which operates independently. AI-powered diagnostic tools, ECG analysis apps, and clinical decision support systems fall under SaMD.

A risk-based classification framework now applies: Class A (low risk) through Class D (highest risk). The classification is determined by two factors — the significance of the information the software provides and the severity of the healthcare condition it addresses. AI tools that directly inform diagnosis or treatment in critical clinical scenarios will typically be classified as Class C or D.
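As a rough illustration of how a two-factor framework resolves to a class, the lookup below follows the IMDRF-style pattern that risk-based SaMD frameworks use. The specific cell assignments are hypothetical; the authoritative mapping is the one printed in the CDSCO draft guidance itself:

```python
# Illustrative only: an IMDRF-style two-factor lookup. The cell
# assignments below are hypothetical, NOT the official CDSCO mapping;
# the authoritative table is the one in the draft guidance itself.
SIGNIFICANCE = ("inform", "drive", "diagnose_treat")   # of the software output
SEVERITY = ("non_serious", "serious", "critical")      # of the condition

CLASS_MATRIX = {
    ("inform", "non_serious"): "A",
    ("inform", "serious"): "B",
    ("inform", "critical"): "B",
    ("drive", "non_serious"): "B",
    ("drive", "serious"): "C",
    ("drive", "critical"): "C",
    ("diagnose_treat", "non_serious"): "B",
    ("diagnose_treat", "serious"): "C",
    ("diagnose_treat", "critical"): "D",
}

def classify(significance, severity):
    """Resolve the two risk factors to a device class."""
    if significance not in SIGNIFICANCE or severity not in SEVERITY:
        raise ValueError("unknown factor value")
    return CLASS_MATRIX[(significance, severity)]

# An AI tool that directly informs diagnosis in a critical scenario:
print(classify("diagnose_treat", "critical"))   # D
```

Whatever the final published matrix looks like, the structural point stands: both factors must be assessed and documented before classification, and classification drives everything downstream.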

For AI/ML-based SaMD, submission documentation must include an Algorithm Change Protocol (ACP) — a document describing how changes to the AI algorithm will be managed throughout the product lifecycle. This mirrors FDA's approach and reflects the CDSCO's intent to align with global digital health regulatory standards.

Class A and B SaMD are licensed by state authorities; Class C and D fall under CDSCO's Central Licensing Authority with more rigorous review requirements.

What this means for exporters: The CDSCO guidance, once finalised, creates a domestic regulatory foundation that is explicitly aligned with FDA and EU approaches. Indian manufacturers who build their AI SaMD compliance programmes to meet the higher bar of FDA and EU requirements will find that CDSCO compliance follows naturally — not the other way around. Design for the most demanding market first; domestic compliance becomes significantly easier as a result.


Part 4: The Five Compliance Actions Indian SaMD Manufacturers Must Take Now

1. Classify Your AI System Under All Three Frameworks Simultaneously

Before investing further in development or market entry planning, classify your AI product under all relevant frameworks:

  • US FDA: Determine device class (I, II, III), identify the appropriate submission pathway (510(k), De Novo, PMA), and identify predicate devices
  • EU AI Act + MDR/IVDR: Confirm your device is high-risk AI (almost certainly yes), determine your MDR risk class, identify your Notified Body requirements
  • CDSCO: Classify under the draft SaMD framework (Class A–D) and identify your licensing authority

Classification drives everything downstream — documentation requirements, timelines, costs, and strategy. Many Indian manufacturers waste 6–12 months because they begin documentation before completing a thorough regulatory classification analysis.

2. Build Your Data Governance Framework Before Training Your Model

Both FDA and the EU AI Act place extraordinary emphasis on data quality, data lineage, and bias analysis. This is an area where Indian manufacturers consistently underinvest at the beginning of a project, then pay the price during regulatory review.

Specifically, you need to document:

  • Where your training, validation, and test data came from
  • How the data was labelled, curated, and quality-controlled
  • Whether the data represents the patient population you intend to serve (demographic diversity, comorbidity range, equipment variability)
  • Your bias analysis methodology — how you evaluated and addressed performance disparities across patient subgroups

If your device will be sold in the US or EU, bias analysis across gender, age, and ethnicity is non-negotiable. Build this into your data collection and annotation plan from the start — retrofitting it after model training is extremely costly.
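At its simplest, a subgroup bias analysis means computing the same performance metrics per subgroup and examining the gaps. A minimal sketch, with illustrative subgroup labels and toy data:

```python
from collections import defaultdict

def subgroup_performance(results):
    """Per-subgroup sensitivity and specificity from labelled predictions.

    Each result is {"group": ..., "label": 0/1, "pred": 0/1}. Gaps
    between subgroups are exactly what reviewers expect you to surface,
    explain, and mitigate; the group names here are illustrative.
    """
    counts = defaultdict(lambda: {"tp": 0, "fn": 0, "tn": 0, "fp": 0})
    for r in results:
        c = counts[r["group"]]
        if r["label"] == 1:
            c["tp" if r["pred"] == 1 else "fn"] += 1
        else:
            c["tn" if r["pred"] == 0 else "fp"] += 1
    return {
        group: {
            "sensitivity": c["tp"] / max(c["tp"] + c["fn"], 1),
            "specificity": c["tn"] / max(c["tn"] + c["fp"], 1),
        }
        for group, c in counts.items()
    }

results = [
    {"group": "female", "label": 1, "pred": 1},
    {"group": "female", "label": 0, "pred": 0},
    {"group": "male",   "label": 1, "pred": 0},   # a missed positive case
    {"group": "male",   "label": 0, "pred": 0},
]
report = subgroup_performance(results)
print(report["male"]["sensitivity"])   # 0.0, a gap that must be explained
```

In a real submission the same computation runs across gender, age bands, ethnicity, and comorbidity strata, with sample sizes large enough for the per-subgroup estimates to be meaningful.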

3. Design Your Algorithm Change Protocol (ACP) Before Launch

One of the most consequential decisions you will make about your AI product is whether to design it as a locked model (no post-market learning) or an adaptive model (capable of updating). The regulatory implications are significant.

Locked models follow a more predictable approval pathway but require a new submission for any meaningful algorithm change. Adaptive models — or locked models with a PCCP in place — allow pre-authorised updates but require significantly more upfront documentation.

As of 2025, fewer than 10% of FDA-cleared AI devices had an authorised PCCP. This means most manufacturers are paying the price of new submissions every time they want to improve their model. Indian manufacturers entering the market now have the opportunity to design PCCP-readiness in from the start — giving them a long-term competitive advantage in speed and cost of product iteration.

4. Build Human Oversight Into Product Design — Not as an Afterthought

Both FDA and the EU AI Act require that high-risk AI systems allow effective human oversight. For clinical AI, this means your product design must include:

  • Clear display of AI confidence scores or uncertainty estimates
  • Mechanisms for clinicians to override, accept, or reject AI outputs
  • Audit trails of AI-assisted decisions
  • User interface design that supports rather than supplants clinical judgment

Products that present AI outputs as definitive answers, without uncertainty quantification or override capability, will face significant friction in both FDA submissions and EU Notified Body assessments. Design for human-AI collaboration from your first prototype.
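These oversight mechanisms are ultimately data-model decisions as much as UI decisions. One way to sketch them, with all names hypothetical rather than a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIFinding:
    """One AI output presented to a clinician.

    Confidence, the clinician's decision, and an audit trail are
    first-class data here, not afterthoughts. All names are
    hypothetical, not a prescribed regulatory schema.
    """
    finding: str
    confidence: float                   # surfaced to the user, never hidden
    status: str = "pending"             # pending / accepted / overridden
    audit: list = field(default_factory=list)

    def _log(self, event):
        self.audit.append(f"{datetime.now(timezone.utc).isoformat()} {event}")

    def accept(self, clinician_id):
        self.status = "accepted"
        self._log(f"accepted by {clinician_id}")

    def override(self, clinician_id, reason):
        # Rejection requires a reason, so the audit trail explains itself.
        self.status = "overridden"
        self._log(f"overridden by {clinician_id}: {reason}")

f = AIFinding(finding="suspected pulmonary nodule", confidence=0.74)
f.override("dr-priya", "imaging artifact at lung apex, not a nodule")
print(f.status)   # overridden
```

Designing the record this way makes the oversight evidence (who overrode what, when, and why) available for both FDA post-market reporting and the AI Act's human-oversight documentation.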

5. Align Your QMS for Dual Compliance From Day One

Your Quality Management System is the backbone of regulatory approval in every market. For AI SaMD manufacturers targeting the US and EU, a QMS that meets only ISO 13485 is no longer sufficient.

Your QMS must also demonstrate:

  • Data governance procedures (for FDA GMLP and EU AI Act requirements)
  • Software lifecycle management aligned with IEC 62304
  • Risk management aligned with ISO 14971
  • Cybersecurity controls (increasingly scrutinised in both FDA and EU reviews)
  • Bias monitoring and algorithmic drift detection procedures
  • Post-market surveillance processes that include AI-specific performance metrics

Building a QMS that satisfies all these requirements simultaneously is complex — but it is far less costly than building separate systems for each market, or rebuilding your QMS after a regulatory rejection.


How Satori Helps Indian SaMD Manufacturers Navigate Global Compliance

At Satori One Click Solutions LLP, we work with Indian medical device manufacturers at every stage of their global market access journey. For AI/ML-based SaMD companies, our regulatory support spans:

  • Pre-submission strategy: We help you classify your device correctly under FDA, EU MDR/AI Act, and CDSCO frameworks simultaneously — before you commit to a documentation or development approach. Getting this right at the start saves 6–18 months.
  • FDA submission preparation: We prepare your 510(k) or De Novo submissions with full TPLC documentation — model descriptions, data lineage, bias analysis, PCCP development, and post-market monitoring plans aligned with FDA's January 2025 draft guidance.
  • EU MDR + AI Act dual compliance: We support your CE marking pathway under MDR/IVDR and build the AI Act Annex IV documentation layer your Notified Body will require from August 2026 onwards.
  • CDSCO SaMD licensing: We guide your domestic licensing under the new CDSCO Medical Device Software framework, including Algorithm Change Protocol preparation and QMS alignment.
  • US Agent and Indian Agent services: Satori provides FDA US Agent services for Indian manufacturers required to register with the FDA, and Indian Agent services for foreign companies registering with CDSCO.
  • Ongoing compliance support: Regulatory requirements for AI SaMD are evolving faster than in any other product category. We provide ongoing monitoring of FDA, EU, CDSCO, Health Canada, and TGA regulatory changes affecting AI medical devices, so your compliance posture stays current without requiring constant internal resource investment.

Frequently Asked Questions

Does my AI diagnostic app qualify as a SaMD? If your software is intended to perform a medical purpose — diagnosis, treatment, monitoring, prevention, or prediction — independently of a physical hardware device, it is almost certainly classified as SaMD in India, the US, and the EU. The key factor is intended use, not how the software is technically built. Consult a regulatory specialist before assuming your product falls outside SaMD scope.

Do I need FDA clearance before I can sell my AI medical device in the US? Yes, with very limited exceptions. Most AI SaMD products require 510(k) clearance, De Novo classification, or PMA approval before US market entry. The appropriate pathway depends on your device classification and whether a substantially equivalent predicate device exists. Satori can assess your pathway in a pre-submission consultation.

When does EU AI Act compliance become mandatory for my product? For new AI-powered SaMD entering the EU market, full high-risk AI obligations apply from August 2026. For existing products already on the EU market, the deadline is August 2027. If you have a CE-marked AI medical device today, you have less than 18 months to achieve full AI Act compliance alongside your existing MDR/IVDR obligations.

Our AI model was trained on Indian patient data. Can we use this for US or EU submissions? You can, but you must address the representativeness and bias implications thoroughly. FDA and the EU AI Act both require bias analysis across patient subgroups in the intended market population. Training data from an Indian population may have demographic, environmental, and equipment characteristics that differ meaningfully from US or EU patients. A well-documented data governance and bias analysis plan can address this — but it must be proactive, not retrofitted.

What is a Predetermined Change Control Plan (PCCP) and do we need one? A PCCP is a document submitted to FDA as part of your market authorisation that describes categories of changes you plan to make to your AI model post-approval, and the processes you will follow to implement them safely without requiring a new submission for each update. PCCPs are optional but strongly recommended for adaptive AI systems. Manufacturers without a PCCP must submit a new 510(k) for any significant algorithm change — which can take 6–12 months per update. Given the rapid development cycles of most AI products, designing a PCCP from the start is almost always the right strategic decision.


Summary: The 2026 Regulatory Landscape for AI SaMD in Brief

  • India — CDSCO Medical Device Rules 2017 + Draft MDS Guidance (Oct 2025): risk classification (A–D), Algorithm Change Protocol, QMS per ISO 13485
  • United States — FDA TPLC framework + January 2025 draft guidance + final PCCP guidance: TPLC documentation, bias analysis, PCCP, post-market monitoring plan
  • European Union — EU MDR/IVDR + EU AI Act (Regulation (EU) 2024/1689): dual compliance (MDR technical file + AI Act Annex IV); mandatory from August 2026 for new products
  • Canada — Health Canada SaMD guidance (AI-specific rules under development): GMLP alignment; Health Canada is developing AI-specific guidance harmonised with FDA
  • Australia — TGA SaMD framework: IMDRF SaMD alignment; TGA references FDA and EU guidance closely

Ready to Build Your Global AI SaMD Regulatory Strategy?

The regulatory landscape for AI medical devices is more complex and more consequential than for any other product category — but it is navigable with the right partner and the right approach.

At Satori One Click Solutions LLP, we combine regulatory expertise across CDSCO, FDA, Health Canada, EMA/MDR, TGA, and MHRA with deep practical experience in medical device compliance. We work with Indian manufacturers building AI diagnostic tools, clinical decision support systems, and connected health platforms who want to reach global markets efficiently and sustainably.

Contact our team for a complimentary regulatory assessment for your AI SaMD product. We will help you understand your classification, your submission pathway, your documentation requirements, and your timeline — so you can make confident strategic decisions about your global market entry.

Email: satoriocs@gmail.com

Contact: +91 98290 98077

Website: www.satoriocs.com


Satori One Click Solutions LLP is a global pharma, medical device, and healthcare product regulatory consultancy operating from offices in India, Canada, and the United States. We provide end-to-end regulatory affairs, quality management, legal compliance, and market access services across USFDA, Health Canada, CDSCO, EMA, MHRA, and TGA jurisdictions.
