SIGNAL DETAIL

Singapore anchors healthcare-AI accountability in a tripartite developer–deployer–user structure: the MOH–HSA AI in Healthcare Guidelines Version 2.0, launched 10 March 2026, require healthcare-AI manufacturers to red-team generative-AI models, deployers to establish organisational AI governance, users to maintain clinical judgement, and direct-to-consumer AI applications to be registered with HSA under the Health Products Act 2007.

AI in Healthcare Guidelines Version 2.0 (AIHGle 2.0) — Updated by MOH and HSA (AIHGle 2.0 (March 2026) · WEF 10 March 2026)

Ministry of Health · Pub 10 March 2026 · WEF 10 March 2026 · MEDIUM Guideline
Regulatory reference: AIHGle 2.0 (March 2026)
Specialist Panel Analysis · RegLegBrief · Verified Primary Source

International references analysed by the Specialist Panel: Regulation (EU) 2024/1689 (Artificial Intelligence Act); Regulation (EU) 2017/745 (Medical Device Regulation); FDA Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence-Enabled Device Software Functions (December 2024); UK MHRA Software and AI as a Medical Device Change Programme (December 2024 roadmap); WHO Regulatory Considerations on Artificial Intelligence for Health (October 2023); ISO/IEC 42001:2023 (Artificial Intelligence Management Systems); IMDRF/AIML WG/N88 FINAL:2025 (Good Machine Learning Practice for Medical Device Development: Guiding Principles); IMDRF/AIMD WG/N67:2022 (Machine Learning-enabled Medical Devices: Key Terms and Definitions); Therapeutic Goods Act 1989 (Cth) (Australia TGA SaMD framework).

Domestic references analysed by the Specialist Panel: Health Products Act 2007 (Act 14 of 2007); Health Products (Medical Devices) Regulations 2010; Healthcare Services Act 2020 (Act 27 of 2020); Medical Registration Act 1997; Personal Data Protection Act 2012 (Act 26 of 2012); Human Biomedical Research Act 2015 (Act 29 of 2015); MOH and HSA — AI in Healthcare Guidelines Version 2.0 (AIHGle 2.0), 10 March 2026; MOH — Emerging Regulatory Policy Issues page; HSA — GL-04 Regulatory Guidelines for Software Medical Devices (life-cycle approach); HSA / MFDS Korea — Joint Guiding Principles for Conducting Clinical Trial for ML-enabled Medical Devices; PDPC — Advisory Guidelines for the Healthcare Sector (2023); PDPC — Advisory Guidelines on use of Personal Data in AI Recommendation and Decision Systems (2024); IMDA — Model AI Governance Framework (2nd Edition); BAC — Report on Ethical Use of Big Data and AI in Biomedical Research (2025); MOH — Cybersecurity and Data Security Essentials (2026) (cross-ref RLB-SG-2026-00048).

The Ministry of Health (MOH) and the Health Sciences Authority (HSA) jointly published AI in Healthcare Guidelines Version 2.0 (AIHGle 2.0) on 10 March 2026, replacing AIHGle 1.0, issued in October 2021 by MOH, HSA, and Synapxe (then IHiS). The forty-two-page Guideline restructures Singapore's healthcare-AI governance around a developer–deployer–user triad and adds substantive treatment of generative AI, continuous-learning models, and direct-to-consumer applications. The publication date doubles as the operational effective date, and AIHGle 2.0 is structured as a living document subject to periodic refinement.

The Guideline is organised into ten sections covering ethical principles, total-product-lifecycle obligations on developers, organisational-governance and risk-assessment requirements on deployers, and clinical-judgement and patient-communication duties on users. AIHGle 2.0 expressly complements HSA's Regulatory Guidelines for Software Medical Devices (GL-04, life-cycle approach) and the Health Products (Medical Devices) Regulations 2010, which together establish the binding regulatory regime under the Health Products Act 2007 (Act 14 of 2007).

The 2021 framework predated the widespread emergence of generative AI. AIHGle 2.0 addresses the amplified risks introduced by machine learning and deep learning systems: hallucination, undesirable content generation, inadvertent disclosure of sensitive data through user prompts, and vulnerability to adversarial attacks. The Guideline targets the more complex subset of AI solutions whose opacity and scalability have grown with model capability, while preserving broad applicability to all AI used in healthcare settings.

AIHGle 2.0 strengthens accountability through clarity of responsibilities for each stakeholder group, improves trust via transparency guidance to facilitate informed decision-making, and updates AI-deployment guidance for risk assessment and mitigation. Section 8.2 introduces specific obligations on generative-AI deployment, including red-teaming, retrieval-augmented generation for hallucination reduction, baseline and component testing, and front-end labelling of AI-generated outputs in user interfaces.

The RegLegBrief Specialist Panel considered the AIHGle 2.0 Guideline alongside the MOH Emerging Regulatory Policy Issues page, the HSA Digital Health regulatory framework, the HSA Guidance Documents index for medical devices, and the bilateral Singapore–Korea Guiding Principles for Conducting Clinical Trial for Machine Learning-enabled Medical Devices co-issued by HSA and the Korean Ministry of Food and Drug Safety. Read in conjunction, these documents establish that AIHGle 2.0 sits as a horizontal governance overlay above HSA's vertical product-regulation regime, with cross-border bilateral cooperation supplying the practical edge for machine-learning-enabled medical devices.

Read against the comparable-jurisdiction cohort, the Specialist Panel finds that Singapore's developer–deployer–user architecture is closer to soft-governance peers than to the European hard-regulation track. Regulation (EU) 2024/1689 (the Artificial Intelligence Act), in force since 1 August 2024, classifies AI systems acting as safety components of medical devices regulated under Regulation (EU) 2017/745 (Medical Device Regulation) as Annex III high-risk, with full compliance for high-risk medical-device AI required by August 2027. The FDA Marketing Submission Recommendations for a Predetermined Change Control Plan for Artificial Intelligence-Enabled Device Software Functions (December 2024) takes a third path, authorising pre-specified iterative model updates within a single marketing submission rather than per-version re-clearance.

The UK MHRA Software and AI as a Medical Device Change Programme roadmap of December 2024 schedules a Post-Market Surveillance Statutory Instrument in force from June 2025 and pre-market reform for 2026. The WHO Regulatory Considerations on Artificial Intelligence for Health (October 2023) sets out eighteen considerations across documentation, risk management, validation, data quality, privacy, and stakeholder engagement. The IMDRF/AIML WG/N88 FINAL:2025 (Good Machine Learning Practice for Medical Device Development) consolidates ten GMLP principles co-authored with FDA, Health Canada, and MHRA, with IMDRF/AIMD WG/N67:2022 (Machine Learning-enabled Medical Devices: Key Terms and Definitions) supplying the foundational ML-MD terminology directly cited in the Singapore–Korea bilateral. The complete document set is listed in the document panel below.

AIHGle 2.0 binds three formal categories. Healthcare AI developers — including manufacturers of AI-enabled medical devices regulated by HSA under the Health Products Act 2007 and Health Products (Medical Devices) Regulations 2010 — must observe total-product-lifecycle controls. Healthcare deployers — healthcare institutions licensed under the Healthcare Services Act 2020 (Act 27 of 2020) — must establish organisational AI governance, risk-assessment frameworks, and pre-deployment testing. Healthcare users — registered medical practitioners under the Medical Registration Act 1997 regulated by the Singapore Medical Council, registered nurses under the Singapore Nursing Board, registered pharmacists under the Singapore Pharmacy Council, and registered allied health professionals — bear clinical-judgement and patient-communication responsibilities.

The accountability split materially changes operational practice. Developers must subject generative-AI models to red-teaming and document the outputs; deployers must support fact-checking workflows and provide clear front-end AI labelling; users must maintain clinical judgement as the primary decision-making tool and review generative-AI outputs for sensitive or identifiable information. Direct-to-consumer AI applications that fall within the AI-Medical Device or Software Medical Device definition must be registered with HSA — regulatory oversight is preserved regardless of consumer-facing positioning.

Adjacent obligations attach. Personal-data handling within AI solutions remains subject to the Personal Data Protection Act 2012 (Act 26 of 2012) and the PDPC Advisory Guidelines for the Healthcare Sector (2023) and on Personal Data in AI Recommendation and Decision Systems (2024). Healthcare-research uses fall additionally within the Human Biomedical Research Act 2015 (Act 29 of 2015) and the Bioethics Advisory Committee's 2025 Report on Ethical Use of Big Data and AI in Biomedical Research. Cybersecurity controls cross-reference RLB-SG-2026-00048 (MOH Cybersecurity and Data Security Essentials).

AIHGle 2.0 takes effect from its publication on 10 March 2026 and is structured as a living document. While the Guideline itself is non-binding, the underlying legal obligations it interprets — under the Health Products Act 2007, Healthcare Services Act 2020, and parallel professional Acts — carry the full enforcement consequences of those instruments. Healthcare AI deployers and developers should align internal governance with the Guideline before HSA pre-market consultation or HCSA licensing reviews. This regulatory development is preserved and cited by RegLegBrief at reglegbrief.com/cite/RLB-SG-2026-00060.

CITE THIS SIGNAL
reglegbrief.com/cite/RLB-SG-2026-00060