TL;DR
| Question | Answer |
|---|---|
| Is ISO 42001 aligned with Article 9 AI Act? | Substantially, yes — the iterative risk management cycle, documentation discipline, and lifecycle orientation closely mirror Article 9 expectations. |
| Is ISO 42001 certification proof of Article 9 compliance? | No. As of May 2026 it is not a formally harmonised standard under Article 40 of the AI Act, and no implementing act has been adopted granting it presumption of conformity. |
| What percentage of Article 9 does it cover? | Approximately 70 %. The gaps relate primarily to AI-Act-specific obligations on vulnerable persons, children, Annex IV technical documentation, and conformity assessment triggers. |
| Can it serve as an operational baseline? | Yes — and it is the most efficient baseline available if you are already certified or are building a management system from scratch. |
| What about ISO/IEC 23894:2023? | It provides closer textual alignment to Article 9's risk management vocabulary but shares the same harmonisation status: not yet formally recognised under Article 40. |
Table of Contents
- What ISO/IEC 42001:2023 is
- What Article 9 of the EU AI Act requires
- Clause-by-clause mapping table
- Where ISO 42001 and Article 9 align
- Where ISO 42001 falls short of Article 9
- Where ISO/IEC 23894:2023 fits
- Practical guidance for compliance leads
- DACH-specific overlay
- Next steps with EKM Global Consulting GmbH
1. What ISO/IEC 42001:2023 Is
ISO/IEC 42001:2023, Information technology — Artificial intelligence — Management system, is the first internationally recognised management system standard specifically designed for organisations that develop, provide, or use AI systems. Published in December 2023 by ISO/IEC JTC 1/SC 42, it follows the Annex SL high-level structure (HLS) shared by ISO/IEC 27001:2022, ISO 9001:2015, and all modern management system standards. This structural alignment means that organisations already holding ISO 27001 or ISO 9001 certification can integrate ISO 42001 into an existing management system with significantly reduced duplication.
The standard is built on the Plan-Do-Check-Act (PDCA) cycle and covers:
- Organisational context and scope determination (Clause 4)
- Leadership, policy, and role assignment (Clause 5)
- Planning, risk and opportunity management, and AI-specific impact assessment (Clause 6)
- Support requirements including competence, awareness, and documentation (Clause 7)
- Operational planning and control, including AI system impact assessment (Clause 8)
- Performance evaluation, internal audit, and management review (Clause 9)
- Continual improvement and nonconformity management (Clause 10)
- Annex A (normative) with 38 controls across nine control objectives covering AI policies, internal organisation, resources for AI systems, AI system impact assessment, the AI system lifecycle, data for AI systems, information for interested parties, responsible use of AI, and third-party relationships
Certification is granted by accredited conformity assessment bodies following an audit against the standard's requirements. Certification is voluntary. There is no mandatory industry or legal obligation to hold ISO 42001 certification — including under the EU AI Act.
2. What Article 9 of the EU AI Act Requires
Article 9 of Regulation (EU) 2024/1689 (the AI Act) imposes a mandatory risk management system on providers of high-risk AI systems (as classified under Annex III and Article 6). A detailed treatment of the operative requirements is available in the Article 9 RMS Practitioner Guide.
In summary, Article 9 requires a documented, iterative risk management process throughout the entire lifecycle of a high-risk AI system. The four operative obligations are:
- Article 9(2): Identification and analysis of known and reasonably foreseeable risks to health, safety, or fundamental rights; estimation and evaluation of risks that may emerge under the intended purpose or reasonably foreseeable misuse.
- Article 9(4): Adoption of suitable risk management measures giving due consideration to the state of the art.
- Article 9(5) and (6): Elimination or reduction of risks through design and development; implementation of adequate mitigation and control measures; provision of information pursuant to Article 13; training measures where residual risk is accepted.
- Article 9(7), (8), and (9): Testing to verify risk management measures; specific obligations regarding risks to persons belonging to vulnerable groups including children; obligations to review and update the risk management system.
The risk management system must be documented (Article 17), its outputs must feed into the Annex IV technical documentation, and its adequacy is assessed during conformity assessment under Article 43. Substantial modifications to a high-risk AI system trigger a review of the risk management system under Article 43(4).
3. Clause-by-Clause Mapping Table
The following table maps ISO 42001 clauses and Annex A controls against specific Article 9 sub-clauses. Coverage assessments reflect the substantive overlap in process requirements; they do not imply that ISO 42001 certification constitutes legal proof of compliance with the mapped Article 9 obligation.
| ISO/IEC 42001:2023 Reference | Article 9 Sub-clause | Coverage | Notes |
|---|---|---|---|
| Clause 6.1 — Risk and opportunity assessment | Art. 9(2)(a)–(b): risk identification and analysis | Substantial | 42001 requires risk identification across AI system impact; vocabulary differs from Art. 9's health/safety/fundamental-rights framing |
| Clause 6.2 — AI system impact assessment | Art. 9(2)(c)–(d): risk estimation and evaluation | Substantial | Impact assessment must be recast in Art. 9 terms; 42001 does not mandate fundamental-rights impact assessment by default |
| Clause 8.4 — AI system operation and monitoring | Art. 9(4): risk management measures | Moderate | 42001 covers operational controls; Art. 9(4) additionally requires state-of-the-art consideration explicitly |
| Annex A, A.6 — AI system lifecycle | Art. 9(5): lifecycle risk treatment | Substantial | Both require iterative treatment across development, deployment, and decommissioning |
| Annex A, A.2.2 — Internal responsibilities for AI | Art. 9(5): design-based risk reduction | Moderate | 42001 assigns responsibility; Art. 9(5) specifies design-level elimination before other measures |
| Annex A, A.8 — Data for AI systems | Art. 9(6): data governance and training-set controls | Substantial | Strong alignment; 42001 Annex A.8 covers data quality, provenance, and bias |
| Clause 9.1 — Performance evaluation | Art. 9(7): testing obligations | Substantial | 42001 mandates performance monitoring; Art. 9(7) adds pre-market testing requirements specific to high-risk AI |
| Clause 9.3 — Management review | Art. 9(9): system update and review | Substantial | Management review cadence maps to Art. 9(9) periodic update obligation |
| Annex A, A.9 — Third-party and supplier relationships | Art. 9(4) read with Art. 25: supply chain risk | Moderate | 42001 addresses supplier obligations; Art. 9 read with Art. 25 imposes specific deployer-provider allocation |
| Clause 7.5 — Documented information | Art. 9 read with Art. 11/17: documentation requirements | Substantial | 42001 requires documented information; must be expanded to meet Annex IV technical documentation specifics |
| Annex A, A.4 — AI system impact assessment process | Art. 9(2): risk assessment process | Moderate | Procedural alignment is strong; Art. 9 additionally requires fundamental-rights lens not present in 42001 by default |
For a full treatment of Article 9's sub-clause requirements, consult the Article 9 RMS Practitioner Guide.
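For teams that track coverage programmatically, the mapping above can be captured as a simple data structure and summarised in a gap tracker. A minimal Python sketch, transcribed from the table above; the field layout is illustrative, and the coverage labels remain editorial assessments, not legal determinations:

```python
# ISO/IEC 42001:2023 reference -> Article 9 sub-clause -> coverage level,
# transcribed from the mapping table in Section 3.
MAPPING = [
    ("Clause 6.1",  "Art. 9(2)(a)-(b)",  "substantial"),
    ("Clause 6.2",  "Art. 9(2)(c)-(d)",  "substantial"),
    ("Clause 8.4",  "Art. 9(4)",         "moderate"),
    ("Annex A.6",   "Art. 9(5)",         "substantial"),
    ("Annex A.2.2", "Art. 9(5)",         "moderate"),
    ("Annex A.8",   "Art. 9(6)",         "substantial"),
    ("Clause 9.1",  "Art. 9(7)",         "substantial"),
    ("Clause 9.3",  "Art. 9(9)",         "substantial"),
    ("Annex A.9",   "Art. 9(4)/Art. 25", "moderate"),
    ("Clause 7.5",  "Art. 9/Art. 11/17", "substantial"),
    ("Annex A.4",   "Art. 9(2)",         "moderate"),
]

def coverage_summary(mapping):
    """Count mapped rows per coverage level."""
    counts = {}
    for _iso_ref, _article_ref, level in mapping:
        counts[level] = counts.get(level, 0) + 1
    return counts

print(coverage_summary(MAPPING))
# → {'substantial': 7, 'moderate': 4}
```

A structure like this is useful as the seed of the documented mapping exercise recommended in Section 7: each row can later be extended with evidence references from the management system.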
4. Where ISO 42001 and Article 9 Align
Risk management process and iterative cycle. Both ISO 42001 and Article 9 require an iterative, documented risk management process tied to the full AI system lifecycle. The PDCA structure of ISO 42001 maps naturally to Article 9's requirement that the risk management system be updated as the AI system evolves.
Lifecycle orientation. Annex A, A.6 of ISO 42001 addresses the AI system lifecycle from design through decommissioning — the same scope demanded by Article 9's lifecycle obligation. Organisations that have implemented A.6 controls have addressed the lifecycle framing that Article 9 requires.
Data governance overlap. ISO 42001 Annex A.8 — covering data quality, labelling, and bias — maps directly to the data-related risk controls that Article 9(6) expects for high-risk AI systems, particularly in relation to training, validation, and test datasets.
Documentation discipline. Clause 7.5 of ISO 42001 requires documented information for all management system processes. This discipline is a prerequisite for producing the Annex IV technical documentation that Article 11 and Article 17 require, even though 42001 does not specify Annex IV's content structure.
Management review as periodic update. The Clause 9.3 management review obligation creates the governance cadence that Article 9(9) expects for periodic review and update of the risk management system following deployment.
5. Where ISO 42001 Falls Short of Article 9
Article 9(8): specific treatment of vulnerable persons. Article 9(8) requires providers to give particular attention to risks to persons belonging to vulnerable groups due to their age, disability, or socioeconomic status. ISO 42001 contains no equivalent specific obligation. Organisations relying solely on 42001 must introduce a supplementary control explicitly addressing this population-specific risk dimension.
Article 9(9): specific treatment of children. Article 9(9) separately singles out children as a protected category requiring dedicated risk assessment treatment. This is absent from ISO 42001's control set and must be added explicitly.
Integration with Annex IV technical documentation. Article 9's risk management outputs are required to be reflected in the Annex IV technical documentation (per Articles 9 and 11 read together). ISO 42001's documented information requirements do not map to Annex IV's structure. A documented mapping exercise is required to trace risk management outputs into the Annex IV dossier.
Conformity assessment under Article 43 and Article 43(4). The adequacy of an Article 9 risk management system is assessed as part of conformity assessment under Article 43. ISO 42001 certification does not constitute or replace conformity assessment, and it confers no procedural credit toward it. Organisations must ensure their risk management documentation is accessible to the notified body, or to the internal self-assessment process, on the AI Act's terms. A detailed treatment of conformity assessment is available in the Conformity Assessment and Annexes VI and VII guide.
Substantial modification triggers. Article 43(4) requires that a substantial modification to a high-risk AI system triggers a new or revised conformity assessment, which in turn requires an updated risk management system. ISO 42001 does not contain a control explicitly tied to modification triggers under AI Act definitions. This must be addressed through a supplementary procedure.
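A supplementary procedure of this kind can be as lightweight as a documented checklist evaluated at every release gate. A minimal Python sketch, where the trigger questions are illustrative placeholders only; the legal test for substantial modification lives in the AI Act itself, not in this code:

```python
# Illustrative release-gate check: routes a change to formal
# substantial-modification assessment if any trigger fires.
# The questions are placeholders, not the AI Act's definition.
TRIGGER_QUESTIONS = {
    "intended_purpose_changed": "Does the change alter the system's intended purpose?",
    "new_risk_introduced": "Does the change introduce a risk not covered by the current RMS?",
    "declared_performance_affected": "Does the change materially affect declared performance?",
}

def needs_assessment(answers: dict) -> bool:
    """Any affirmative answer routes the change to formal review."""
    return any(answers.get(key, False) for key in TRIGGER_QUESTIONS)

change = {"intended_purpose_changed": False, "new_risk_introduced": True}
print(needs_assessment(change))  # → True
```

The value of encoding the checklist is not automation but auditability: every release carries a dated record showing the trigger questions were asked and answered.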
Fundamental-rights impact framing. Article 9(2) is explicit that risk analysis must address risks to health, safety, and fundamental rights. ISO 42001's impact assessment (Annex A.4) is not framed around fundamental rights. Organisations must ensure their risk identification and evaluation processes explicitly incorporate fundamental-rights considerations, as Article 9 requires.
6. Where ISO/IEC 23894:2023 Fits
ISO/IEC 23894:2023, Information technology — Artificial intelligence — Guidance on risk management, is a companion standard to ISO 42001 and provides substantially more granular guidance on risk management processes for AI systems. Its risk management vocabulary — including risk sources, risk treatment, residual risk, and risk communication — maps considerably more directly to the language of Article 9 than the management system framework of ISO 42001.
Organisations that implement ISO 23894 as the operational methodology underpinning their ISO 42001 risk management process will find that the textual alignment with Article 9 is materially closer. ISO 23894 is, however, a guidance standard and does not support third-party certification.
Critically, as of May 2026, neither ISO/IEC 42001:2023 nor ISO/IEC 23894:2023 has been formally harmonised under the EU AI Act per Article 40. The European Commission has received input from standards bodies and industry stakeholders encouraging the recognition or mandating of these standards in implementing acts, but no such act has been adopted. Until a harmonisation decision is published in the Official Journal of the European Union, neither standard confers a presumption of conformity with Article 9 or any other AI Act requirement. Compliance leads should monitor CEN/CENELEC standardisation work, particularly in JTC 21, for developments on harmonisation.
7. Practical Guidance for Compliance Leads
If your organisation holds ISO 42001 certification:
You have approximately 70 % of an Article 9-compliant risk management system in operational form. The work that remains is not a rebuild — it is a targeted gap-bridging exercise. The following steps are recommended:
- Map your existing ISO 42001 risk register and impact assessment outputs to the Article 9 sub-clause structure, using the table in Section 3 as a starting reference. Document the mapping explicitly in your management system.
- Add a supplementary procedure addressing Article 9(8) vulnerable persons and Article 9(9) children. This can be implemented as a dedicated section within your existing AI system impact assessment template.
- Introduce a fundamental-rights lens into your risk identification methodology. For high-risk AI systems falling under Annex III, a fundamental-rights impact assessment aligned with Article 9(2) should be conducted as part of the Clause 6 planning process.
- Build a traceability matrix connecting your risk management documentation to the Annex IV technical documentation structure. This is a documentation discipline exercise, not a substantive compliance gap.
- Add a documented procedure for substantial modification assessment, triggered when a change to the AI system may meet the AI Act's definition of substantial modification.
- Ensure your management system documentation is accessible and legible in the format required for conformity assessment under Article 43 and Annexes VI/VII.
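Several of these steps, particularly the traceability matrix, lend themselves to lightweight tooling. A minimal Python sketch of a risk-register-to-technical-documentation matrix; the register fields and Annex IV section labels are illustrative placeholders, not the regulation's actual structure:

```python
import csv
import io

# Illustrative risk register rows linking each risk to the ISO 42001
# clause that governs it and the Annex IV location where its output
# is documented. Field names are assumptions for this sketch.
risk_register = [
    {"risk_id": "R-001", "iso_clause": "6.1", "annex_iv_ref": "Section 2"},
    {"risk_id": "R-002", "iso_clause": "8.4", "annex_iv_ref": "Section 5"},
]

def to_matrix_csv(rows):
    """Render the traceability matrix as CSV for audit handover."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["risk_id", "iso_clause", "annex_iv_ref"]
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_matrix_csv(risk_register))
```

Even a two-column spreadsheet serves the purpose; what matters is that every risk management output has a documented landing place in the Annex IV dossier.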
If your organisation does not yet hold ISO 42001 certification:
Building Article 9 compliance directly — using ISO 23894:2023 as the risk management methodology and ISO 42001's management system structure as the governance framework — is a viable and efficient path. Pursuing ISO 42001 certification simultaneously provides management system maturity and demonstrates organisational commitment to AI governance, which is increasingly relevant to enterprise procurement and public-sector tendering in the DACH region.
8. DACH-Specific Overlay
DAkkS-accredited certification bodies. In Germany, ISO 42001 certification audits are conducted by bodies accredited by the Deutsche Akkreditierungsstelle (DAkkS). As of 2026, several DAkkS-accredited management system certification bodies have extended their scopes to cover ISO 42001. Organisations based in Germany, Austria, or Switzerland should verify that their prospective certification body holds accreditation from DAkkS or the equivalent national body (Akkreditierung Austria in Austria, the Swiss Accreditation Service SAS in Switzerland) and has demonstrable competence in AI system auditing, not only management system auditing.
BSI C5 and AI risk controls. The Bundesamt für Sicherheit in der Informationstechnik (BSI) Cloud Computing Compliance Criteria Catalogue (C5) addresses cloud infrastructure security controls that are frequently relevant to AI systems operating on cloud infrastructure. BSI C5 is not an AI-specific framework, but its controls for availability, integrity, and access management are complementary to the operational security controls that an Article 9 risk management system should address for AI systems in cloud-deployed configurations. Organisations already aligned with BSI C5 will find that a portion of their infrastructure security evidence is reusable within their ISO 42001 control documentation.
BSI guidance on AI management systems. The BSI has published technical guidance and position papers on AI security and management, including material that references ISO/IEC JTC 1/SC 42 standards. Compliance leads in Germany should monitor BSI AI security publications for guidance that may supplement or contextualise the AI Act's risk management requirements at the national implementation level.
BaFin context for financial sector. For financial sector entities within BaFin's supervisory remit, AI risk management obligations under the AI Act overlay existing model risk management expectations. An ISO 42001-based management system that is explicitly mapped to Article 9 obligations provides a defensible audit trail for regulatory examinations that increasingly address AI governance alongside traditional model risk.
9. Next Steps with EKM Global Consulting GmbH
EKM Global Consulting GmbH advises regulated-sector and technology organisations in the DACH region on EU AI Act compliance, including Article 9 risk management system design, ISO 42001 gap assessments, and Annex IV technical documentation structuring.
If you are evaluating your current ISO 42001 posture against Article 9, the EU AI Act Quick Scan at app.ekmgc.de provides an initial scoping assessment at no cost.
For organisations requiring a structured gap analysis and implementation roadmap, advisory engagements are described at ekmgc.de/eu-ai-act.html.
Frequently Asked Questions
Does ISO 42001 certification give us a presumption of conformity with the EU AI Act?
No. A presumption of conformity under Article 40 of the AI Act arises only from harmonised standards listed in the Official Journal of the European Union. As of May 2026, ISO/IEC 42001:2023 has not been harmonised under the AI Act, and no implementing act granting it this status has been adopted. Certification demonstrates management system maturity and provides a strong operational baseline, but it does not substitute for AI Act conformity assessment.
If we implement ISO 42001, do we still need to conduct a conformity assessment under Article 43?
Yes. For high-risk AI systems subject to Article 43, conformity assessment is mandatory regardless of management system certifications held. ISO 42001 certification may support the evidence base for conformity assessment — particularly regarding the risk management process — but it does not satisfy or replace the conformity assessment procedure. See the Conformity Assessment and Annexes VI and VII guide for procedural detail.
Is ISO/IEC 23894:2023 more relevant than ISO 42001 for Article 9 compliance?
ISO 23894 provides closer textual alignment with Article 9's risk management vocabulary and is particularly useful as the methodological layer beneath an ISO 42001 management system. However, it is a guidance document and does not support third-party certification. For organisations seeking both methodological alignment and certifiable governance maturity, using ISO 23894 as the operative methodology within an ISO 42001 framework is the recommended combination.
We are a deployer, not a provider. Does Article 9 apply to us?
Article 9 obligations apply primarily to providers of high-risk AI systems. Deployers have distinct obligations under Article 26 and are required to use high-risk AI systems in accordance with the instructions for use and to implement appropriate technical and organisational measures. However, deployers who substantially modify a high-risk AI system may assume provider obligations, including Article 9 obligations. The Article 9 RMS Practitioner Guide addresses provider and deployer scope in detail.
How long does a gap-bridging exercise typically take for an organisation with ISO 42001 already in place?
This depends on the maturity of the existing management system, the number and complexity of high-risk AI systems in scope, and whether the organisation has an existing fundamental-rights impact assessment methodology. For a single high-risk AI system with a well-documented ISO 42001 implementation, a structured gap analysis and documentation remediation can typically be completed within six to ten weeks. Organisations with multiple Annex III systems or complex supply chains should anticipate a longer engagement.
EKM Global Consulting GmbH — Baden-Baden, Germany. Founded January 2013. EU AI Act advisory, TPRM, and AI governance for regulated organisations in the DACH region.
Founder: Elshan Musayev
Schema.org Markup Recommendation
This page should carry Schema.org Article markup (@type: "Article") with the following additional properties:
```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "ISO/IEC 42001:2023 vs EU AI Act Article 9 RMS — Does the Certification Cover the Regulation?",
  "description": "ISO/IEC 42001 certification is not a formally harmonised standard under the EU AI Act and does not alone constitute proof of Article 9 compliance. This article maps relevant clauses and identifies specific gaps.",
  "author": {
    "@type": "Organization",
    "name": "EKM Global Consulting GmbH",
    "url": "https://ekmgc.de"
  },
  "publisher": {
    "@type": "Organization",
    "name": "EKM Global Consulting GmbH",
    "url": "https://ekmgc.de"
  },
  "dateModified": "2026-05-09",
  "about": [
    { "@type": "Thing", "name": "ISO/IEC 42001:2023" },
    { "@type": "Thing", "name": "EU AI Act Article 9" },
    { "@type": "Thing", "name": "AI Risk Management System" }
  ],
  "mentions": [
    { "@type": "Legislation", "name": "Regulation (EU) 2024/1689", "alternateName": "EU AI Act" },
    { "@type": "Standard", "name": "ISO/IEC 42001:2023" },
    { "@type": "Standard", "name": "ISO/IEC 23894:2023" }
  ]
}
```