ISO/IEC 42001 — AI Management System
- Organization: International Organization for Standardization / International Electrotechnical Commission
An international standard specifying requirements for establishing, implementing, maintaining, and continually improving an AI management system within organizations.
ISO/IEC 42001, published in December 2023, is the first international management system standard specifically designed for artificial intelligence. Following the established structure of ISO management system standards (such as ISO/IEC 27001 for information security), it provides a systematic framework for organizations to govern AI responsibly throughout the system lifecycle.
The standard adopts a process-based approach, requiring organizations to establish an AI management system (AIMS) that addresses the unique characteristics and risks of AI technologies. Key requirements include conducting AI impact assessments, implementing data quality management, ensuring appropriate human oversight, maintaining transparency about AI system capabilities and limitations, and establishing clear accountability for AI-related decisions and outcomes.
ISO/IEC 42001 is designed to be certifiable, allowing organizations to demonstrate compliance through independent third-party audits. Its alignment with other ISO management system standards facilitates integration into existing governance frameworks. The standard is particularly relevant for organizations seeking to demonstrate responsible AI practices to regulators, customers, and business partners, and it complements both the NIST AI RMF and EU AI Act requirements.
Controls
| ID | Control | Description |
|---|---|---|
| CONTEXT | Context of the Organization | Understanding the organization's context, stakeholder needs, and scope of the AI management system. |
| LEADERSHIP | Leadership and Commitment | Requirements for top management commitment, AI policy establishment, and role assignment. |
| RISK-OPP | Risk and Opportunity Assessment | Systematic identification and treatment of AI-related risks and opportunities. |
| SUPPORT | Support and Resources | Competence requirements, awareness, communication, and documentation for AI management. |
| OPERATION | Operational Planning and Control | AI system lifecycle management including impact assessment, data management, and third-party considerations. |
| EVALUATION | Performance Evaluation | Monitoring, measurement, internal audit, and management review of AI management system effectiveness. |
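As an illustration only (the standard itself prescribes no data model or tooling), the control areas in the table above could be tracked in a simple gap-analysis structure when preparing for a certification audit. The `Control` class and `open_gaps` helper are hypothetical names, not part of ISO/IEC 42001:

```python
from dataclasses import dataclass

@dataclass
class Control:
    """One high-level control area; IDs follow the table above."""
    control_id: str
    name: str
    implemented: bool = False  # illustrative status flag

# Control areas from the table; all start as open gaps.
aims_controls = [
    Control("CONTEXT", "Context of the Organization"),
    Control("LEADERSHIP", "Leadership and Commitment"),
    Control("RISK-OPP", "Risk and Opportunity Assessment"),
    Control("SUPPORT", "Support and Resources"),
    Control("OPERATION", "Operational Planning and Control"),
    Control("EVALUATION", "Performance Evaluation"),
]

def open_gaps(controls):
    """Return the IDs of control areas not yet marked implemented."""
    return [c.control_id for c in controls if not c.implemented]
```

A real implementation would map each area to the standard's clause-level requirements and evidence records; this sketch only shows the shape of a coverage check.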
Last updated: 2026-02-25