Does ISO/IEC 23894 AI Risk Management require Risk Assessment?

ISO/IEC • voluntary

Yes — 1 provision

Requirements at a glance

This voluntary standard specifies 7 requirements for Risk Assessment across 1 provision:

AI-Specific Risk Management Guidance #

Obligation: Risk Assessment (enforcing)
Effective: Feb 1, 2023
Risk tier: all
Scope: providers, deployers (cross-domain)
ISO/IEC 23894 is the specialist AI risk guidance standard that extends the ISO 31000 risk management framework for AI-specific risks (bias, robustness, explainability failures). Regulators cite it as a reference for "state of the art" risk management when defining what compliant AI risk governance looks like.

Requirements

Requirement | Details
AI risk principles | Apply AI-specific risk management principles adapted from ISO 31000 Clause 4
Risk identification | Identify AI-specific risk sources including bias, robustness failures, explainability gaps, and misuse
Risk assessment | Assess likelihood and consequence of identified AI risks throughout the lifecycle
Risk treatment | Select and implement risk treatment options proportionate to identified risks
Monitoring and review | Continuously monitor AI risk posture and review risk management effectiveness
Recording and reporting | Document risk management activities, decisions, and outcomes
Lifecycle mapping | Apply risk management across the full AI system lifecycle per Annex C
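The identify/assess/treat/monitor/record steps above can be sketched as a small risk register. This is a hypothetical illustration, not a structure defined by ISO/IEC 23894: the `Risk` and `RiskRegister` names, the likelihood-times-consequence scoring, and the review threshold are all assumptions chosen to make the process flow concrete.

```python
from dataclasses import dataclass, field
from enum import Enum


class Level(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class Risk:
    source: str              # e.g. "bias", "robustness failure" (risk identification)
    likelihood: Level        # assessed likelihood (risk assessment)
    consequence: Level       # assessed consequence (risk assessment)
    treatment: str = ""      # selected treatment option (risk treatment)
    log: list = field(default_factory=list)  # recording and reporting

    @property
    def rating(self) -> int:
        # Illustrative scoring only; the standard does not prescribe a formula
        return self.likelihood.value * self.consequence.value


class RiskRegister:
    """Hypothetical register walking the lifecycle steps in order."""

    def __init__(self) -> None:
        self.risks: list[Risk] = []

    def identify(self, source: str, likelihood: Level, consequence: Level) -> Risk:
        risk = Risk(source, likelihood, consequence)
        risk.log.append(f"identified: {source}")
        self.risks.append(risk)
        return risk

    def treat(self, risk: Risk, option: str) -> None:
        risk.treatment = option
        risk.log.append(f"treated: {option}")

    def review(self, threshold: int = 4) -> list[Risk]:
        # Monitoring and review: flag risks whose rating meets the threshold
        return [r for r in self.risks if r.rating >= threshold]


register = RiskRegister()
bias = register.identify("bias in training data", Level.HIGH, Level.HIGH)
register.identify("explainability gap", Level.LOW, Level.MEDIUM)
register.treat(bias, "rebalance dataset and re-evaluate fairness metrics")
flagged = register.review()
print([r.source for r in flagged])  # only the high-rated bias risk is flagged
```

In a real programme each step would also feed documented outcomes back into monitoring, per the recording-and-reporting and lifecycle-mapping requirements.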

Penalties

Violation | Fine
Non-compliance | Voluntary; no binding enforcement mechanism