Does ISO/IEC 42005 AI Impact Assessment require Risk Assessment?

OECD • voluntary

Yes — 1 provision

Requirements at a glance

This voluntary standard specifies 7 requirements for Risk Assessment within a single provision:

AI System Impact Assessment

Obligation: Risk Assessment (enforcing)
Effective: May 1, 2025
Risk tier: all
Scope: providers, deployers (cross-domain)
ISO/IEC 42005 fills the gap between generic AI risk management (ISO/IEC 23894) and the assessment of impacts on individuals and society. It is the AI equivalent of a Data Protection Impact Assessment (DPIA). As AI impact assessment requirements appear in the EU AI Act, CETS 225, and national strategies, this standard provides the reference methodology for conducting them.

Requirements

Requirement | Details
Impact identification | Identify potential impacts of AI systems and their foreseeable applications on individuals, groups, and society
Intended and unintended use assessment | Assess intended, unintended, sensitive, and restricted uses, and foreseeable misuse scenarios
Benefit and harm evaluation | Evaluate both positive and negative impacts throughout the AI lifecycle
Stakeholder perspective | Integrate perspectives of affected individuals and groups into the assessment process
Documentation | Produce assessment documentation supporting transparency, accountability, and fairness
Lifecycle integration | Apply impact assessment from design and development through deployment and post-market monitoring
Integration with risk management | Coordinate impact assessment with ISO/IEC 23894 (risk management) and ISO/IEC 42001 (management systems)

Penalties

Violation | Fine
Non-compliance | None; ISO/IEC 42005 is a voluntary standard with no binding enforcement mechanism