High-risk provider obligations

Understanding the EU AI Act

Dan Nechita

Lead Technical Negotiator, EU AI Act

Concept of intended purpose

  • AI Act balances builder and user obligations
  • Relies on "intended purpose" concept

Image of balancing the builder and user

Understanding the EU AI Act

Concept of intended purpose

Example:

  • Animal identification AI

 

Image using AI for animal identification

Understanding the EU AI Act

Concept of intended purpose

Example:

  • Animal identification AI
  • Used for life insurance pricing
  • Misuse not recommended and likely ineffective
  • The company that built the system has no obligations under the AI Act
    • Built a no-risk AI system
  • Obligations rest with the deployer

 

Image using AI for life insurance pricing

Understanding the EU AI Act

Concept of intended purpose

Tailoring products to needs

Understanding the EU AI Act

Conformity assessment

 

Providers of high-risk AI systems must conduct and document a "conformity assessment" to ensure safety and compliance; this assessment can be carried out by the provider itself (self-assessment).


Understanding the EU AI Act

Risk management and governance

 

  • Providers need a risk management system for identifying and mitigating risks

 

  • Data governance measures must ensure unbiased, fit-for-purpose training data, considering the impact on various groups (see the sketch below)

 

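A minimal sketch of what such a data governance check could look like in practice, using Python and pandas. The DataFrame, the "age_group" column, and the "label" column are illustrative assumptions, not terms from the AI Act; the idea is simply to inspect how well each group is represented in the training data and whether label rates differ across groups.

```python
# A minimal sketch, assuming a pandas DataFrame `train_df` with a
# hypothetical sensitive attribute column "age_group" and a binary
# label column "label". Column names are illustrative only.
import pandas as pd

def summarize_group_balance(train_df: pd.DataFrame, group_col: str, label_col: str) -> pd.DataFrame:
    """Report how each group is represented in the training data
    and how the positive label rate varies across groups."""
    summary = (
        train_df
        .groupby(group_col)[label_col]
        .agg(n="size", positive_rate="mean")
        .assign(share=lambda d: d["n"] / d["n"].sum())
    )
    return summary

# Example usage with made-up data
train_df = pd.DataFrame({
    "age_group": ["18-30", "18-30", "31-50", "31-50", "51+", "51+"],
    "label":     [1,       0,       1,       1,       0,     0],
})
print(summarize_group_balance(train_df, "age_group", "label"))
```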

Understanding the EU AI Act

Documentation


  • Providers must create detailed documentation and enable record-keeping for compliance and traceability (see the record-keeping sketch after this list)
  • Supply deployers with information on features, proper use, limitations, and potential misuse
  • Ensure human oversight and implement measures for accuracy, robustness, and cybersecurity
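A minimal sketch of automatic record-keeping for traceability, in Python. The `predict_with_logging` helper, the JSONL log file, and the logged fields are illustrative assumptions; the AI Act requires logging capabilities for high-risk systems but does not prescribe this particular schema.

```python
# A minimal record-keeping sketch: each prediction event is appended to a
# local JSONL file with a timestamp, model version, inputs, and output.
# All names here (log path, fields, toy model) are hypothetical.
import json
import uuid
from datetime import datetime, timezone
from pathlib import Path

LOG_PATH = Path("prediction_log.jsonl")  # hypothetical log location

def predict_with_logging(model, features: dict, model_version: str) -> dict:
    """Run a prediction and append a traceable record of the event."""
    prediction = model.predict(features)
    record = {
        "event_id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input": features,
        "output": prediction,
    }
    with LOG_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record

class ThresholdModel:
    """Toy stand-in for a real model, used only to make the sketch runnable."""
    def predict(self, features: dict) -> str:
        return "high_risk" if features.get("score", 0) > 0.5 else "low_risk"

# Example usage
print(predict_with_logging(ThresholdModel(), {"score": 0.7}, model_version="v1.2.0"))
```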
Understanding the EU AI Act

Other obligations


Understanding the EU AI Act

Ensuring compliance


Understanding the EU AI Act

Let's practice!

Understanding the EU AI Act
