High-risk deployer obligations

Understanding the EU AI Act

Dan Nechita

Lead Technical Negotiator, EU AI Act

Obligations for deployers


Deployers and providers share obligations to ensure that AI systems do not pose threats to health, safety, and fundamental rights.


So what are those obligations?

 

  • Deployers must use high-risk AI systems in accordance with the provider's instructions for use
  • Maintain the automated logs generated by the system and monitor it for malfunctions (see the sketch after this list)
  • Report serious incidents to the provider and the relevant authorities
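The logging and monitoring duty is as much operational as legal. Below is a minimal, illustrative Python sketch of how a deployer might wrap an AI system so that every decision is logged automatically and suspected malfunctions are queued for incident review; the class, field names, and low-confidence heuristic are assumptions for illustration, not anything prescribed by the AI Act or by a provider's actual instructions for use.

```python
import json
import logging
from datetime import datetime, timezone

# Hypothetical sketch: names and thresholds are illustrative, not prescribed by the AI Act.
logging.basicConfig(filename="ai_system_audit.log", level=logging.INFO)


class MonitoredDeployment:
    """Wraps an AI system call, keeps automated logs, and flags suspected malfunctions."""

    def __init__(self, model, confidence_floor=0.5):
        self.model = model                      # any callable returning (label, confidence)
        self.confidence_floor = confidence_floor
        self.incidents = []                     # events to escalate to the provider / authorities

    def predict(self, case_id, features):
        label, confidence = self.model(features)
        record = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "case_id": case_id,
            "output": label,
            "confidence": confidence,
        }
        logging.info(json.dumps(record))        # automated log entry for each decision
        if confidence < self.confidence_floor:  # stand-in for whatever malfunction criteria apply
            self.incidents.append(record)       # queued for the deployer's incident-reporting process
        return label
```

In practice the malfunction check would follow the monitoring criteria set out in the provider's instructions for use; the low-confidence threshold here is only a placeholder.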


Notification required

 

  • Deployers must inform workers when AI is used to assign tasks or monitor their performance.
  • Deployers using high-risk AI to make decisions about individuals must inform the people affected.
  • Examples: setting life insurance premiums, deciding admissions to an educational institution.


Public authorities


 

  • Public authorities and deployers of essential services must conduct a fundamental rights impact assessment (FRIA).
  • The FRIA ensures that the AI use does not breach fundamental rights, similar to the GDPR's Data Protection Impact Assessment (DPIA).
  • High-risk AI use by public authorities must also be registered in the EU database (an illustrative record sketch follows this list).
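As an illustration only, the sketch below models the kind of record a deployer might keep while working through these steps. The class and field names are assumptions drawn from the bullet points above, not the official FRIA template or the EU database schema.

```python
from dataclasses import dataclass, field

# Hypothetical structure: field names are illustrative, not the official FRIA template.
@dataclass
class FundamentalRightsImpactAssessment:
    deployer: str
    system_name: str
    intended_use: str
    affected_groups: list[str] = field(default_factory=list)
    rights_at_risk: list[str] = field(default_factory=list)   # e.g. non-discrimination, privacy
    mitigations: list[str] = field(default_factory=list)
    registered_in_eu_database: bool = False

    def outstanding_steps(self) -> list[str]:
        """Return what is still missing before the high-risk use can go ahead."""
        steps = []
        if not self.rights_at_risk:
            steps.append("Identify fundamental rights potentially affected")
        if not self.mitigations:
            steps.append("Define mitigation and oversight measures")
        if not self.registered_in_eu_database:
            steps.append("Register the high-risk use in the EU database")
        return steps
```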

Deployers can become providers!

 

  • Provider and deployer roles can overlap when a deployer substantially modifies an AI system.
  • A deployer that changes an AI system's intended purpose becomes a provider and takes on provider obligations.
  • Example: a company that repurposes GPT-4 to make hiring decisions becomes the provider of a high-risk AI system.


Looking back

[Figure: risk pyramid with four levels]


Thank you!

