EU AI Act Scoping Guide

EU AI Act (Regulation (EU) 2024/1689) Applicability and roles

Classify the system, then classify your role. Both decisions need evidence.

This guide turns Article 2 scope and operator definitions into a practical role matrix for product, legal, procurement, and engineering teams.

Author: Sorena AI
Published: Mar 4, 2026
Updated: Mar 4, 2026
Overview

The most common AI Act failure is not a missed policy. It is a bad scoping record. Teams often know they use AI, but they cannot show which system was assessed, whether the output is used in the Union, who the provider is, or when a distributor or deployer becomes a provider through branding or substantial modification. This page focuses on that first layer of compliance.

Section 1

When the EU AI Act applies

Start with the unit you are assessing. Record the AI system or general purpose AI model, its intended purpose, the user group, the deployment context, and whether the output is intended to be used in the Union. Article 2 is wider than simple place of establishment. A provider or deployer outside the EU can still be in scope when the system output is intended for use in the Union.

Also document the carve outs. The Act excludes AI used only for military, defence, or national security purposes, systems and models developed only for scientific research and development before market placement or service use, and certain free and open source releases. That open source carve out does not protect systems that fall under Article 5, Article 50, or high risk categories.

  • Record whether the system is placed on the market, put into service, or only tested in research settings.
  • Record whether output is used by an EU business process, customer, public body, or downstream integrator.
  • Document why an exclusion applies instead of assuming it does.
  • Set a retest trigger for new use cases, new geographies, major model upgrades, or new user claims.
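The scoping record described above can be sketched as a small data structure with a first-pass screen. This is a minimal illustration, not a legal test: the field names and the `in_scope` logic are this sketch's own vocabulary, chosen to mirror the points listed, and a real assessment needs legal review of Article 2 and the exclusions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ScopingRecord:
    # Illustrative field names; not defined terms from the Regulation.
    system_name: str
    intended_purpose: str
    user_group: str
    placed_on_eu_market: bool        # placed on the market / put into service
    output_used_in_union: bool       # the Article 2 extraterritorial hook
    exclusion_claimed: Optional[str] = None      # e.g. "military", "scientific R&D"
    exclusion_rationale: Optional[str] = None    # document why, never assume
    retest_triggers: list = field(default_factory=list)

def in_scope(rec: ScopingRecord) -> bool:
    """First-pass screen: treat the system as in scope when it is placed
    on the Union market or its output is intended for use in the Union,
    unless an exclusion is both claimed and documented with a rationale."""
    if rec.exclusion_claimed and rec.exclusion_rationale:
        return False
    return rec.placed_on_eu_market or rec.output_used_in_union
```

Note that a claimed exclusion without a written rationale is ignored, which enforces the bullet above: document why an exclusion applies instead of assuming it does.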
Section 2

How to assign operator roles

Assign roles per system and per use case. You may be a deployer for one internal workflow, a provider for a customer facing feature, and an importer or distributor for a third party system you place on the Union market. The role drives the evidence you must hold and the obligations you must perform.

The provider role is not limited to the original developer. Under the Act, a distributor, importer, deployer, or another third party can become the provider if it puts its name or trademark on a high risk AI system, makes a substantial modification, or changes the intended purpose so that a previously non high risk system becomes high risk.

  • Provider: technical documentation, conformity work, quality management, post market monitoring, and system level instructions.
  • Deployer: use according to instructions, assign human oversight, keep logs under your control, perform a fundamental rights impact assessment (FRIA) where required, and inform affected persons in relevant Annex III decisions.
  • Importer and distributor: traceability, cooperation, and corrective action support.
  • Authorised representative: mandatory for certain non EU providers of high risk systems and general purpose AI (GPAI) providers established outside the Union.
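The role-shift rule described above can be expressed as a simple predicate. The three trigger names below are assumptions of this sketch, chosen to mirror the conditions listed, not wording from the Act, and the legal analysis of what counts as a substantial modification still needs counsel.

```python
def becomes_provider(puts_own_trademark: bool,
                     substantial_modification: bool,
                     purpose_change_makes_high_risk: bool) -> bool:
    """A distributor, importer, deployer, or other third party is treated
    as the provider of a high risk system when any one trigger fires."""
    return (puts_own_trademark
            or substantial_modification
            or purpose_change_makes_high_risk)

# Example: white labelling a third party system under your own brand
# is enough on its own to shift the provider role to you.
```

Because the triggers are disjunctive, a role matrix should record each one separately per system and per use case, so a single branding decision does not silently change your obligations.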
Section 3

Role changes that teams often miss

The role can change after launch. A fine tune that materially alters the intended purpose, a white label arrangement, or a substantial modification can shift responsibility. If your commercial model allows reselling or downstream customization, your contract and evidence model must anticipate that role shift.

Product manufacturers also need attention. Where an AI system is a safety component of a product covered by Union harmonisation legislation, the product manufacturer may carry provider obligations for the embedded AI component.

  • Check whether your branding on a third party system makes you the provider in the Union chain.
  • Check whether post deployment learning or customer specific tuning changes the intended purpose.
  • Check whether a connected product route under Annex I changes who owns conformity work.
  • Check whether your procurement process captures the upstream model provider and the downstream system provider separately.
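The four checks above can be folded into a single reassessment gate. The event labels below are this sketch's own shorthand for the changes listed, not terms from the Act; a real register would map each label to an owner and an approver.

```python
# Change events that should force a fresh role assessment; labels are
# illustrative shorthand for the checks listed above.
REASSESSMENT_EVENTS = {
    "own_branding_applied",       # your trademark on a third party system
    "intended_purpose_changed",   # post deployment learning or customer tuning
    "substantial_modification",
    "annex_i_product_route",      # AI becomes a safety component of a product
}

def needs_role_reassessment(observed_events: set) -> bool:
    """True when any observed change event intersects the trigger set."""
    return bool(observed_events & REASSESSMENT_EVENTS)
```

Wiring a gate like this into release or procurement approvals is one way to catch the post-launch role shifts this section describes before they reach the market.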
Section 4

Minimum evidence pack for applicability and roles

A good scoping file is short, versioned, and attributable. It should show why the Act applies or does not apply, which exclusions were considered, which role each operator holds, and what event would force reassessment.

This file is also the backbone for procurement, sales diligence, internal approvals, and regulator questions. Without it, later work on transparency, high risk controls, or GPAI obligations will drift.

  • AI register entry with intended purpose, geography, user group, owner, and last review date.
  • Role matrix naming provider, deployer, importer, distributor, and authorised representative where relevant.
  • Substantial modification and reclassification triggers with approvers.
  • Supplier clauses covering documentation, incident notice, version changes, and cooperation with authority requests.
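The four artifacts above can be enforced as a completeness check plus a staleness check on the review date. The keys and the 365 day default are assumptions of this sketch, not regulatory terms or statutory periods.

```python
from datetime import date, timedelta

# The minimum evidence pack listed above, as a completeness check.
# Keys are this sketch's shorthand, not regulatory terms.
REQUIRED_ARTIFACTS = {
    "register_entry",             # purpose, geography, user group, owner
    "role_matrix",                # provider, deployer, importer, distributor
    "reclassification_triggers",  # substantial modification triggers + approvers
    "supplier_clauses",           # documentation, incident notice, cooperation
}

def evidence_gaps(pack_contents: set) -> set:
    """Return the minimum artifacts still missing from a scoping file."""
    return REQUIRED_ARTIFACTS - pack_contents

def review_overdue(last_review: date, today: date,
                   max_age_days: int = 365) -> bool:
    """Flag a file whose last review falls outside the chosen window;
    the 365 day default is an internal policy choice, not a legal one."""
    return (today - last_review) > timedelta(days=max_age_days)
```

A check like this keeps the scoping file short, versioned, and attributable, and gives procurement and sales diligence a concrete pass or fail rather than a judgment call.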
