EU AI Act (Regulation (EU) 2024/1689) Applicability test

Run one intake test, then route the work to the right compliance track.

This workflow is designed for product intake, change review, procurement, and launch governance.

Author: Sorena AI
Published: Mar 4, 2026
Updated: Mar 4, 2026
Overview

Treat applicability as a repeatable intake control, not an ad hoc legal memo. The test below is designed so that product, legal, security, procurement, and model teams can reach a documented answer using the same structure every time a new AI feature or model enters the portfolio.

Section 1

Step 1: define the object you are testing

Assess each AI system and each general purpose AI model separately. A single product can contain multiple AI systems, multiple model suppliers, and multiple intended purposes. If you test the product at too high a level, you will miss obligations that attach only to one component.

Capture intended purpose, target users, decision impact, deployment channel, and whether the output is intended to be used in the Union. That final point matters even when the provider or deployer sits outside the EU.

  • Name the system or model version under review.
  • Record the intended purpose in one sentence.
  • Record the business process or customer decision the output influences.
  • Record upstream model dependencies and any downstream resellers or integrators.
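The intake fields above can be captured as one structured record per system or model, so every review starts from the same shape. A minimal sketch in Python; the field names and the example values are illustrative, not drawn from the Act:

```python
from dataclasses import dataclass, field

@dataclass
class IntakeRecord:
    """One record per AI system or GPAI model version under review."""
    name: str                       # system or model version under review
    intended_purpose: str           # one-sentence intended purpose
    decision_influenced: str        # business process or decision the output affects
    target_users: list = field(default_factory=list)
    deployment_channel: str = ""
    output_used_in_union: bool = False  # can bring non-EU providers/deployers into scope
    upstream_models: list = field(default_factory=list)      # third party model dependencies
    downstream_parties: list = field(default_factory=list)   # resellers, integrators

# Hypothetical example system for illustration only.
record = IntakeRecord(
    name="support-copilot v2.1",
    intended_purpose="Draft first-line replies to customer support tickets.",
    decision_influenced="Which refund and escalation options agents offer.",
    output_used_in_union=True,
)
```

Testing each component at this granularity is what keeps per-component obligations from being missed when a product bundles several systems.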
Section 2

Step 2: test exclusions before you classify obligations

Check whether the system falls outside the Act because it is used only for military, defence, or national security purposes, or because it is still in pure research and development before being placed on the market or put into service. Document the reason and the evidence. Exclusions should be provable, not assumed.

Check the free and open source position carefully. The broad open source carve out does not apply where the system is high risk or where Article 5 or Article 50 duties are triggered. Teams often over rely on open source language and miss downstream obligations.

  • Document the exclusion and the article basis.
  • Identify what event would end the exclusion, such as launch to customers or internal production use.
  • Flag any open source system that could still trigger prohibited practice, transparency, or high risk obligations.
  • Route mixed purpose systems for manual review where civilian and excluded uses overlap.
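The exclusion screen can be expressed as a small gate that forces the reviewer to record both the claimed basis and the event that would end it. A hedged sketch; the category keys and basis strings are placeholders for your own legal mapping, not authoritative citations:

```python
def screen_exclusion(purpose_category: str, on_market: bool) -> dict:
    """Return a provable exclusion claim, or route to full classification.

    Category names and basis strings are illustrative placeholders;
    confirm the exact article basis with counsel before relying on them.
    """
    exclusions = {
        "military_defence_national_security": "Article 2 (military, defence, national security)",
        "pure_research_pre_market": "Article 2 (pre-market research and development)",
    }
    if purpose_category == "pure_research_pre_market" and on_market:
        # Placing on the market or putting into service ends the R&D exclusion.
        return {"excluded": False,
                "reason": "R&D exclusion ended by market placement"}
    basis = exclusions.get(purpose_category)
    if basis:
        return {"excluded": True, "basis": basis,
                "ending_event": "launch to customers or internal production use"}
    return {"excluded": False,
            "reason": "no exclusion applies; classify obligations"}
```

The explicit `ending_event` field mirrors the checklist above: an exclusion is only useful if you know what would invalidate it.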
Section 3

Step 3: assign the operator role

Decide whether you are acting as provider, deployer, importer, distributor, authorised representative, or product manufacturer for the specific system. You can hold more than one role in the same supply chain. The provider role can shift if you rebrand, substantially modify, or change intended purpose.

If the system will be placed on the Union market by a provider outside the EU, check whether an authorised representative is required. If the system uses a third party model, split model provider obligations from system provider obligations.

  • Assign one accountable owner for each role decision.
  • Attach the procurement owner where a third party supplier is involved.
  • Record whether branding, white labelling, or substantial modification could shift provider status.
  • Record whether the product safety route under Annex I applies.
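Because one organisation can hold several roles in the same supply chain, role assignment works better as a set than as a single label. A sketch under that assumption; the activity keywords are invented for illustration:

```python
ROLES = {"provider", "deployer", "importer", "distributor",
         "authorised_representative", "product_manufacturer"}

def assign_roles(activities: set) -> set:
    """Map concrete supply-chain activities to operator roles.

    The activity keywords are illustrative; one organisation can hold
    several roles for the same system at once.
    """
    mapping = {
        "develops_or_rebrands": "provider",
        "substantially_modifies": "provider",  # modification can shift provider status
        "uses_under_own_authority": "deployer",
        "first_places_on_eu_market_from_third_country": "importer",
        "makes_available_in_supply_chain": "distributor",
    }
    roles = {mapping[a] for a in activities if a in mapping}
    assert roles <= ROLES
    return roles
```

A vendor that white-labels a third party model and also runs it internally would come back as both provider and deployer, which is exactly the case the checklist asks you to record.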
Section 4

Step 4: route to the right obligation layer

Check prohibited practices first. If Article 5 is triggered, the right answer is stop, redesign, or narrow the use case before launch. If not prohibited, test high risk status under Article 6 and Annex III or the product safety route. Then check transparency under Article 50 and model level duties under Chapter V for GPAI.

Do not treat these as mutually exclusive. A high risk system can also trigger transparency duties. A deployer using a third party GPAI model can still need upstream documentation to satisfy its own system level obligations.

  • Article 5 screen complete and signed off.
  • Article 6 and Annex III analysis complete, including any derogation analysis and profiling check.
  • Article 50 disclosure triggers and machine readable marking triggers mapped to product surfaces.
  • Chapter V check complete for any GPAI model provider or systemic risk scenario.
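The screening order above, with its one hard stop and otherwise cumulative layers, can be sketched as a routing function. The flag names are illustrative; the point the code makes is that after the Article 5 gate, the remaining layers accumulate rather than exclude each other:

```python
def route_obligations(flags: dict) -> list:
    """Return every obligation layer that applies, in screening order.

    Layers are cumulative, not mutually exclusive: a high risk system can
    also carry Article 50 transparency duties. Flag names are illustrative.
    """
    if flags.get("article_5_prohibited"):
        # Hard stop: redesign or narrow the use case before launch.
        return ["prohibited: stop or redesign before launch"]
    layers = []
    if flags.get("annex_iii_or_product_safety_high_risk"):
        layers.append("high risk (Article 6 / Annex III)")
    if flags.get("article_50_transparency_trigger"):
        layers.append("transparency (Article 50)")
    if flags.get("gpai_model_provider"):
        layers.append("GPAI model duties (Chapter V)")
    return layers or ["no obligation layer identified; document the negative result"]
```

Note the final branch: a negative result is still a result, and recording it is what makes the intake test auditable.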
Section 5

Step 5: produce the minimum decision record

The output of the test should be a one page decision record with links to evidence, not a vague conclusion. It should show the system or model, article triggers considered, outcome, owner, date, sources used, and the next review trigger.

This record should be stored where release, procurement, and governance teams can actually find it. A hidden spreadsheet is not an effective compliance control.

  • Outcome field: out of scope, prohibited, high risk, transparency only, GPAI provider, or mixed case.
  • Evidence links: architecture note, supplier docs, policy approvals, and release gate outcome.
  • Review triggers: new geography, new model, new intended purpose, substantial modification, or major incident.
  • Approvals: product owner, legal or compliance reviewer, and security or risk reviewer where required.
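The one-page record can be generated from a template so every decision carries the same fields. A sketch only, assuming the closed outcome set listed above; the layout is not a prescribed legal format:

```python
from datetime import date

VALID_OUTCOMES = {"out of scope", "prohibited", "high risk",
                  "transparency only", "GPAI provider", "mixed case"}

def decision_record(system: str, outcome: str, owner: str,
                    evidence_links: list, review_triggers: list) -> str:
    """Render the one-page decision record as plain text.

    Enforces the closed outcome set so conclusions stay comparable
    across reviews; the field layout itself is illustrative.
    """
    assert outcome in VALID_OUTCOMES, f"unknown outcome: {outcome}"
    lines = [
        f"System/model: {system}",
        f"Outcome: {outcome}",
        f"Owner: {owner}",
        f"Date: {date.today().isoformat()}",
        "Evidence: " + "; ".join(evidence_links),
        "Review triggers: " + "; ".join(review_triggers),
    ]
    return "\n".join(lines)
```

Emitting plain text (or a wiki page) rather than a spreadsheet row keeps the record findable by release, procurement, and governance teams, which the section above identifies as the real control.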
Recommended next step

Turn this applicability test into an operational assessment

Assessment Autopilot can turn this EU AI Act (Regulation (EU) 2024/1689) applicability test from a one-off scoping decision into a reusable workflow inside Sorena. Teams working on the EU AI Act can keep owners, evidence, and next steps aligned without copying this guide into separate documents.


Related guides


EU AI Act Applicability and Roles | Provider, Deployer, Importer Guide
Determine whether the EU AI Act applies, when output used in the Union brings a system into scope, and how to assign provider, deployer, importer.
EU AI Act Checklist | Practical Compliance Checklist by Obligation
Use a detailed EU AI Act checklist covering inventory, role mapping, Article 5 screening, high risk controls, Article 50 disclosures, GPAI evidence, logging.
EU AI Act Compliance Program | Build an Operational AI Act Program
Build an EU AI Act compliance program that covers inventory, governance, AI literacy, prohibited practice gates, high risk controls, Article 50 product work.
EU AI Act Deadlines and Compliance Calendar | Exact Dates and Workplan
Track the exact EU AI Act dates, including entry into force on 1 August 2024, early obligations from 2 February 2025, GPAI obligations from 2 August 2025.
EU AI Act FAQ | Dates, High Risk, GPAI, Transparency, and Penalties
Get grounded answers to common EU AI Act questions on application dates, high risk status, provider versus deployer roles, transparency.
EU AI Act GPAI and Foundation Model Obligations | Chapter V Guide
Understand EU AI Act obligations for general purpose AI model providers, including Article 53 documentation, copyright policy.
EU AI Act High Risk AI Use Cases by Industry | Annex III and Product Routes
See how EU AI Act high risk status appears across biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration.
EU AI Act High Risk Requirements Checklist | Articles 9 to 15 and Beyond
Use a detailed high risk AI checklist covering Article 9 risk management, Article 10 data governance, Annex IV technical documentation, logging, instructions.
EU AI Act Penalties and Fines | Article 99 and GPAI Fine Exposure
Understand EU AI Act penalty tiers, including Article 5 fines up to EUR 35,000,000 or 7 percent.
EU AI Act Prohibited AI Practices | Article 5 Screening Guide
Screen AI systems against EU AI Act Article 5 prohibited practices, including manipulative and deceptive techniques, exploitation of vulnerabilities.
EU AI Act Requirements | Prohibited, High Risk, Transparency, and GPAI
Get a grounded overview of EU AI Act requirements across Article 5 prohibited practices, Article 6 and Annex III high risk systems.
EU AI Act Timeline and Phasing Roadmap | Practical Implementation Roadmap
Follow a practical EU AI Act roadmap that aligns workstreams to the phased application dates for prohibited practices, AI literacy, GPAI obligations.
EU AI Act Transparency, Labeling, and User Disclosures | Article 50 Guide
Implement EU AI Act Article 50 transparency duties for direct interaction notices, machine readable marking of synthetic outputs, deepfake disclosures.
EU AI Act vs ISO 42001 | What ISO 42001 Covers and What It Does Not
Compare the EU AI Act with ISO/IEC 42001:2023. Learn where ISO 42001 helps with AI policy, roles, risk assessment, impact assessment, documented information.
EU AI Act vs NIST AI RMF | How to Use AI RMF Without Missing AI Act Duties
Compare the EU AI Act with NIST AI RMF 1.0. Learn how the voluntary NIST AI RMF functions Govern, Map, Measure.