EU AI Act Chapter V

EU AI Act (Regulation (EU) 2024/1689) GPAI obligations

Chapter V is model-level law. Treat it as a production workflow, not as a one-off filing task.

This guide explains which providers are in scope, what they must maintain, and how systemic risk changes the operating model.

Author
Sorena AI
Published
Mar 4, 2026
Updated
Mar 4, 2026
Overview

The AI Act uses the term general-purpose AI (GPAI) model; in practice, this is where many foundation model discussions land. The obligations are not limited to a single document. They require a standing workflow for documentation, downstream information, copyright compliance, publication, and regulator cooperation.

Section 1

When GPAI obligations apply

Chapter V obligations apply from 2 August 2025. If a provider placed a GPAI model on the market before 2 August 2025, Article 111 gives that provider until 2 August 2027 to take the necessary steps to comply. The obligations still exist from 2 August 2025 for providers placing models on the market from that date onward.

The Commission guidance in the local source pack makes clear that the absence of immediate enforcement fines in the first year does not suspend the legal applicability of Chapter V.

  • 2 August 2025: Chapter V applies.
  • 2 August 2026: Commission enforcement powers on GPAI fines begin.
  • 2 August 2027: legacy GPAI models placed before 2 August 2025 must be compliant.
  • Use a dated model registry so you can prove which transition rule applies.
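The dated registry above can be sketched as a simple routing function. The two dates are from the Act itself; the function and field names are illustrative assumptions, not an official tool.

```python
from datetime import date

# Dates are from the Act (Chapter V application and the Article 111
# legacy deadline); everything else here is an illustrative assumption.
CHAPTER_V_APPLIES = date(2025, 8, 2)
LEGACY_DEADLINE = date(2027, 8, 2)

def transition_rule(placed_on_market: date) -> str:
    """Route a registry entry to the applicable transition rule."""
    if placed_on_market < CHAPTER_V_APPLIES:
        return f"legacy model: must comply by {LEGACY_DEADLINE.isoformat()}"
    return "in scope from placement on the market"

print(transition_rule(date(2024, 11, 1)))   # legacy route
print(transition_rule(date(2025, 9, 15)))   # in scope on placement
```

The point of the registry is the evidence: a recorded placed-on-market date is what proves which branch applies.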
Section 2

Core Article 53 deliverables

Article 53 requires four pillars. First, technical documentation for the model and its training, testing, and evaluation process. Second, downstream information and documentation that lets AI system providers understand capabilities and limitations and comply with their own obligations. Third, a copyright policy that addresses Union copyright and related rights, including reservation of rights under the text and data mining rules. Fourth, a sufficiently detailed public summary of the content used for training, using the AI Office template.

These are living artifacts. They have to stay current as the model, training approach, or release strategy changes.

  • Maintain an internal model documentation workflow tied to releases.
  • Prepare downstream integration documentation that is usable by system providers.
  • Maintain and review the copyright policy against data sourcing practice.
  • Publish and update the public summary of training content using the official template and explanatory notice.
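Keeping the four Article 53 pillars current per release can be tracked as a small record. The pillars come from Article 53; the record shape, field names, and file paths are assumptions for illustration only.

```python
from dataclasses import dataclass

# The four pillars are Article 53; the record layout is an assumption.
@dataclass
class Article53Record:
    release: str
    technical_documentation: str    # pillar 1: model, training, testing, evaluation docs
    downstream_information: str     # pillar 2: integrator-facing documentation
    copyright_policy: str           # pillar 3: Union copyright / TDM reservation policy
    training_content_summary: str   # pillar 4: public summary on the AI Office template

def missing_pillars(rec: Article53Record) -> list[str]:
    """Flag any pillar without a current artifact reference."""
    return [name for name, value in vars(rec).items()
            if name != "release" and not value]

rec = Article53Record("v2.1", "docs/tech-v2.1.md", "",
                      "policies/copyright.md", "public/summary-v2.1.md")
print(missing_pillars(rec))  # ['downstream_information']
```

Running a check like this per release is one way to treat the deliverables as living artifacts rather than one-off filings.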
Section 3

Open source and authorised representative points

Article 53 contains a limited exception for providers releasing models under a free and open source licence where the parameters, architecture information, and model usage information are made publicly available. That exception does not apply to GPAI models with systemic risk.

Article 54 requires providers established outside the Union to appoint an authorised representative in the Union before placing a GPAI model on the Union market, unless the open source exception applies and the model does not present systemic risk.

  • Do not assume open source status removes all GPAI duties.
  • Check whether the model could still be systemic risk.
  • Check whether a Union authorised representative is required.
  • Keep the mandate and contact details available for the AI Office and national authorities.
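The Article 53 and Article 54 routing above can be sketched as two predicates. The conditions mirror the text of the guide; the boolean inputs are assumptions about how a team records these facts, and this is not legal advice.

```python
def open_source_exception(open_licence: bool, artefacts_public: bool,
                          systemic_risk: bool) -> bool:
    """Article 53 exception: free and open source licence, with parameters,
    architecture, and usage information public; never for systemic risk."""
    return open_licence and artefacts_public and not systemic_risk

def needs_authorised_representative(established_in_union: bool,
                                    open_licence: bool,
                                    artefacts_public: bool,
                                    systemic_risk: bool) -> bool:
    """Article 54: non-Union providers need a Union authorised representative,
    unless the open source exception applies and there is no systemic risk."""
    if established_in_union:
        return False
    return not open_source_exception(open_licence, artefacts_public, systemic_risk)

print(needs_authorised_representative(False, True, True, False))  # exception applies
print(needs_authorised_representative(False, True, True, True))   # systemic risk: required
```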
Section 4

Systemic risk obligations and incident handling

A GPAI model is presumed to have high impact capabilities when its cumulative training compute exceeds the Article 51 threshold of 10^25 floating point operations. Once that threshold is met, or it becomes known that it will be met, the provider must notify the Commission without delay and in any event within two weeks. The Commission can also designate a model as presenting systemic risk by its own decision.

Systemic risk providers need stronger safety, security, and serious incident readiness under Article 55. The local source pack also includes the Commission reporting template and the code of practice materials that help structure this work.

  • Track compute and maintain evidence for threshold analysis.
  • Prepare the Article 52 notification path and responsible contacts.
  • Define what counts as a serious incident and how evidence is captured.
  • Run drills for exfiltration, major safety event, and severe downstream harm scenarios.
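Tracking compute against the Article 51 presumption can be sketched as below. The 10^25 FLOP threshold is from the Act; how a team estimates and logs per-run compute is an assumption for illustration.

```python
# Article 51 presumption threshold; run records are illustrative assumptions.
ARTICLE_51_THRESHOLD_FLOP = 1e25

def cumulative_compute(training_runs: list[dict]) -> float:
    """Sum estimated FLOP across all training runs kept as evidence."""
    return sum(run["estimated_flop"] for run in training_runs)

def threshold_status(training_runs: list[dict]) -> str:
    total = cumulative_compute(training_runs)
    if total >= ARTICLE_51_THRESHOLD_FLOP:
        return "presumed high impact: notify the Commission within two weeks"
    return f"below threshold ({total:.2e} FLOP): keep evidence current"

runs = [
    {"run_id": "pretrain-1", "estimated_flop": 6.0e24},
    {"run_id": "finetune-1", "estimated_flop": 5.0e23},
]
print(threshold_status(runs))  # below threshold (6.50e+24 FLOP): keep evidence current
```

Because the duty triggers when the threshold is met or known to be met, the check belongs in training planning, not only in post-hoc review.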
Section 5

What downstream integrators should demand

Even if you are not the GPAI provider, you need upstream material that supports your own compliance. Integrators should not accept a black box relationship where capabilities, limitations, and incident obligations are unknown.

The downstream package should be usable by product, legal, security, and procurement teams, not only by machine learning engineers.

  • Versioned technical and downstream documentation extracts.
  • Known limitations, evaluation coverage, and unsafe use boundaries.
  • Change notices for model upgrades and major policy changes.
  • Incident notice commitments and escalation contacts.
  • Training content summary and copyright policy references where required.

