
EU Digital Services Act (DSA) Requirements

A layered obligations map you can translate into workstreams, owners and evidence.

Built around how the DSA is structured: intermediary -> hosting -> platform -> marketplace -> VLOP/VLOSE.

Author: Sorena AI
Published: Feb 21, 2026
Updated: Feb 21, 2026
Sections: 6
Primary sources: 2

Overview

The DSA's most important design feature is its layered obligation model: you inherit additional obligations as your service classification moves from intermediary services to hosting services to online platforms, with special rules for marketplaces and a systemic-risk tier for VLOPs/VLOSEs. Use this page as the backbone for your requirements matrix and your compliance roadmap.

Section 1

Layer 1 - Baseline obligations for intermediary services (start here)

Baseline DSA requirements focus on operational accessibility, transparency and enforceability: clear terms and conditions, points of contact, and (where applicable) a legal representative in the Union.

These are often "policy + product surface" tasks that require legal, UX and engineering alignment.

  • Single point of contact for recipients (Article 12): user-friendly electronic communications that do not rely solely on automated tools.
  • Legal representative (Article 13): required if you're not established in the EU but offer services in the EU (publish contact details).
  • Terms and conditions transparency (Article 14): disclose content moderation policies/procedures/tools (including algorithmic decision-making + human review) in clear language and machine-readable format.
  • Transparency reporting baseline (Article 15): annual reports on content moderation, including orders, notices, complaints, automated moderation use, and accuracy/error indicators (with micro/small exclusions).
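The Article 15 counters above map naturally onto a moderation action log. A minimal aggregation sketch, assuming hypothetical field names (`source`, `automated`, `overturned`) rather than any official reporting schema:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ModerationAction:
    source: str       # e.g. "authority_order", "art16_notice", "own_initiative"
    automated: bool   # was the decision taken by automated means?
    overturned: bool  # later reversed on complaint (an accuracy/error signal)

def article15_metrics(actions: list[ModerationAction]) -> dict:
    """Aggregate annual-report counters from a moderation action log."""
    total = len(actions)
    return {
        "total_actions": total,
        "by_source": dict(Counter(a.source for a in actions)),
        "automated_share": sum(a.automated for a in actions) / total if total else 0.0,
        "error_rate": sum(a.overturned for a in actions) / total if total else 0.0,
    }
```

The point of sketching it this early: if every moderation decision is logged with its source and automation flag from day one, the annual report becomes a query rather than a reconstruction project.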
Section 2

Layer 2 - Hosting services (illegal content handling + decision explainability)

If you host information provided by recipients, you must implement notice & action mechanisms and explain key restriction decisions.

This layer is where most "trust & safety engineering" work begins.

  • Notice & action mechanisms (Article 16): electronic, user-friendly intake; notices must allow a diligent host to identify illegality without detailed legal examination (when sufficiently precise/substantiated).
  • Confirmation + outcome notification to notifiers (Article 16(4)-(6)): receipt confirmation and decision notifications, including redress options.
  • Statement of reasons (Article 17): provide affected recipients with clear, specific reasons for restrictions (removal, demotion, account actions), including legal/contractual grounds and automation use.
  • Criminal offence suspicions (Article 18): promptly inform law enforcement/judicial authorities for certain serious threats.
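A statement-of-reasons record can be modeled directly on the Article 17 elements listed above. A sketch with illustrative field names (not an official schema), including a completeness check useful for QA before a statement is sent:

```python
from dataclasses import dataclass, field

REQUIRED_GROUNDS = ("legal", "contractual")

@dataclass
class StatementOfReasons:
    """One Article 17 statement; field names are illustrative."""
    restriction: str           # e.g. "removal", "demotion", "account_suspension"
    facts: str                 # facts and circumstances relied on
    ground_type: str           # "legal" (illegal content) or "contractual" (T&C breach)
    ground_reference: str      # the legal provision or T&C clause invoked
    automated_detection: bool  # was the content detected by automated means?
    automated_decision: bool   # was the decision taken by automated means?
    redress: list[str] = field(default_factory=list)  # e.g. internal complaint, ODS, court

    def validate(self) -> list[str]:
        """Return missing-element problems (empty list = complete)."""
        problems = []
        if not self.facts:
            problems.append("missing facts and circumstances")
        if self.ground_type not in REQUIRED_GROUNDS:
            problems.append("ground_type must be 'legal' or 'contractual'")
        if not self.ground_reference:
            problems.append("missing legal/contractual reference")
        if not self.redress:
            problems.append("missing redress options")
        return problems
```

Validating statements at creation time, rather than at audit time, is what makes the same record reusable for user notification, transparency reporting, and complaint handling.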
Section 3

Layer 3 - Online platforms (user redress, interface integrity, additional transparency)

Online platform status adds stronger recipient rights: complaints and dispute settlement pathways, protections against misuse, and expanded transparency and interface obligations.

These requirements typically span policy operations, product UI/UX, and logging/reporting infrastructure.

  • Platform transparency reporting (Article 24): add dispute settlement volumes/outcomes and suspension metrics; publish average monthly active recipient (AMAR) figures at least every six months; submit statements of reasons for inclusion in the Commission database (Article 24(5)).
  • Online interface integrity (Article 25): do not design interfaces to deceive/manipulate recipients or materially impair free and informed decisions (anti-dark-pattern duty).
  • Advertising transparency (Article 26): per-ad disclosure that content is an ad, who benefits, who paid (if different), and meaningful targeting parameters plus how to change them; restrict certain profiling uses.
  • Recommender transparency (Article 27): disclose main parameters and user options to modify/influence them; provide a directly accessible option selector where multiple ranking options exist.
  • Online protection of minors (Article 28): appropriate and proportionate measures for privacy, safety and security; no profiling-based ads presented to minors, with no obligation to process additional personal data solely to assess whether a recipient is a minor.
Section 4

Layer 4 - Online marketplaces (distance contracts with traders)

Marketplaces have dedicated consumer-protection obligations focused on trader traceability and compliance-by-design interfaces.

This layer is operationally heavy: KYC-like trader onboarding, evidence retention, and suspension workflows.

  • Trader traceability (Article 30): collect trader identity, registers, payment account details, and self-certification; make best efforts to assess reliability/completeness; store securely and delete after the retention period.
  • Compliance-by-design interface (Article 31): enable traders to provide required pre-contractual, compliance and product safety information; best-effort checks and random checks for illegality in official databases.
  • Right to information (Article 32): when aware of illegal products/services, inform impacted consumers (or publish accessible info) and provide redress paths.
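Trader-data retention is a recurring job, not a one-off task. A sketch of a purge check, assuming a roughly six-month retention window after the contractual relationship ends (confirm the exact period against Article 30 and your legal advice; the function and field names are hypothetical):

```python
from datetime import date, timedelta

RETENTION_DAYS = 183  # ~six months; confirm against Article 30 before relying on it

def purge_due(trader_records: dict[str, date], today: date) -> list[str]:
    """Return trader IDs whose retained KYC data is past the retention window.

    trader_records maps trader_id -> date the contractual relationship ended.
    Deletion itself (and its audit trail) is left to the caller.
    """
    cutoff = today - timedelta(days=RETENTION_DAYS)
    return [tid for tid, ended in trader_records.items() if ended <= cutoff]
```

Running this on a schedule, with the deletions logged, gives you both the "delete after the retention period" control and the evidence that it operates.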
Section 5

Layer 5 - VLOPs/VLOSEs (systemic-risk management, audits and enhanced transparency)

VLOP/VLOSE obligations apply after Commission designation under Article 33 and are the highest bar: systemic risk assessments, risk mitigation measures, independent audits, additional ad transparency, and more frequent reporting.

If you could be near the threshold, build the capabilities early: reporting pipelines, risk management governance, and audit evidence.

  • Designation threshold (Article 33): average monthly active recipients (AMAR) in the Union of at least 45 million plus a Commission designation decision; obligations apply (or cease) four months after the provider is notified.
  • Systemic risk assessment (Article 34): identify/analyse systemic risks stemming from design/functioning/use (illegal content, fundamental rights, civic discourse/elections, public security, minors and well-being), at least annually and before major feature deployments.
  • Independent audits (Article 37): annual independent audit of compliance and (where applicable) code-of-conduct commitments; publish/transmit audit reports and implementation reports under transparency reporting rules.
  • Recommender systems (Article 38): provide at least one option per recommender system that is not based on profiling (for VLOPs/VLOSEs that use recommenders).
  • Ad repository (Article 39): provide a searchable repository and API access for ads, while excluding personal data of recipients and keeping information accurate/complete.
  • Enhanced transparency reporting (Article 42): publish Article 15 reports within the specified timeframe and at least every six months; include language breakdown of moderation resources and publish risk assessment, mitigation and audit materials (with confidentiality carve-outs).
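The four-month lead-in after a designation notification is worth putting on a calendar immediately. A planning sketch (the exact day-counting convention for "four months" is a legal question; treat this as a scheduling aid, not a deadline calculation):

```python
import calendar
from datetime import date

def obligations_start(notification: date) -> date:
    """Approximate the date VLOP/VLOSE obligations begin: four calendar
    months after notification of the designation decision (Article 33)."""
    month = notification.month + 4
    year = notification.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    # clamp the day for shorter target months (e.g. Oct 31 -> Feb 28)
    day = min(notification.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```

Work backwards from that date for the first risk assessment, audit engagement, and ad repository launch; four months is short for all three.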
Section 6

How to turn requirements into an implementation plan (fast path)

A good DSA requirements matrix is a work plan: each requirement has an owner, control design, acceptance criteria, evidence, and a reporting cadence.

Use these steps to avoid "policy-only" compliance that fails under enforcement scrutiny.

  • Create a requirements matrix: Article -> obligation -> product/control -> owner -> evidence -> reporting cadence.
  • Build a statement-of-reasons pipeline first: it improves compliance, transparency reporting, and user redress workflows at once.
  • Implement notice intake + triage with SLA metrics and audit logs (include automation use disclosures).
  • Design transparency reporting as data engineering: define metrics, data sources, QA controls, and sign-off workflow.
  • If VLOP/VLOSE: build an annual risk assessment and audit calendar and reserve time for remediation and publication.
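The matrix described in the steps above is easy to hold as structured data, which also makes "policy-only" gaps detectable automatically. A minimal sketch (column names follow the matrix suggested here; nothing official):

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    article: str     # e.g. "Art 17"
    obligation: str  # short obligation statement
    control: str     # product/control implementing it
    owner: str       # accountable team or person
    evidence: str    # where the evidence lives
    cadence: str     # reporting cadence, e.g. "annual", "6-monthly"

def coverage_gaps(matrix: list[Requirement]) -> list[str]:
    """Flag rows missing an owner or evidence source -- the classic
    'policy exists, nobody operates it' failure mode."""
    return [r.article for r in matrix if not r.owner or not r.evidence]
```

Run the gap check in CI or a recurring review so new obligations cannot enter the matrix without an owner and an evidence source.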
Recommended next step

Turn EU Digital Services Act (DSA) Requirements into an operational assessment

Assessment Autopilot can turn this requirements guide into assigned actions and a reusable workflow inside Sorena. Teams working on DSA compliance can keep owners, evidence, and next steps aligned without copying this guide into separate documents.

Related guides

DSA Ads & Recommender Systems | Article 26, 27, 38 & 39 Compliance
A deep compliance guide for DSA advertising and recommender system obligations: ad transparency (Article 26), recommender system transparency (Article 27), the non-profiling option (Article 38), and the ad repository (Article 39).
DSA Applicability Test | Is the EU Digital Services Act Applicable to You?
A step-by-step applicability test for the EU Digital Services Act (DSA, Regulation (EU) 2022/2065): EU offering triggers.
DSA Enforcement & Investigations | DSCs, Commission Powers, Audits & Procedures
A practical guide to DSA enforcement (Regulation (EU) 2022/2065): how Digital Services Coordinators (DSCs) supervise services.
DSA Notice & Action Workflow | Article 16 Requirements + Templates
A deep implementation guide for DSA notice & action (Regulation (EU) 2022/2065, Article 16): intake design, required notice elements.
DSA Penalties & Fines | Digital Services Act Enforcement Exposure (6% / 1% / 5%)
How DSA penalties work under Regulation (EU) 2022/2065.
DSA Transparency Report Template | Article 15 + Article 24 + VLOP Article 42
Copy and paste ready DSA transparency report template aligned to Regulation (EU) 2022/2065 and Implementing Regulation (EU) 2024/2835.
DSA Transparency Reporting | Articles 15, 24 & 42 Reporting Requirements
A practical guide to EU Digital Services Act transparency reporting: what to publish for Article 15, what to add for Article 24.
DSA vs DMA | Digital Services Act vs Digital Markets Act (What's the Difference?)
A practical comparison of the EU Digital Services Act (DSA, Regulation (EU) 2022/2065) and the EU Digital Markets Act (DMA).
DSA vs UK Online Safety Act | EU vs UK Online Safety Compliance
A practical comparison of the EU Digital Services Act (DSA, Regulation (EU) 2022/2065) and the UK Online Safety Act: scope (EU recipients vs UK users).
EU DSA Checklist | Digital Services Act Compliance Checklist (Audit-Ready)
An audit-ready EU Digital Services Act (DSA) compliance checklist for Regulation (EU) 2022/2065: scope memo, terms transparency.
EU DSA Compliance Guide | Digital Services Act Implementation Playbook
A practical EU Digital Services Act (DSA) compliance guide for Regulation (EU) 2022/2065: scope memo and tiering.
EU DSA Deadlines & Compliance Calendar | Key Dates, Cadence and Milestones
A DSA compliance calendar for Regulation (EU) 2022/2065: entry into force, general applicability, Digital Services Coordinator designation, and Articles 15 and 24.
EU DSA FAQ | Digital Services Act Questions & Answers (Practical)
Practical answers to the most searched EU Digital Services Act (DSA) questions: who is in scope, what "hosting" and "online platform" mean.
EU DSA Service Types & Scope | Hosting vs Platform vs Marketplace
How to classify your service under the EU Digital Services Act (DSA, Regulation (EU) 2022/2065): intermediary service types (mere conduit, caching, hosting).
VLOP/VLOSE Systemic Risk Assessment (DSA) | Articles 34-36 + Mitigation
A deep guide to DSA systemic risk management for VLOPs/VLOSEs: how to run the Article 34 systemic risk assessment (risk categories, frequency).