VLOP/VLOSE Guide (EU)

EU Digital Services Act (DSA) Systemic Risk & Mitigation

How to run Article 34 systemic risk assessments and build Article 35 mitigation measures that stand up to audits.

Designed for VLOP/VLOSE readiness, with a repeatable annual calendar and evidence pack.

Author
Sorena AI
Published
Feb 21, 2026
Updated
Feb 21, 2026
Overview

For VLOPs and VLOSEs, DSA compliance becomes a risk management lifecycle: systemic risk assessment (Article 34) -> mitigation measures (Article 35) -> independent audit (Article 37) -> publication/transmission (Article 42). This page is an implementation guide for the risk and mitigation phases, with concrete measures and evidence expectations.

Section 1

Who this applies to: VLOPs/VLOSEs and designation timing

The systemic-risk obligations apply to online platforms and online search engines designated as very large under Article 33.

If you are approaching the AMAR threshold, build the capability early - you'll need risk and audit outputs within months of designation.

  • Threshold: AMAR in the Union >= 45 million (Article 33(1)) plus Commission designation decision.
  • Obligations apply from four months after notification of the designation decision (Article 33(6)).
  • AMAR publication cadence: publish AMAR at least every six months (Article 24(2)).
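As a worked example of the timing rules above, the sketch below computes the date from which obligations apply given a hypothetical designation-notification date. `add_months` is a helper defined here for illustration, not part of any compliance library, and the dates are invented.

```python
from datetime import date
import calendar

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    total = d.month - 1 + months
    year, month = d.year + total // 12, total % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

# Hypothetical date the Commission notifies the designation decision.
notification = date(2026, 2, 21)

# Article 33(6): obligations apply four months after notification.
obligations_apply = add_months(notification, 4)

# Article 34(1): first risk assessment is due by that date of application.
first_assessment_due = obligations_apply

print(obligations_apply)  # 2026-06-21
```

Month arithmetic clamps (e.g. Jan 31 + 1 month lands on Feb 28/29), which matters if notification arrives at month end.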
Section 2

Article 34 - What you must assess (systemic risk categories)

Article 34 requires VLOPs/VLOSEs to identify, analyse, and assess systemic risks stemming from design/functioning/use of the service and related systems (including algorithmic systems).

The risk assessment is service-specific and proportionate, considering severity and probability.

  • Illegal content dissemination risk (Article 34(1)(a)).
  • Fundamental rights risks (Article 34(1)(b)), including privacy, data protection, freedom of expression, non-discrimination, and child rights.
  • Risks to civic discourse/electoral processes and public security (Article 34(1)(c)).
  • Risks related to gender-based violence, public health, minors, and physical/mental well-being (Article 34(1)(d)).
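The "severity and probability" language above can be operationalised as a scored risk register keyed by the Article 34(1) categories. A minimal sketch, assuming 1-5 scales and field names that are our own conventions, not DSA terms:

```python
RISK_CATEGORIES = {
    "34(1)(a)": "illegal content dissemination",
    "34(1)(b)": "fundamental rights",
    "34(1)(c)": "civic discourse, elections, public security",
    "34(1)(d)": "GBV, public health, minors, well-being",
}

def risk_score(severity: int, probability: int) -> int:
    """Combine 1-5 severity and 1-5 probability into a 1-25 score."""
    assert 1 <= severity <= 5 and 1 <= probability <= 5
    return severity * probability

# Hypothetical entries; in practice each carries evidence and an owner.
register = [
    {"category": "34(1)(a)", "severity": 4, "probability": 3},
    {"category": "34(1)(d)", "severity": 5, "probability": 2},
]
for entry in register:
    entry["score"] = risk_score(entry["severity"], entry["probability"])

print(max(register, key=lambda e: e["score"])["category"])  # 34(1)(a)
```

Scoring makes proportionality defensible: mitigation effort can be traced back to the highest-scoring categories.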
Recommended next step

Turn this guide into an operational assessment

Assessment Autopilot can turn this guidance into a repeatable review workflow inside Sorena, so teams working on DSA compliance keep owners, evidence, and next steps aligned without copying this guide into separate documents.

Section 3

Article 34 - When and how often to assess (and what triggers an out-of-cycle update)

Risk assessments must be completed by the date the obligations begin to apply to the designated service (four months after notification of designation) and at least annually thereafter.

They must also be performed prior to deploying functionalities likely to have a critical impact on identified risks.

  • Minimum frequency: annual.
  • Out-of-cycle triggers: major changes to ranking/recommenders, ad targeting, moderation systems, identity/account systems, or other features likely to critically impact risks.
  • Regional/language factors: incorporate regional or linguistic aspects, including Member State-specific issues (Article 34(2) guidance).
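The out-of-cycle triggers above can be wired into a change-management gate. A minimal sketch, assuming hypothetical change-record fields (`system`, `risk_impact`) and trigger categories mirroring the list above:

```python
# Systems whose changes most plausibly have a critical impact on
# identified systemic risks (mirrors the trigger list above).
CRITICAL_SYSTEMS = {
    "ranking", "recommender", "ad_targeting",
    "moderation", "identity", "account",
}

def needs_out_of_cycle_assessment(change: dict) -> bool:
    """Return True if a proposed change should trigger an Article 34
    reassessment before deployment (Article 34(1), last subparagraph)."""
    touches_critical = change.get("system") in CRITICAL_SYSTEMS
    critical_impact = change.get("risk_impact") == "critical"
    return touches_critical and critical_impact

print(needs_out_of_cycle_assessment(
    {"system": "recommender", "risk_impact": "critical"}))  # True
print(needs_out_of_cycle_assessment(
    {"system": "ui_theme", "risk_impact": "minor"}))        # False
```

Whether an impact is "critical" remains a judgment call; the gate's value is forcing that call to be recorded before launch.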
Section 4

Article 34 - What factors to consider (the "systems" lens)

Article 34(2) pushes a systems perspective: how platform systems influence risks.

This prevents "paper-only" risk assessments that ignore product design and operational reality.

  • Recommender and algorithmic systems effects (Article 34(2)(a)).
  • Content moderation systems and enforcement effects (Article 34(2)(b)).
  • Terms and conditions and their enforcement (Article 34(2)(c)).
  • Advertising systems effects (Article 34(2)(d)).
  • Data-related practices effects (Article 34(2)(e)).
  • Manipulation/inauthentic use and rapid amplification dynamics (Article 34(2) additional analysis).
Section 5

Documentation and retention (prove the assessment was real)

The DSA explicitly requires preserving supporting documents for risk assessments for a minimum period and providing them on request.

This means you need an evidence model: inputs, analysis, decisions, and approvals.

  • Preserve supporting documents for at least three years after performing the assessments (Article 34(3)).
  • Build an evidence pack: datasets, analysis notebooks, meeting minutes, decision logs, and sign-offs.
  • Ensure confidentiality and access controls for sensitive inputs while enabling auditability.
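The retention rule above is easy to encode in the evidence model. A minimal sketch, where the record fields and three-year computation are our assumptions about how to track the Article 34(3) window, not regulatory terms:

```python
from dataclasses import dataclass
from datetime import date

RETENTION_YEARS = 3  # Article 34(3): keep supporting documents >= 3 years

@dataclass
class EvidenceItem:
    name: str                # e.g. "analysis notebook", "decision log"
    assessment_date: date    # date the risk assessment was performed
    confidential: bool = False  # drives access controls, not retention

    @property
    def retain_until(self) -> date:
        # Simple year arithmetic; a Feb 29 assessment date would need
        # clamping to Feb 28 in a production implementation.
        return self.assessment_date.replace(
            year=self.assessment_date.year + RETENTION_YEARS)

item = EvidenceItem("decision log", date(2026, 2, 21))
print(item.retain_until)  # 2029-02-21
```

Three years is a floor: audit and enforcement timelines may justify keeping the pack longer.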
Section 6

Article 35 - Mitigation measures (what "reasonable, proportionate, effective" means)

Article 35 requires reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified, with particular consideration for impacts on fundamental rights.

This is where you translate the risk assessment into product, policy and operational changes.

  • Product changes: adapt design/features/online interface (Article 35(1)(a)).
  • Policy changes: adapt terms and enforcement (Article 35(1)(b)).
  • Moderation operations: adapt speed/quality of notice processing, decision processes, resources; expeditious removal where appropriate for certain content types (Article 35(1)(c)).
  • Algorithmic testing: test/adapt algorithmic systems and recommender systems (Article 35(1)(d)).
  • Ads system changes: adapt advertising systems and targeted measures to limit/adjust ad presentation (Article 35(1)(e)).
  • Governance: reinforce internal processes/resources/testing/documentation/supervision (Article 35(1)(f)).
  • Cooperation: adjust cooperation with trusted flaggers and dispute settlement bodies (Article 35(1)(g)).
  • Codes/crisis protocols: participate in codes of conduct and crisis protocols where applicable (Article 35(1)(h)).
  • Child protection measures: targeted measures including age verification/parental controls and reporting/support tooling for minors (Article 35(1)(j)).
  • Synthetic media marking: prominent markings + easy-to-use functionality to indicate manipulated/generated media (Article 35(1)(k)).
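Evidencing "reasonable, proportionate, and effective" means linking each identified risk to a measure, an owner, and an effectiveness KPI. A minimal sketch of such a mitigation plan, with hypothetical field values:

```python
# Each row ties an Article 34 risk to an Article 35 measure with an
# accountable owner and a measurable effectiveness indicator.
mitigations = [
    {"risk": "34(1)(d) minors",
     "measure": "35(1)(j) parental controls",
     "owner": "Trust & Safety",
     "kpi": "share of minor accounts with controls enabled"},
    {"risk": "34(1)(a) illegal content",
     "measure": "35(1)(c) faster notice handling",
     "owner": "Moderation Ops",
     "kpi": "median notice-to-decision time"},
]

def unowned(plan: list) -> list:
    """Measures with no accountable owner are audit findings in waiting."""
    return [m["measure"] for m in plan if not m.get("owner")]

print(unowned(mitigations))  # []
```

An auditor will walk this mapping in both directions: every identified risk needs a measure, and every measure needs evidence that it works.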
Section 8

Evidence pack checklist (what to keep ready for Article 42 publication and audits)

Under Article 42, VLOPs/VLOSEs must publish/transmit risk and audit materials. Make those outputs "exportable" by design.

This checklist helps you build an evidence model that supports both audits and publication obligations.

  • Risk assessment report + supporting datasets and analysis artifacts.
  • Mitigation plan with owners, KPIs, rollout tracking, and effectiveness evaluation results.
  • Change logs for major algorithmic systems, recommender surfaces, ad systems, and moderation policy updates.
  • Audit-ready policies and procedures (moderation workflows, appeals, data access governance).
  • Redaction plan: how confidential/sensitive information is handled for public versions while full versions are transmitted to authorities.
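A completeness check over the checklist above keeps the pack exportable by design. A minimal sketch; the category keys are this guide's checklist items (our naming), not regulatory terms:

```python
# Checklist categories from this guide; each must have artifacts attached
# before an Article 42 publication or Article 37 audit.
REQUIRED_CATEGORIES = {
    "risk_assessment_report", "mitigation_plan", "change_logs",
    "policies_and_procedures", "redaction_plan",
}

def missing_categories(pack: dict) -> set:
    """Return checklist categories with no artifacts attached."""
    return {c for c in REQUIRED_CATEGORIES if not pack.get(c)}

# Hypothetical partial pack.
pack = {"risk_assessment_report": ["report_2026.pdf"],
        "mitigation_plan": ["plan.xlsx"]}

print(sorted(missing_categories(pack)))
```

Running this before every publication cycle turns the checklist from prose into a gate.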
