EU Digital Services Act (DSA) Systemic Risk & Mitigation
How to run Article 34 systemic risk assessments and build Article 35 mitigation measures that stand up to audits.
Designed for VLOP/VLOSE readiness, with a repeatable annual calendar and evidence pack.
Scope: the primary DSA systemic-risk obligations (Articles 33-36), the Article 35 mitigation-measures list, and documentation retention under Article 34(3).
For very large online platforms (VLOPs) and very large online search engines (VLOSEs), DSA compliance becomes a risk-management lifecycle: systemic risk assessment (Article 34) -> mitigation measures (Article 35) -> independent audit (Article 37) -> publication/transmission (Article 42). This page is an implementation guide for the risk and mitigation phases, with concrete measures and evidence expectations.
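A minimal sketch of that lifecycle as data (in Python; the enum and function names are our own, not an official schema), useful when program tooling needs to enforce the sequencing:

```python
from enum import Enum

class DsaPhase(Enum):
    """The four phases of the VLOP/VLOSE compliance lifecycle, in DSA order."""
    RISK_ASSESSMENT = "Article 34 systemic risk assessment"
    MITIGATION = "Article 35 mitigation measures"
    INDEPENDENT_AUDIT = "Article 37 independent audit"
    PUBLICATION = "Article 42 publication and transmission"

# Each phase should only start once the previous one has produced its outputs.
LIFECYCLE = list(DsaPhase)

def next_phase(current: DsaPhase) -> DsaPhase | None:
    """Return the phase that follows `current`, or None after publication."""
    i = LIFECYCLE.index(current)
    return LIFECYCLE[i + 1] if i + 1 < len(LIFECYCLE) else None
```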
Who this applies to: VLOPs/VLOSEs and designation timing
The systemic-risk obligations apply to online platforms and online search engines designated as very large under Article 33.
If you are approaching the average monthly active recipients (AMAR) threshold, build the capability early - you will need risk-assessment and audit outputs within months of designation.
- Threshold: AMAR in the Union >= 45 million (Article 33(1)), plus a Commission designation decision (Article 33(4)).
- Obligations apply from four months after notification of the designation decision (Article 33(6)); see the timing sketch after this list.
- AMAR publication cadence: publish AMAR figures at least once every six months (Article 24(2)).
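A sketch of the timing mechanics, assuming Python with the third-party python-dateutil package; the notification date is hypothetical, and real deadlines should be confirmed against the designation decision itself:

```python
from datetime import date
from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

AMAR_THRESHOLD = 45_000_000  # Article 33(1): average monthly active recipients in the Union

def crosses_threshold(amar_eu: int) -> bool:
    """Crossing the threshold does not itself designate you; the Commission
    must still adopt a designation decision (Article 33(4))."""
    return amar_eu >= AMAR_THRESHOLD

def obligations_start(notification_date: date) -> date:
    """Article 33(6): the additional obligations apply four months after the
    provider is notified of the designation decision."""
    return notification_date + relativedelta(months=4)

# Hypothetical notification date, purely for illustration.
print(obligations_start(date(2025, 3, 1)))  # -> 2025-07-01
```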
Article 34 - What you must assess (systemic risk categories)
Article 34 requires VLOPs/VLOSEs to identify, analyse, and assess systemic risks stemming from design/functioning/use of the service and related systems (including algorithmic systems).
The assessment must be specific to the service and proportionate to the systemic risks, weighing their severity and probability; one way to encode the four categories for a risk register is sketched after the list below.
- Illegal content dissemination risk (Article 34(1)(a)).
- Fundamental rights risks (Article 34(1)(b)), including private life, data protection, freedom of expression and information, non-discrimination, and the rights of the child.
- Risks to civic discourse/electoral processes and public security (Article 34(1)(c)).
- Risks related to gender-based violence, public health, minors, and physical/mental well-being (Article 34(1)(d)).
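As referenced above, a minimal sketch (illustrative Python, not an official taxonomy) that pins each risk-register finding to exactly one Article 34(1) category:

```python
from enum import Enum

class SystemicRisk(Enum):
    """Article 34(1) systemic risk categories as stable identifiers."""
    ILLEGAL_CONTENT = "34(1)(a): dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "34(1)(b): negative effects on fundamental rights"
    CIVIC_AND_SECURITY = "34(1)(c): civic discourse, elections, public security"
    WELLBEING_AND_MINORS = "34(1)(d): gender-based violence, public health, minors, well-being"

# Every finding should carry one category so mitigation and audit evidence
# can be rolled up per Article 34(1) letter.
print(SystemicRisk.FUNDAMENTAL_RIGHTS.value)
```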
Article 34 - When and how often to assess (and what triggers an out-of-cycle update)
Risk assessments must be completed by the date the obligations begin to apply to the designated service (Article 34(1)) and at least once a year thereafter.
They must also be performed before deploying functionalities that are likely to have a critical impact on the identified risks.
- Minimum frequency: annual.
- Out-of-cycle triggers: major changes to ranking/recommenders, ad targeting, moderation systems, identity/account systems, or other functionalities likely to have a critical impact on identified risks (a gating sketch follows this list).
- Regional/language factors: incorporate specific regional or linguistic aspects, including those specific to a Member State (Article 34(2)).
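One way a change-management gate could flag launches needing a pre-deployment assessment - a sketch only; the system labels and materiality flag are assumptions to adapt to your own launch-review process:

```python
from dataclasses import dataclass

# Systems whose major changes are likely to have a critical impact on
# identified risks and so trigger an out-of-cycle Article 34 update.
CRITICAL_SYSTEMS = {
    "ranking", "recommender", "ad_targeting",
    "content_moderation", "identity_accounts",
}

@dataclass
class ProductChange:
    name: str
    touched_systems: set[str]  # which platform systems the launch modifies
    major: bool                # per your own materiality criteria

def needs_out_of_cycle_assessment(change: ProductChange) -> bool:
    """Flag major changes to risk-critical systems for assessment before launch."""
    return change.major and bool(change.touched_systems & CRITICAL_SYSTEMS)

# Example: a major recommender overhaul is flagged; a minor UI tweak is not.
print(needs_out_of_cycle_assessment(
    ProductChange("feed_rerank_v3", {"recommender"}, major=True)))  # True
```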
Article 34 - What factors to consider (the "systems" lens)
Article 34(2) pushes a systems perspective: how platform systems influence risks.
This prevents "paper-only" risk assessments that ignore product design and operational reality; the grid sketch after the list makes the cross-check explicit.
- Recommender and algorithmic systems effects (Article 34(2)(a)).
- Content moderation systems and enforcement effects (Article 34(2)(b)).
- Terms and conditions and their enforcement (Article 34(2)(c)).
- Advertising systems effects (Article 34(2)(d)).
- Data-related practices effects (Article 34(2)(e)).
- Manipulation, inauthentic or automated use, and rapid, wide amplification dynamics (further analysis required by Article 34(2)).
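A sketch of the systems lens as an assessment grid (illustrative Python; the factor labels paraphrase Article 34(2) and the grid structure is our own device): every (system, risk category) pair gets an explicit answer, which is what keeps the assessment anchored to how the product actually works.

```python
from itertools import product

# Article 34(2) system factors (paraphrased).
ARTICLE_34_2_FACTORS = {
    "34(2)(a)": "recommender and other algorithmic systems",
    "34(2)(b)": "content moderation systems",
    "34(2)(c)": "terms and conditions and their enforcement",
    "34(2)(d)": "advertising systems",
    "34(2)(e)": "data-related practices",
}

RISK_CATEGORIES = ["34(1)(a)", "34(1)(b)", "34(1)(c)", "34(1)(d)"]

def empty_assessment_grid() -> dict[tuple[str, str], str | None]:
    """One cell per (system factor, risk category) pair; None until analysed."""
    return {pair: None for pair in product(ARTICLE_34_2_FACTORS, RISK_CATEGORIES)}

grid = empty_assessment_grid()
print(len(grid))  # 20 cells: each system factor analysed against each risk category
```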
Documentation and retention (prove the assessment was real)
The DSA explicitly requires preserving supporting documents for risk assessments for a minimum period and providing them on request.
This means you need an evidence model (inputs, analysis, decisions, and approvals) with retention built in; a minimal record sketch follows the list below.
- Preserve supporting documents for at least three years after performing the assessments (Article 34(3)).
- Build an evidence pack: datasets, analysis notebooks, meeting minutes, decision logs, and sign-offs.
- Ensure confidentiality and access controls for sensitive inputs while enabling auditability.
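A minimal record sketch, assuming Python and python-dateutil; the fields and roles are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date
from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

@dataclass
class EvidenceRecord:
    """One supporting document behind an Article 34 risk assessment."""
    path: str              # dataset, notebook, minutes, decision log, sign-off
    assessment_date: date  # when the assessment it supports was performed
    access_roles: set[str] = field(default_factory=lambda: {"risk-team"})

    @property
    def retain_until(self) -> date:
        # Article 34(3): preserve supporting documents for at least three
        # years after the performance of the risk assessment.
        return self.assessment_date + relativedelta(years=3)

record = EvidenceRecord("s3://evidence/2025/recsys_analysis.ipynb", date(2025, 6, 30))
print(record.retain_until)  # -> 2028-06-30
```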
Article 35 - Mitigation measures (what "reasonable, proportionate, effective" means)
Article 35 requires reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified, with particular consideration for impacts on fundamental rights.
This is where you translate the risk assessment into product, policy, and operational changes; a traceability sketch follows the measures list below.
- Product changes: adapt design/features/online interface (Article 35(1)(a)).
- Policy changes: adapt terms and enforcement (Article 35(1)(b)).
- Moderation operations: adapt content moderation processes, including the speed and quality of notice processing, decision-making, and resourcing, with expeditious removal or disabling of access where appropriate, notably for illegal hate speech and cyber violence (Article 35(1)(c)).
- Algorithmic testing: test/adapt algorithmic systems and recommender systems (Article 35(1)(d)).
- Ads system changes: adapt advertising systems and targeted measures to limit/adjust ad presentation (Article 35(1)(e)).
- Governance: reinforce internal processes/resources/testing/documentation/supervision (Article 35(1)(f)).
- Cooperation: adjust cooperation with trusted flaggers and dispute settlement bodies (Article 35(1)(g)).
- Codes/crisis protocols: participate in codes of conduct and crisis protocols where applicable (Article 35(1)(h)).
- Child protection measures: targeted measures including age verification/parental controls and reporting/support tooling for minors (Article 35(1)(j)).
- Synthetic media marking: prominent markings + easy-to-use functionality to indicate manipulated/generated media (Article 35(1)(k)).
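As flagged above, a traceability sketch (illustrative Python; the field names and example values are our own) that ties each measure back to a risk finding and forward to an effectiveness KPI:

```python
from dataclasses import dataclass

@dataclass
class MitigationMeasure:
    """One Article 35 measure, traceable from risk finding to effectiveness evidence."""
    risk_id: str           # the Article 34 finding this measure addresses
    article_35_basis: str  # e.g. "35(1)(d)" for algorithmic testing
    description: str
    owner: str
    kpi: str               # how effectiveness will be measured
    target_date: str       # ISO date for rollout completion

measure = MitigationMeasure(
    risk_id="R-2025-014",
    article_35_basis="35(1)(d)",
    description="Test recommender changes against minors' well-being metrics before rollout",
    owner="recsys-safety-team",
    kpi="share of minors' sessions exposed to borderline content",
    target_date="2025-09-30",
)
print(measure.article_35_basis, "->", measure.kpi)
```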
Annual risk calendar (recommended operating cadence)
A defensible VLOP/VLOSE program runs on a calendar with clear sequencing: assess -> decide -> implement -> measure -> audit -> publish.
Build this calendar so you don't compress remediation into the audit window; a data sketch of the cadence follows the list.
- Q1: risk assessment planning, data collection, and risk workshops; update threat models and definitions.
- Q2: complete Article 34 risk assessment; approve mitigation plan and KPIs; begin mitigation rollouts.
- Q3: run measurement and effectiveness reviews; prepare audit evidence; address gaps before auditors arrive.
- Q4: independent audit execution and remediation; prepare Article 42 publication/transmission pack.
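The same cadence as data (illustrative Python; quarter boundaries and milestone wording mirror the list above and should be shifted to your own fiscal and audit dates), so tooling can generate owners' reminders:

```python
# Annual cadence as data; milestones mirror the quarterly list above.
RISK_CALENDAR = {
    "Q1": ["assessment planning", "data collection", "risk workshops",
           "update threat models and definitions"],
    "Q2": ["complete Article 34 assessment", "approve mitigation plan and KPIs",
           "begin mitigation rollouts"],
    "Q3": ["measurement and effectiveness reviews", "prepare audit evidence",
           "close gaps before auditors arrive"],
    "Q4": ["Article 37 audit execution and remediation",
           "prepare Article 42 publication/transmission pack"],
}

for quarter, milestones in RISK_CALENDAR.items():
    print(f"{quarter}: " + "; ".join(milestones))
```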
Evidence pack checklist (what to keep ready for Article 42 publication and audits)
Under Article 42, VLOPs/VLOSEs must publish reports on their risk assessments, mitigation measures, and audit results, and transmit them to the Commission and the Digital Services Coordinator of establishment. Make those outputs "exportable" by design.
This checklist helps you build an evidence model that supports both audits and publication obligations; a completeness-check sketch follows it.
- Risk assessment report + supporting datasets and analysis artifacts.
- Mitigation plan with owners, KPIs, rollout tracking, and effectiveness evaluation results.
- Change logs for major algorithmic systems, recommender surfaces, ad systems, and moderation policy updates.
- Audit-ready policies and procedures (moderation workflows, appeals, data access governance).
- Redaction plan: how confidential/sensitive information is handled for public versions while full versions are transmitted to authorities.
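A completeness check over the checklist, as a sketch (Python; the folder names are our own convention, not a mandated structure):

```python
from pathlib import Path

# Artifact groups from the checklist above; folder names are our own convention.
REQUIRED_ARTIFACTS = [
    "risk_assessment_report",
    "mitigation_plan",
    "change_logs",
    "policies_and_procedures",
    "redaction_plan",
]

def missing_artifacts(evidence_root: Path) -> list[str]:
    """Return checklist items with no corresponding folder in the evidence pack."""
    return [name for name in REQUIRED_ARTIFACTS
            if not (evidence_root / name).exists()]

gaps = missing_artifacts(Path("evidence_pack/2025"))
if gaps:
    print("Evidence pack incomplete:", ", ".join(gaps))
else:
    print("Evidence pack complete")
```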