DSA Enforcement Landscape 2026
The European Commission intensified DSA enforcement through late 2025 and early 2026, with 17 platforms and 2 search engines designated as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). X received a €120 million fine for advertising and user account transparency violations, signaling regulators' enforcement resolve.
Active Investigations Target Major Platforms
The European Commission has opened formal investigations into Google, Meta, Apple, and TikTok covering advertising transparency, targeted advertising practices, content recommendation algorithms, App Store policies, and minor protection measures.
| Platform | Investigation Focus | Status |
|---|---|---|
| Google | Ad Transparency, Search Ranking, Recommender Systems | Formal Investigation |
| Meta | Targeted Advertising, Content Algorithms, Researcher Access | Formal Investigation |
| Apple | App Store Policies, Interoperability Compliance | Formal Investigation |
| TikTok | Minor Protection, Addiction Design, Recommendation Transparency | Formal Investigation |
Active DSA Investigations 2026
The DSA allows fines of up to 6% of global annual turnover for non-compliance, plus periodic penalty payments of up to 5% of average daily worldwide turnover for continued violations; the sketch below works through the arithmetic.
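To make those ceilings concrete, here is a minimal arithmetic sketch; the €10B turnover figure is a placeholder for illustration, not any company's actual financials.

```python
# Minimal sketch of the DSA penalty ceilings described above.
# The turnover figure is an illustrative placeholder, not real company data.

def max_fine(global_annual_turnover_eur: float) -> float:
    """One-off fine ceiling: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover_eur

def max_daily_periodic_penalty(global_annual_turnover_eur: float) -> float:
    """Periodic penalty ceiling: 5% of average daily worldwide turnover."""
    average_daily_turnover_eur = global_annual_turnover_eur / 365
    return 0.05 * average_daily_turnover_eur

turnover = 10_000_000_000  # hypothetical EUR 10B annual turnover
print(f"Maximum fine:          EUR {max_fine(turnover):,.0f}")                    # EUR 600,000,000
print(f"Maximum daily penalty: EUR {max_daily_periodic_penalty(turnover):,.0f}")  # ~EUR 1,369,863
```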
2026 Enforcement Priorities
The Commission has identified six priority areas for 2026 enforcement: systemic risk assessments, generative AI content integrity, minor protection measures, advertising transparency, recommender system parameters, and independent audit compliance.
X (Twitter) €120M Fine Analysis
The €120 million fine against X centered on advertising transparency violations and user account transparency failures. Specific violations included failure to maintain an advertising repository, provide a searchable ad database, and verify advertisers as required under the DSA; the sketch after the list below illustrates what a compliant repository entry might contain.
- Failure to maintain complete advertising repository
- No searchable ad database accessible to users
- Advertiser verification mechanisms inadequate
- User account transparency requirements not met
- Fine represents approximately 0.3% of estimated global turnover, well below the 6% statutory ceiling
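As a rough illustration of what a searchable ad repository entry might contain, here is a hypothetical record schema and a minimal search function; the field names are assumptions modeled on the DSA's ad transparency obligations, not X's actual data model.

```python
# Hypothetical schema for one entry in a searchable ad repository.
# Field names are illustrative assumptions, not a platform's actual data model.
from dataclasses import dataclass
from datetime import date

@dataclass
class AdRepositoryEntry:
    ad_id: str                  # stable identifier so entries remain searchable
    advertiser_name: str        # identity of the advertiser
    payer_name: str             # who paid for the ad, if different
    advertiser_verified: bool   # outcome of the advertiser verification check
    ad_content: str             # the ad creative, or a reference to it
    first_shown: date           # start of the display period
    last_shown: date            # end of the display period
    targeting_parameters: dict  # main parameters used to target recipients
    total_reach: int            # aggregate number of recipients reached

def search_repository(entries: list[AdRepositoryEntry], advertiser: str) -> list[AdRepositoryEntry]:
    """Minimal user-facing search: find all ads from a given advertiser."""
    return [e for e in entries if advertiser.lower() in e.advertiser_name.lower()]
```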
VLOP/VLOSE Oversight Structure
The European Commission directly supervises VLOPs and VLOSEs, while national Digital Services Coordinators (DSCs) oversee other intermediary services. Several member states face infringement proceedings for delays in designating and empowering their DSCs.
Generative AI Risk Assessments
Article 34 systemic risk assessments must now address generative AI content and information integrity. VLOPs must evaluate risks from synthetic content, deepfakes, and AI-generated misinformation within their risk management frameworks.
| Risk Category | 2026 Focus | Assessment Required |
|---|---|---|
| Illegal Content | AI-Generated Violations | Quarterly |
| Fundamental Rights | Synthetic Media Impact | Quarterly |
| Civic Discourse | Election Interference | Enhanced |
| Minor Protection | Age Verification Systems | Biannual |
| Gender-Based Violence | Deepfake Abuse | Quarterly |
DSA Article 34 Systemic Risk Categories
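To show how the cadences in the table above translate into a compliance calendar, the following sketch computes the next due date per risk category; treating "Enhanced" as a quarterly interval with expanded scope is an assumption made for illustration.

```python
# Sketch of a compliance calendar derived from the cadence table above.
# Mapping "Enhanced" to a quarterly interval is an assumption for illustration.
from datetime import date, timedelta

CADENCE_DAYS = {
    "Quarterly": 91,
    "Biannual": 182,
    "Enhanced": 91,  # assumed: quarterly frequency with expanded scope
}

RISK_CATEGORIES = {
    "Illegal Content": "Quarterly",
    "Fundamental Rights": "Quarterly",
    "Civic Discourse": "Enhanced",
    "Minor Protection": "Biannual",
    "Gender-Based Violence": "Quarterly",
}

def next_assessment_dates(last_completed: date) -> dict[str, date]:
    """Compute the next due date for each Article 34 risk category."""
    return {
        category: last_completed + timedelta(days=CADENCE_DAYS[cadence])
        for category, cadence in RISK_CATEGORIES.items()
    }

for category, due in next_assessment_dates(date(2026, 1, 1)).items():
    print(f"{category}: next assessment due {due.isoformat()}")
```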
Independent Audit Requirements
Article 37 requires annual compliance verification by qualified independent auditors. Audit reports must be submitted to the Commission and national DSCs, and platforms must publish public summaries.
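As a hedged illustration of that reporting flow, the sketch below splits a full audit report into its regulatory submission targets and a redacted public summary; the record structure and opinion labels are assumptions, not a prescribed format.

```python
# Hypothetical Article 37 reporting flow: full report to regulators,
# redacted summary published by the platform. Structure is illustrative only.
from dataclasses import dataclass

@dataclass
class AuditReport:
    platform: str
    year: int
    audit_opinion: str       # e.g. "positive", "positive with comments", "negative"
    findings: list[str]      # detailed findings, may contain confidential material
    confidential: list[str]  # items withheld from the public summary

def submission_targets() -> list[str]:
    # Per the text above, full reports go to the Commission and national DSCs.
    return ["European Commission", "national DSCs"]

def public_summary(report: AuditReport) -> dict:
    """Redact confidential findings before the platform publishes a summary."""
    visible = [f for f in report.findings if f not in report.confidential]
    return {
        "platform": report.platform,
        "year": report.year,
        "opinion": report.audit_opinion,
        "findings": visible,
    }
```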
Minor Protection Requirements
Article 28 mandates age verification and design choices protecting minors. TikTok faces specific investigation into addiction-related design features and content recommendation transparency affecting minor users.
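As a hypothetical illustration of the kind of design gate Article 28 contemplates, the following sketch applies stricter defaults to verified minors; the feature flags and the under-18 threshold are illustrative assumptions, not TikTok's actual implementation.

```python
# Hypothetical minor-protection gate; flags and thresholds are illustrative
# assumptions, not any platform's actual implementation.
from dataclasses import dataclass

@dataclass
class ExperienceSettings:
    personalized_recommendations: bool  # recommender personalization on/off
    autoplay_enabled: bool              # an addiction-related design feature
    targeted_advertising: bool          # DSA prohibits profiling-based ads to minors

def settings_for_user(verified_age: int) -> ExperienceSettings:
    """Apply stricter defaults when a verified user is a minor (< 18)."""
    if verified_age < 18:
        return ExperienceSettings(
            personalized_recommendations=False,
            autoplay_enabled=False,
            targeted_advertising=False,  # Art. 28(2): no ads based on profiling of minors
        )
    return ExperienceSettings(
        personalized_recommendations=True,
        autoplay_enabled=True,
        targeted_advertising=True,
    )
```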
Platforms that implement comprehensive compliance programs report smoother regulatory relationships. Annual independent audits, transparent reporting, and proactive risk assessments significantly reduce enforcement exposure.