
EU DSA Article 16 compliance for SMB platforms in 2026

What Article 16 of the Digital Services Act actually requires, what the €120M X fine taught us, and how to ship a compliant notice-and-action workflow without a six-figure consulting bill.


Adrià Pérez · 13 min read

The Digital Services Act has been in force for all platforms since February 17, 2024. The first big fine landed December 5, 2025: €120 million against X for transparency violations. Investigations are open against TikTok, Temu, AliExpress, Shein, and Pornhub. The European Commission has clearly signaled that DSA enforcement is not theoretical.

If you operate a hosting service, an online platform, a marketplace, or any SaaS product that lets users post content visible to other EU users, the DSA applies to you. Article 16 — the notice-and-action mechanism — is the single most concrete obligation, and it is the one most SMB platforms get wrong.

This post walks through what Article 16 actually requires, what the recent enforcement signals tell us, what a compliant minimum-viable workflow looks like, and where the genuinely tricky edge cases are.

What Article 16 says

Article 16 of Regulation (EU) 2022/2065 obligates "providers of hosting services" to:

  1. Establish a notice-and-action mechanism that allows any individual or entity to notify the provider of allegedly illegal content.
  2. Make the mechanism easy to access and user-friendly, with submission exclusively by electronic means.
  3. Facilitate the submission of sufficiently precise and adequately substantiated notices, including: an explanation of why the content is illegal, the URL or location, the notifier's name and email (with a self-declaration of good faith), and the nature of the alleged infringement.
  4. Process notices in a timely, diligent, non-arbitrary, and objective manner.
  5. Send a confirmation of receipt to the notifier without undue delay.
  6. Notify the notifier of the decision taken in respect of the notice, with reasons and information about redress (counter-notice, appeals, out-of-court dispute settlement).
  7. Generate a statement of reasons (Article 17) for any restriction (content removal, account suspension, demonetization, search-visibility reduction) and, for online platforms above the micro/small-enterprise threshold, submit it to the EU DSA Transparency Database.

The thresholds and obligations escalate from there:

  • All hosting providers (Articles 16–17): notice-and-action mechanism + statement of reasons.
  • Online platforms above the SMB threshold (Articles 20–24): internal complaint-handling system, out-of-court dispute settlement access, trusted-flagger priority, statement of reasons in the public DSA Database.
  • VLOPs (Very Large Online Platforms, 45M+ EU users) (Articles 33–43): risk assessments, audits, crisis protocols, researcher access, ad repository, recommender-system transparency, plus the strictest enforcement.

For most SMB platforms, the operative requirements are Article 16 plus the SMB-platform additions: complaint handling, out-of-court access, statement-of-reasons publication.

What the X fine actually penalized

The €120M fine against X (December 2025, IAPP / MediaLaws / Goodwin reporting) cited three specific failures:

  1. Deceptive blue-check verification. X's redesigned blue-check did not clearly indicate verification status, misleading users in a way the Commission deemed contrary to consumer protection norms within DSA scope.
  2. Inaccessible ad repository. Article 39 requires VLOPs to maintain a queryable, searchable repository of all advertisements served. X's repository was incomplete and difficult to query.
  3. Researcher-access obstruction. Article 40 grants vetted researchers access to publicly accessible data. X's process was deemed obstructive.

Notably, the X fine was about VLOP-level obligations, not Article 16 specifically. But the message to SMB platforms was clear: the Commission is willing to fine, the precedent is set, and Article 16 is the foundation for everything else. If a platform's notice-and-action mechanism is broken, every downstream obligation cascades.

The €120M figure is also significant because it is approximately 1% of X's relevant turnover — well below the 6% maximum. The Commission left itself room to escalate. Subsequent investigations against TikTok (preliminary breach finding May 15, 2025), Temu (July 28, 2025), AliExpress, Shein, and Pornhub will likely produce larger fines if those platforms don't remediate.

What a compliant notice-and-action workflow looks like

A minimum-viable, audit-ready Article 16 mechanism has the following components:

1. The notice form

Public-facing, accessible without login, exclusively electronic submission. Required fields:

  • Notifier's name (mandatory)
  • Notifier's email (mandatory)
  • Whether the notifier is the rights-holder, an authorized agent, a third party, or a public authority (mandatory)
  • The exact URL(s) of the alleged illegal content (mandatory)
  • A clear and unambiguous explanation of why the content is illegal under EU or member-state law (mandatory)
  • Self-declaration of good-faith belief and accuracy of the notice (mandatory)
  • Optional: supporting documents

The form must be in at least one EU language; many member states (Germany, France, Italy, Spain, Poland, Netherlands) effectively require the relevant local language for notices submitted by their residents.
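
As a concrete starting point, here is a minimal server-side validation sketch in TypeScript using zod. The field names and the 50-character minimum on the explanation are our own assumptions; Article 16(2) mandates the substance of these fields, not any particular schema.

```ts
import { z } from "zod";

// Illustrative schema: Article 16(2) requires the substance (identity,
// location, explanation, good-faith declaration), not these exact names.
const NoticeSchema = z.object({
  notifierName: z.string().min(1),
  notifierEmail: z.string().email(),
  notifierRole: z.enum([
    "rights_holder",
    "authorized_agent",
    "third_party",
    "public_authority",
  ]),
  contentUrls: z.array(z.string().url()).nonempty(),
  // Why the content is allegedly illegal under EU or member-state law.
  // The 50-character floor is a policy choice, not a legal requirement.
  explanation: z.string().min(50),
  // Article 16(2)(d): the good-faith, accuracy self-declaration must be explicit.
  goodFaithDeclaration: z.literal(true),
  attachmentIds: z.array(z.string()).optional(), // optional supporting documents
});

export type Notice = z.infer<typeof NoticeSchema>;
```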

2. Confirmation of receipt

Sent to the notifier's email "without undue delay." In practice, an automated email within minutes is the safe interpretation. Must include a unique reference number for the notice.
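
A sketch of that acknowledgement step, assuming Node; `sendEmail` is a hypothetical helper standing in for whatever mail transport you actually use:

```ts
import { randomUUID } from "node:crypto";
import { sendEmail } from "./mail"; // hypothetical helper, not a real library

// Fire the receipt as soon as the notice row is persisted, and return the
// unique reference number that the confirmation must carry.
export async function acknowledgeNotice(notice: { notifierEmail: string }): Promise<string> {
  const reference = randomUUID();
  await sendEmail({
    to: notice.notifierEmail,
    subject: `Notice received: reference ${reference}`,
    body:
      `We have received your notice (reference ${reference}). ` +
      `You will be informed of our decision, the reasons for it, and your redress options.`,
  });
  return reference;
}
```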

3. The decision and statement of reasons

For each notice, the platform must:
- Make a timely, diligent, non-arbitrary decision (remove, partially restrict, demonetize, reduce search visibility, or take no action).
- Notify the notifier of the decision with reasons and information about redress.
- Notify the affected user of the decision with reasons and information about redress.
- Generate a "statement of reasons" matching the schema defined in Commission Implementing Regulation (EU) 2025/910 (the prescriptive new format from July 1, 2025), and submit it to the public DSA Transparency Database via the official API.

The statement of reasons must include: the type of restriction, the legal grounds, the facts and circumstances, the automation involved (if any), the geographic scope, the duration, and the mechanism by which the affected user can challenge it.
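
In code, those fields map to a payload roughly shaped like the interface below. This is an illustrative sketch, not the official schema; validate against the schema published with the DSA Transparency Database API before submitting anything.

```ts
// Illustrative shape only. The authoritative schema is the one published with
// the DSA Transparency Database API; treat every name here as an assumption.
interface StatementOfReasons {
  noticeReference: string;
  restrictionType:
    | "removal"
    | "visibility_restriction"
    | "demonetization"
    | "account_suspension";
  legalGround: string;          // the EU or member-state provision relied on
  factsAndCircumstances: string;
  automatedDetection: boolean;  // was the content detected by automated means?
  automatedDecision: boolean;   // was the decision itself automated?
  territorialScope: string[];   // e.g. ["DE", "FR"]; the geographic scope
  duration?: { from: string; to?: string }; // ISO 8601; open-ended if `to` is absent
  redress: string[];            // internal complaint, out-of-court body, court
}
```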

4. The internal complaint-handling system (Articles 20–22)

For online platforms above the SMB threshold, the affected user must be able to:
- File an internal complaint within at least 6 months of the original decision (a date check is sketched after this list).
- Receive a reasoned response from a qualified human reviewer.
- Escalate to an out-of-court dispute settlement body certified by the Digital Services Coordinator of their member state.
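
The six-month window in the first point is easy to get wrong around month-end boundaries. A small defensive check, as referenced above:

```ts
// Article 20 requires accepting complaints for at least six months after the
// user was informed of the decision; when in doubt, err toward accepting.
export function complaintWindowOpen(decisionNotifiedAt: Date, now = new Date()): boolean {
  const deadline = new Date(decisionNotifiedAt);
  // setMonth rolls month-end dates forward, which only extends the window.
  deadline.setMonth(deadline.getMonth() + 6);
  return now <= deadline;
}
```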

Internal complaint-handling decisions reverse the original decision in approximately 30% of cases, per the European Commission's own DSA enforcement statistics from 2024–2025. Out-of-court settlements reverse roughly 52% of disputes that reach them.

5. The audit log

Every notice, every decision, every appeal, every reversal must be timestamped, immutable, and retrievable for at least 5 years (the DSA Database retention period since February 2025). Whether a decision was automated in milliseconds or took days of human review, the audit trail you produce on request must be equally complete.
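
The regulation does not prescribe a storage format. One pattern that makes "immutable" verifiable rather than aspirational is hash-chaining the entries; this is an implementation choice of ours, not a DSA requirement:

```ts
import { createHash } from "node:crypto";

interface AuditEntry {
  at: string;        // ISO 8601 timestamp
  kind: "notice" | "decision" | "appeal" | "reversal";
  payload: unknown;  // the full notice/decision body, as received
  prevHash: string;  // hash of the previous entry; "" for the first entry
  hash: string;
}

// Append-only: each entry commits to its predecessor, so deleting or editing
// any historic entry breaks the chain and is detectable on audit.
export function appendEntry(
  log: AuditEntry[],
  kind: AuditEntry["kind"],
  payload: unknown,
): AuditEntry {
  const prevHash = log.length ? log[log.length - 1].hash : "";
  const at = new Date().toISOString();
  const hash = createHash("sha256")
    .update(JSON.stringify({ at, kind, payload, prevHash }))
    .digest("hex");
  const entry: AuditEntry = { at, kind, payload, prevHash, hash };
  log.push(entry);
  return entry;
}
```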

What this costs

The European Commission's own DSA impact assessment, as cited by ACT|The App Association, estimates approximately €15,000 per year for a micro-enterprise to handle 50 takedown notices, plus a roughly €1,500 setup cost for the notice-and-action mechanism. Of those 50 notices, approximately 5% trigger counter-notice procedures.
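
Read that as unit economics: €15,000 across 50 notices is €300 of handling cost per notice, and the €1,500 setup amortizes to another €30 per notice in year one. At the 5% rate, expect 2 or 3 of those 50 notices to enter the counter-notice path.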

For platforms processing thousands of notices per year, the Commission's estimates scale roughly linearly. For platforms processing millions, the cost is dominated by human review and tooling, not infrastructure.

What "good enough" looks like for SMBs

There is a realistic minimum-viable Article 16 stack that an engineering team of one or two can ship in a couple of weeks. The pieces:

  1. A /dsa/notice form — public, accessible, multi-language. Validates the required fields server-side. Generates a UUID reference number.
  2. A backing data model with Notice, NoticeAction, Appeal, and Decision tables (sketched after this list). Append-only audit log. Soft deletes only.
  3. An admin queue with the relevant context: full notice body, the affected URL, the user, prior history, suggested action.
  4. A decision-recording flow — required reason codes, free-text justification, automated email to the notifier and affected user.
  5. An automated job that posts statements of reasons to the EU DSA Database in the official JSON schema.
  6. A /dsa/transparency page publishing aggregate statistics: notices received per category, decisions taken, reversal rate, mean response time.
  7. An internal complaint flow — same data model, with an appeal_to_id link.
  8. A standing relationship with at least one certified out-of-court dispute settlement body in your principal jurisdiction.
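
A trimmed sketch of that data model as TypeScript row types (NoticeAction omitted for brevity). Table and column names are our own choices; nothing here is mandated by the regulation:

```ts
interface NoticeRow {
  id: string;            // UUID; doubles as the notifier's reference number
  receivedAt: string;
  notifierEmail: string;
  contentUrl: string;
  explanation: string;
}

interface DecisionRow {
  id: string;
  noticeId: string;      // FK -> NoticeRow
  action: "no_action" | "removal" | "visibility_restriction" | "demonetization";
  reasonCode: string;    // required structured reason
  justification: string; // required free-text justification
  decidedAt: string;
  decidedBy: string;     // human reviewer ID, or "automated"
}

interface AppealRow {
  id: string;
  appealToId: string;    // FK -> DecisionRow: the decision being challenged
  filedAt: string;
  outcome?: "upheld" | "reversed";
}
```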

This is a few days of work for a team that has done it before, and a few weeks for a team that hasn't. We have built this stack as a drop-in widget — Counterspine's Compliance Widget add-on (€99/mo) — for platforms that would rather not build it themselves.

The TAKE IT DOWN Act overlay

If you accept user-uploaded images or video at all, you also need to comply with the U.S. TAKE IT DOWN Act, signed May 19, 2025. The compliance deadline is May 19, 2026.

Required: a notice-and-removal process for nonconsensual intimate imagery and AI deepfakes, with a 48-hour SLA for removal of validly-noticed material and reasonable efforts to remove identical copies (notice-and-staydown). Enforced by the FTC under Section 5.

The DSA Article 16 stack handles most of this. The deltas are:
- Tighter SLA: 48 hours rather than "without undue delay."
- Notice-and-staydown obligation: hash-matching against removed material to prevent re-upload (see the sketch after this list).
- Notice content requirements differ from both DMCA §512(c)(3) and DSA Article 16; allow a free-text "harm description" field.
- Subject scope is narrower (NCII + deepfakes) but applies to any U.S.-accessible UGC platform.
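
A sketch of the staydown check. A cryptographic hash only catches byte-identical copies; production NCII pipelines typically use a perceptual hash (PDQ, PhotoDNA) so that re-encoded or resized copies still match. Treat this as the flow, not the matcher:

```ts
import { createHash } from "node:crypto";

const removedHashes = new Set<string>(); // persist this in production

function sha256(bytes: Buffer): string {
  return createHash("sha256").update(bytes).digest("hex");
}

// Called when material is removed under a valid TAKE IT DOWN notice.
export function recordRemoval(fileBytes: Buffer): void {
  removedHashes.add(sha256(fileBytes));
}

// Called on every new upload: block byte-identical copies of removed material.
export function isKnownRemoved(fileBytes: Buffer): boolean {
  return removedHashes.has(sha256(fileBytes));
}
```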

Edge cases worth thinking about

  • Manifestly unfounded notices. Article 23(2) allows you to suspend, after prior warning, the processing of notices from a notifier who frequently submits manifestly unfounded notices. Document your suspension policy (one mechanical version is sketched after this list).
  • Trusted flaggers (Article 22). You must give priority to notices from designated trusted flaggers. Most member states have not yet finalized their trusted-flagger rosters; track the Digital Services Coordinator in your principal jurisdiction.
  • Repeat-infringer policy (DMCA §512(i) parallel). The DSA does not explicitly mandate one, but the EU Copyright Directive Article 17 effectively does for online content-sharing service providers. Document and enforce.
  • Country-of-establishment vs. country-of-targeting. If you are established in Ireland but your content targets users in Germany, German authorities can still enforce German criminal law against locally illegal content. Talk to a lawyer.
  • The micro/small-enterprise exemption. Article 16 applies to every hosting provider; the Article 19 exemptions (no internal complaint system, no transparency reports) require demonstrating micro or small enterprise status under EU Recommendation 2003/361/EC. Document your headcount, revenue, and balance-sheet status annually.
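
For the first bullet, documenting a suspension policy is easiest when the policy is mechanical. Both thresholds below are arbitrary placeholders for illustration; pick and publish your own:

```ts
// Illustrative Article 23(2) policy: suspend (after a prior warning) notifiers
// whose notices are overwhelmingly manifestly unfounded. The thresholds are
// policy choices you must document, not numbers from the regulation.
const MIN_NOTICES_BEFORE_SUSPENSION = 10;
const UNFOUNDED_RATIO_THRESHOLD = 0.8;

export function shouldSuspendNotifier(totalNotices: number, manifestlyUnfounded: number): boolean {
  return (
    totalNotices >= MIN_NOTICES_BEFORE_SUSPENSION &&
    manifestlyUnfounded / totalNotices >= UNFOUNDED_RATIO_THRESHOLD
  );
}
```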

TL;DR

Article 16 is mandatory, audited, and increasingly enforced. The €120M X fine is a warning shot, not the ceiling. A minimum-viable compliant workflow is achievable for SMB platforms with a small engineering team and roughly €15,000/year in handling costs. The TAKE IT DOWN Act overlays a tighter U.S.-side SLA on the same machinery.

If you'd rather buy than build, our Compliance Widget add-on ships the full notice-and-action + statement-of-reasons + audit-log stack as a drop-in.

Take back control of your takedown surface

Set up your first watched domain in 60 seconds. See every notice ever filed against it. Catch the next one in under 5 minutes.