In Brief
The Digital Services Act (DSA) has been fully operational since February 2024, creating the most comprehensive platform governance framework globally. The European Commission is now actively enforcing it – investigating Snapchat for child protection compliance, finding adult content platforms in breach, and partnering with the EU Intellectual Property Office on enforcement. Meanwhile, the Commission has proposed repealing the older Platform-to-Business (P2B) Regulation, consolidating digital rules under the DSA and Digital Markets Act (DMA). The question shifts from "what are the rules?" to "how do they bite?"
The regulatory architecture is built. For those shaping what comes next – whether in policy, technology, or investment – the conversation continues at Human x AI Europe in Vienna on May 19.
The Enforcement Phase Begins
A regulation's true character emerges not in its drafting but in its enforcement. The DSA entered that phase with force in early 2026.
On March 26, the European Commission announced preliminary findings that Pornhub, Stripchat, XNXX, and XVideos had breached the DSA by allowing minors to access their services. The same day, the Commission opened an investigation into Snapchat's compliance with child protection rules. These are not symbolic gestures. They signal that the Commission intends to use the DSA's enforcement mechanisms – including fines of up to six percent of global annual turnover – against platforms that fail to protect vulnerable users.
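The scale of that penalty ceiling can be made concrete. A minimal sketch of the arithmetic, using a hypothetical turnover figure (the function name and the example platform are illustrative, not drawn from any actual case):

```python
def max_dsa_fine(global_annual_turnover_eur: float, cap: float = 0.06) -> float:
    """Upper bound on a DSA fine: six percent of worldwide annual turnover."""
    return global_annual_turnover_eur * cap

# Hypothetical platform with EUR 10 billion in global annual turnover.
turnover = 10_000_000_000
print(f"Maximum fine: EUR {max_dsa_fine(turnover):,.0f}")  # EUR 600,000,000
```

For a platform of that size, the exposure runs to hundreds of millions of euros per violation, which is why even preliminary breach findings move compliance budgets.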
The pattern is instructive. The Commission is prioritizing child safety as its first major enforcement vector. This reflects both political salience and legal clarity: age verification failures are easier to demonstrate than algorithmic harms. Expect this sequencing to continue – clear violations first, systemic risk assessments later.
The Regulatory Consolidation
In November 2025, the Commission proposed repealing the Platform-to-Business (P2B) Regulation, arguing that both the DSA and DMA now provide more effective and far-reaching rules. This is regulatory housekeeping, but it reveals a strategic choice: consolidation over proliferation.
The P2B Regulation, adopted in 2019, established transparency requirements for how platforms treat business users – ranking parameters, data access, complaint mechanisms. Its provisions have been absorbed into the broader DSA framework, which covers not just business users but all users, and not just transparency but liability, content moderation, and systemic risk management.
This consolidation matters for compliance teams and legal departments. The regulatory surface area is shrinking even as the depth of obligations increases. Fewer instruments, but each instrument carries more weight.
The Brussels Effect in Motion
The DSA's global influence operates through a mechanism that scholars call the Brussels Effect: compliance costs make it cheaper for firms to adopt EU standards globally than to maintain separate systems for different jurisdictions.
As analysis from the Chicago Journal of International Law notes, the DSA's penalty structure – up to six percent of worldwide turnover – creates powerful incentives for platforms to harmonize their content moderation practices around European norms rather than maintain fragmented approaches. The result: EU rules shape platform behavior far beyond EU borders.
This creates tension with other regulatory regimes. The same analysis highlights conflicts with U.S. state laws in Texas and Florida that prohibit viewpoint-discriminatory content moderation. Platforms face contradictory pressures: the DSA incentivizes more aggressive moderation of harmful content, while some U.S. laws penalize moderation that appears politically motivated.
The practical resolution, for most platforms, is to default to the stricter standard. European rules become the baseline.
What the Transparency Data Reveals
The DSA's transparency requirements generate data that reveals how platforms actually implement the regulation. Research examining the DSA's Transparency Database shows a striking pattern: platforms overwhelmingly categorize content removals as violations of their Terms of Service rather than violations of national law.
This matters for two reasons. First, it suggests platforms prefer the flexibility of contractual enforcement over the rigidity of legal compliance. Second, it may underrepresent the actual scale of illegal content – if platforms classify removals as ToS violations rather than legal violations, regulators get a distorted picture of what's happening on their services.
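The skew is easy to see in aggregate. A sketch of the kind of tally researchers run over statement-of-reasons records; the field name and ground labels here are assumptions loosely modeled on the Transparency Database schema, and the records are invented for illustration:

```python
from collections import Counter

# Hypothetical statement-of-reasons records. The "decision_ground" field
# and its values are assumptions, not verbatim Transparency Database fields.
records = [
    {"decision_ground": "INCOMPATIBLE_CONTENT"},  # Terms of Service violation
    {"decision_ground": "INCOMPATIBLE_CONTENT"},
    {"decision_ground": "INCOMPATIBLE_CONTENT"},
    {"decision_ground": "ILLEGAL_CONTENT"},       # national-law violation
]

counts = Counter(r["decision_ground"] for r in records)
total = sum(counts.values())
for ground, n in counts.most_common():
    print(f"{ground}: {n} ({n / total:.0%})")
```

In the real data the imbalance is far starker than this toy sample, which is precisely the pattern the research highlights: contractual grounds dominate legal ones.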
The innovation implications are also significant. Rather than building differentiated legal compliance systems for each jurisdiction, platforms are embedding their Community Standards into technical architectures as default rules. Compliance becomes a design choice, not a legal afterthought.
The Innovation Question
Does the DSA enable or constrain innovation? The honest answer: both, depending on what kind of innovation.
The regulation creates demand for compliance tools, risk assessment systems, and content moderation technologies. This is complementary innovation – building around the regulatory framework rather than against it. European startups in the trust and safety space benefit from a regulatory environment that mandates their services.
But the DSA's procedural requirements – notice obligations, appeal mechanisms, transparency reports – impose costs that may constrain more radical product innovation. Every new feature must be evaluated not just for user value but for regulatory compliance. The question is whether these constraints are proportionate to the harms they prevent.
Academic analysis frames this through Schumpeter's concept of creative destruction: capitalist innovation proceeds through waves that revolutionize the economic structure from within. Regulation can channel where that creative energy flows. The DSA channels it toward safety and transparency, potentially at the cost of other directions.
The Institutional Architecture
The Commission's recent moves reveal an enforcement architecture taking shape. The partnership with the EU Intellectual Property Office announced in April 2026 signals that enforcement will be distributed across specialized agencies, not concentrated in a single body.
The first meeting of the Special Panel on child safety online in March 2026 suggests the Commission is building advisory structures to inform enforcement priorities. The workshop with enlargement countries on platform regulation indicates the DSA's geographic scope may expand as candidate countries align their frameworks with EU standards.
This is institution-building, not just rule-making. The DSA's long-term impact depends on whether these institutions develop the capacity, expertise, and political independence to enforce the rules consistently.
What to Watch
Three developments will shape the DSA's trajectory over the next twelve months.
First, the outcomes of current investigations. If the Commission imposes significant fines on platforms found in breach, it establishes credible deterrence. If enforcement stalls in procedural challenges, the regulation's teeth dull.
Second, the algorithmic transparency pilot. The Commission is conducting an in-depth analysis of algorithmic transparency and accountability at the European Parliament's request. The findings will inform whether the DSA's current provisions are sufficient or whether additional rules are needed.
Third, the interaction with the AI Act. As AI systems become central to content moderation and platform operations, the boundary between platform regulation and AI regulation blurs. The Commission's ability to coordinate these frameworks will determine whether Europe's digital rulebook remains coherent or fragments into overlapping regimes.
The DSA is no longer a proposal or a promise. It is operational law, actively enforced, with real consequences for platforms that fail to comply. The question now is not whether Europe will regulate platforms, but how effectively – and at what cost to innovation, competition, and the global digital economy.
Frequently Asked Questions
Q: What is the Digital Services Act (DSA)?
A: The DSA is an EU regulation that establishes liability, transparency, and content moderation requirements for online platforms. It entered full application in February 2024 and applies to all platforms serving EU users, with additional obligations for very large platforms.
Q: What penalties can platforms face under the DSA?
A: Platforms can face fines of up to six percent of their global annual turnover for DSA violations. The Commission has already opened investigations and issued preliminary breach findings against multiple platforms in 2026.
Q: How does the DSA affect platforms outside the EU?
A: Through the Brussels Effect, platforms often adopt EU standards globally rather than maintain separate systems. The DSA's penalty structure incentivizes harmonization around European norms, effectively exporting EU rules worldwide.
Q: What happened to the Platform-to-Business (P2B) Regulation?
A: The Commission proposed repealing the P2B Regulation in November 2025, arguing that the DSA and DMA now provide more comprehensive rules. Its transparency requirements have been absorbed into the broader DSA framework.
Q: What enforcement actions has the Commission taken under the DSA?
A: In March 2026, the Commission found four adult content platforms in preliminary breach for allowing minor access and opened an investigation into Snapchat's child protection compliance. The Commission also partnered with the EU Intellectual Property Office on enforcement.
Q: How does the DSA interact with the AI Act?
A: As AI systems increasingly power content moderation and platform operations, the boundary between platform and AI regulation blurs. The Commission is working to coordinate these frameworks, though the long-term relationship remains under development.