Denmark’s Presidency Secures Compromise, Though Tech-Scanning Mandate Remains Controversial
Copenhagen, November 26, 2025 — After months of contentious debate, EU member states have reached a political agreement on a revised legislative framework aimed at combating child sexual abuse (CSA) online. The compromise, brokered under Denmark’s EU Council Presidency, stops short of mandating the scanning of end-to-end encrypted messages—a provision Denmark had initially championed—but establishes a more robust and permanent legal architecture for detecting and removing child sexual abuse material (CSAM).
The Danish Ministry of Justice confirmed the agreement in a statement issued today, marking a pivotal moment in the EU’s efforts to protect children in the digital sphere. Justice Minister Peter Hummelgaard (Social Democrats) hailed the deal as “absolutely crucial” while expressing disappointment that stronger measures against tech giants did not prevail.
A Step Back from Mandatory Scanning
Originally proposed by the European Commission in May 2022, the CSA Regulation sought to impose legally binding obligations on digital service providers—including messaging platforms such as WhatsApp, Signal, and iMessage—to detect, report, and remove CSAM, even within encrypted communications. Under the initial plan, providers would have been required to deploy client-side scanning technologies capable of identifying both known and novel abusive imagery without breaking encryption outright.
However, the proposal faced stiff opposition from privacy advocates, civil society organizations, and several member states—including Germany—concerned about the implications for fundamental rights to privacy and secure communication. In October 2025, Denmark’s EU Presidency officially withdrew the mandatory scanning clause following resistance from its own coalition partner, the Moderates, who deemed it “too intrusive.”
“Let’s be clear: I would have preferred the EU to take bolder action—including requiring tech companies to actively detect CSAM in private communications,” Hummelgaard stated. “But given the political realities, this compromise represents real progress. The alternative—failing to act—was never acceptable.”
Permanent Obligations Replace Voluntary Regime
Currently, EU law permits voluntary detection of CSAM by online platforms under a temporary regulation set to expire in April 2026. The new agreement transforms this into a permanent, legally binding obligation—but with important caveats.

Under the compromise:
- Service providers must implement risk-assessment protocols to identify platforms or features susceptible to CSAM dissemination.
- Detection measures are limited to unencrypted data or content shared publicly or semi-publicly (e.g., cloud storage, social media posts).
- End-to-end encrypted private messages remain outside the scope of mandatory scanning—a concession to privacy concerns.
- A strict oversight framework will be established, including judicial authorization for detection orders and independent review mechanisms.
The regulation also introduces stricter reporting timelines, enhanced cooperation between EU law enforcement agencies via Europol’s EC3 unit, and new requirements for age verification in high-risk online environments such as live-streaming and gaming platforms.
Civil Society: Cautious Approval
Child protection organizations, including Save the Children Denmark and Børns Vilkår, welcomed the agreement as a necessary step forward but emphasized its limitations. “While we support strengthened obligations for platforms like Meta, Google, and Microsoft, the failure to address private encrypted channels leaves a dangerous blind spot,” said a joint statement from the two NGOs. “Predators are increasingly migrating to encrypted apps—this law must evolve to meet that threat.”
Digital rights groups, meanwhile, warned against mission creep. “Any system designed to scan private communications, even with safeguards, sets a precedent that authoritarian regimes will exploit,” cautioned European Digital Rights (EDRi) in a separate briefing.
Next Steps: Trilogue Negotiations
The Council’s political agreement now paves the way for formal trilogue negotiations with the European Parliament, which has taken a more assertive stance on both child protection and digital rights. The European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (LIBE) is expected to push for stronger privacy safeguards while reaffirming the need for effective CSAM detection.
With Denmark’s Council Presidency concluding at the end of December 2025, Minister Hummelgaard stressed urgency: “We cannot afford delays. Every day without a permanent legal framework is a day when abusers operate with impunity and children remain at risk.”
If adopted in early 2026, the CSA Regulation would represent the EU’s most comprehensive legal instrument to date against online child sexual exploitation—balancing, however uneasily, the imperatives of child safety and digital privacy in the 21st century.
— Nordic Business Journal
