January 2026 - Digital Policy | Self-regulation

Strengthening Digital Trust in Europe

Digital trust in Europe depends on effective hotlines, well-resourced oversight, and rights-respecting rules. Alexandra Koch-Skiba, Head of eco’s Complaints Office, explains how these pillars protect users and ensure safe online spaces.


©M.photostock | istockphoto.com

Digital trust has become one of the most critical pillars of Europe’s expanding online ecosystem. At a time when policymakers are shaping ambitious regulatory frameworks – from the Digital Services Act (DSA) to the upcoming CSAM Regulation and the new rules on political advertising – users and companies need assurance that these measures are supported by reliable, effective, and rights-respecting structures. Trust does not emerge from legislation alone; it is built when people experience systems that work, institutions that respond, and rules that protect without overreaching.

In this article, I explore these three interconnected pillars – proven complaint structures, adequately staffed oversight bodies, particularly the Digital Services Coordinator (DSC), and regulation that is rights-respecting and builds on proven structures – and why strengthening them is essential to securing digital trust in Europe.

Hotlines as a proven cornerstone of online safety

When it comes to protecting minors online and tackling illegal online content, the eco Complaints Office (also known as a hotline) has played a central role for three decades: acting as an alert platform and contact point for users, assisting law enforcement, and giving guidance to providers of hosting services and platforms. With a self-regulatory approach and strong connections to relevant stakeholders worldwide, we help ensure that illegal content – particularly child sexual abuse material (CSAM) – is removed quickly and consistently.

Every year, the German federal government publishes its take-down report, offering insight into the scale of online CSAM, the responsiveness of providers of hosting services and platforms, and the effectiveness of established reporting mechanisms. The 2024 report, released in June 2025, showed that 31,536 reports of child sexual abuse material were recorded in 2024 by the German hotlines and the German Federal Criminal Police Office (BKA). Although this figure is below the all-time high of 2023, it still sits well above all pre-2023 levels. The reality is clear: the volume remains persistently high by historical standards.

From the perspective of the eco Complaints Office, this trend was also reflected in the complaint volumes we received throughout 2025. During the year, the number of incoming reports remained consistently high. By year’s end, the total approached the record levels of 2023, confirming that the decline observed in 2024 was a temporary dip rather than a lasting downward trend.

One of the most significant findings in the federal take-down reports concerns the role of hotlines. Of all CSAM URLs reported to the BKA in 2024, 99.2% came from German hotlines. This is not a coincidence. It demonstrates once again the indispensable function hotlines play as low-threshold, trustworthy, and accessible points of contact. Our work complements law enforcement, but we are far easier to reach – crucially, also anonymously. This enables users to report illegal content early and without hesitation, which in turn allows rapid processing and removal.

The report also examined take-down speeds. Although the average removal time for content hosted in Germany increased slightly, the success rate remains high: around 56% of the material was removed within two days, and nearly 99% within one week. For content hosted outside Germany, providers succeeded in removing 84.17% of the material within four weeks – a strong result, especially considering the limitations involved when content is hosted on foreign server infrastructure.

These numbers underscore a simple truth: effective hotlines work. They enable users to report, help law enforcement allocate resources efficiently, and ensure quicker removal of illegal material by notifying and supporting providers.

In the broader conversation about digital trust, hotlines are often overlooked, yet they represent one of the most concrete, practical, and proven tools Europe has at its disposal. For roughly 30 years, the eco Complaints Office has acted as a central point of contact for such reports and continues to collaborate closely with providers, with international partners through INHOPE, and with law enforcement authorities worldwide.

As we navigate an increasingly complex digital environment, the role of these well-established structures must remain central. They demonstrate that trust is built not only through regulation, but also through trusted self-regulatory mechanisms that empower users and deliver results.

Effective oversight requires adequate resources: The DSA and the role of the DSC

The DSA is a core legal framework that sets out responsibilities and obligations for tackling illegal content online.

While hotlines support trust and enable effective enforcement on a self-regulatory basis, regulatory oversight structures also play a decisive role in Europe’s digital landscape. The Digital Services Coordinators (DSCs), established under the DSA, are responsible for supervising digital services in the Member States, supporting cross-border coordination, and ensuring that providers meet their legal obligations. Yet despite this central role, DSCs – particularly in Germany – face significant staffing challenges that jeopardize their ability to fulfil their mandate. In our view, they must be equipped with sufficient resources to carry out their mandates effectively.

According to the national law designating the DSC in Germany and the DSC’s own estimates, over 91 permanent positions are required for it to fulfil its tasks effectively. However, the 2025 federal budget provided funding for only 47.8 positions, of which approximately 37 were filled. This means that less than half of the required capacity was available. It is evident that such limited resources are insufficient for the DSC to act with the necessary authority, consistency, and timeliness.

This resourcing problem becomes even more critical when considering the additional responsibilities the DSC has taken on. With the new law on the transparency and targeting of political advertising (PWG), the DSC became responsible for implementing and supervising the EU Regulation on political advertising from October 2025. While an additional 17.57 positions were foreseen in the explanatory memorandum, this did not resolve the fundamental issue: even the baseline staffing levels remained far too low.

Insufficient staffing has tangible consequences. Without adequate capacity, the DSC cannot provide companies with the clear guidance they need to implement complex regulations. Nor can it reliably network with relevant stakeholders, enforce rules, or support cross-border coordination – all of which are central to the DSA’s framework. Companies face uncertainty, oversight becomes inconsistent, and, ultimately, users may experience a decline in the safety and reliability of digital services.

Digital trust depends on regulators who are visible, responsive, and equipped to perform their tasks. As I have emphasized previously, a significant increase in staffing is urgently necessary.

A rights-respecting path forward for the CSAM Regulation

A number of laws and regulations are currently being developed, drafted, or negotiated. From our point of view, these must be proportionate, rights-respecting, and built on proven structures. In this context, I want to put a spotlight on the proposed CSAM Regulation, which may affect users on the one hand and industry and providers – including the measures to tackle CSAM already in place – on the other.

The ongoing negotiations surrounding the EU’s CSAM Regulation represent a defining moment for Europe’s digital policy. The goal – preventing and combatting child sexual abuse – is essential and undeniable. Yet the mechanisms chosen to achieve this goal must align with fundamental rights, technological realities, and the security of all users.

In this context, the recent decision by the EU Council to refrain from imposing mandatory detection orders – particularly for end-to-end-encrypted services – is a welcome and important development. The European Parliament, in its position, similarly rejected mandatory suspicionless scanning of private communications. From the perspective of the Internet industry, it is crucial that this stance prevails in the upcoming trilogue negotiations.

Mandatory, suspicionless searching of private communications is incompatible with fundamental rights and technically misguided, particularly where it would have to take place in encrypted environments. Mandatory scanning mechanisms in encrypted – and thus secured – services would significantly weaken encryption, thereby compromising the security of millions of users, including children and vulnerable individuals. They would create new vulnerabilities for misuse, expose sensitive communications, and undermine trust in secure communication tools. The risk is not hypothetical; it is systemic.

Even in non-encrypted environments, mandatory searching for CSAM or grooming would have a profound impact on all of us and on how we use messengers and other communication tools.

For these reasons, mandatory detection orders should no longer appear – directly or indirectly – in the CSAM Regulation. Any form of “de facto” obligation must also be avoided. Even indirect pressure would threaten encryption and could force providers to implement surveillance mechanisms that erode user security.

At the same time, it is important that the Commission, Council, and Parliament agree on a stable, legally secure framework for voluntary detection tools. Many companies already use voluntary mechanisms to address known CSAM, yet they do so in a regulatory environment marked by legal uncertainty. A clear, balanced and permanent framework would help ensure that voluntary procedures can continue effectively and safely.

The debate around the CSAM Regulation is a test of Europe’s ability to balance child protection with fundamental rights. If policymakers can strike the right balance, the EU will be able to deliver legislation that protects children, respects privacy, and maintains the security of digital communications.

In any case, the proposed CSAM Regulation covers measures and areas of action for which proven structures and forms of cooperation already exist. In this context, I would like to recall the role hotlines play as national alert platforms for users and as trusted partners and notifiers that inform law enforcement and providers of hosting services about CSAM distributed online. Proven measures, structures, and cooperation should be incorporated when developing new legislation, not counteracted. This is why we continue to call on the EU institutions to explicitly include collaboration between the new authorities and the existing hotlines and their network and umbrella organization, INHOPE.

Conclusion: A coherent approach to digital trust

Strengthening digital trust in Europe requires more than ambitious policy goals. It demands a coherent approach that integrates reliable user-facing mechanisms, competent supervisory authorities, and balanced regulatory choices. Hotlines continue to demonstrate their crucial value, enabling swift removal of illegal content and providing an accessible, effective reporting structure. Supervisory bodies such as the DSC must receive the staffing and resources necessary to perform their responsibilities effectively. And legislative proposals, particularly the CSAM Regulation, must protect children without compromising encryption, privacy, or the security of all users.

Europe has the opportunity to lead by example in creating a digital environment that is safe, free, and trustworthy. This requires reinforcing the structures that have already proven their effectiveness, ensuring that oversight bodies are adequately equipped, and maintaining a firm commitment to fundamental rights. Only through this balanced, practical approach can we sustain and strengthen digital trust: for users, for businesses, and for society as a whole.

More information: https://international.eco.de/topics/policy-law/eco-complaints-office/

 

📚 Citation:

Koch-Skiba, Alexandra. (January 2026). Strengthening Digital Trust in Europe. dotmagazine. https://www.dotmagazine.online/issues/digital-trust-policy/digital-trust-in-europe

 

Alexandra Koch-Skiba has been registered as an attorney since 2005. During her legal education, she specialized in criminal law and the law on the protection of minors. As the Head of eco’s Complaints Office, she is in charge of the hotline’s management and supports report handling, in particular with regard to legal issues. She represents the hotline at the European and national levels, e.g. in European networks, in liaising with law enforcement and other relevant stakeholders, and at events. Moreover, she represents eco on topics related to youth protection on the Internet.

 

FAQ

1. What are the main pillars of digital trust in Europe?

According to Alexandra Koch-Skiba of eco, digital trust relies on three pillars: accessible hotlines for illegal content reporting, well-resourced oversight bodies like the DSC, and regulation that respects users' rights while ensuring safety.

2. Why are hotlines like eco’s Complaints Office so important?

Hotlines are low-threshold, anonymous points of contact for users to report illegal content such as CSAM. They help law enforcement and platforms act quickly by enabling early intervention and fast removal.

3. What is the role of the Digital Services Coordinator (DSC)?

The DSC oversees compliance with the Digital Services Act (DSA), coordinates enforcement across borders, and supports service providers. However, in Germany, it is under-resourced, which limits its ability to function effectively.

4. How does the CSAM Regulation affect encryption and user privacy?

Mandatory scanning of private communications—especially encrypted messages—risks undermining digital security for everyone. eco advocates for a balanced, rights-respecting approach that avoids surveillance mechanisms.

5. What staffing issue does the DSC face in Germany?

As of 2025, Germany’s DSC had fewer than 40 of the over 90 positions needed to meet its obligations. This gap hinders its ability to guide companies, enforce rules, and coordinate across borders.

6. How should voluntary CSAM detection tools be regulated?

A stable legal framework is needed to allow voluntary detection tools to operate without creating legal uncertainty. eco supports clear rules that preserve user rights while enabling trusted interventions.

7. Why should existing hotlines be integrated into future EU regulations?

Hotlines like those operated by eco have decades of experience and global cooperation through INHOPE. New laws should build on these proven systems—not bypass them—to ensure effective, user-trusted enforcement.