
The civil liberties group liber-net published a report titled “The Censorship Network: Regulation and Repression in Germany Today”. The report maps a network of government agencies, non-profits, academic departments, and think tanks involved in online content moderation. Its findings have sparked a debate about how these organizations operate, the sources of their funding, and the influence they may exert over public discourse.
Online content moderation has become a pressing concern in Germany and across Europe. With growing attention to disinformation, hate speech, and illegal content online, governments and platforms are under increasing pressure to act. The liber-net report offers a detailed look at how moderation efforts are structured in Germany and raises questions about transparency, independence, and accountability.
This article examines the report’s findings, the legal and regulatory framework governing moderation, reactions from civil society and political actors, and the broader implications for free expression and online governance.
Mapping the Moderation Network
Liber-net identifies more than 330 organizations participating in content moderation activities. These range from academic research institutes and think tanks to foundations, NGOs, and government offices. Many organizations receive public funding through grants intended to support research, education on disinformation, and prevention of hate speech. The report also provides databases detailing organizational structures and financial flows, alongside diagrams illustrating connections among actors.
Not all of these organizations are directly involved in removing content. Some focus on research, education campaigns, or public awareness initiatives. Others, including trusted flaggers, submit reports to social media platforms about content that may violate the law. Liber-net assigns each organization a rating that reflects its potential influence on moderation activities. The report notes that these ratings are approximate and do not represent definitive judgments about any group’s behavior.
The network is vast and multifaceted. Some organizations act locally, addressing regional or community-level issues, while others operate nationally or internationally. According to the report, certain foundations and think tanks have close ties to academic institutions, which can provide legal or methodological expertise. Meanwhile, government agencies contribute through policy guidance, regulatory oversight, or direct partnerships with platforms.
The report also highlights that while these organizations often describe their work as educational or preventive, their activities can indirectly shape what content remains visible online. This layered structure—combining research, reporting, and oversight—has raised questions about transparency and independence.
Legal and Regulatory Context
Germany’s online content moderation policies are largely shaped by two legal frameworks: the Network Enforcement Act (NetzDG) and the Digital Services Act (DSA).
NetzDG
Enacted in 2017, NetzDG obliges large social media platforms to remove clearly illegal content within 24 hours and other illegal content within seven days of notification. Non-compliance can result in fines up to €50 million. Platforms must notify users when content is removed and archive moderation decisions.
Critics argue that NetzDG can encourage preemptive content removal. Because platforms risk large fines, they may err on the side of caution and remove material that is legally permissible. Some analysts have also noted that NetzDG’s focus on large social networks leaves smaller platforms, which often have fewer compliance resources, largely outside its scope, potentially creating gaps in enforcement or inconsistent moderation across services.
Supporters contend that NetzDG has helped improve the speed and consistency of content removal and has clarified legal obligations for platforms. They also argue that it sets a standard for transparency, as platforms are required to report moderation statistics regularly.
Digital Services Act and Trusted Flaggers
The EU Digital Services Act, which began applying to the largest platforms in 2023 and to all regulated platforms in February 2024, establishes notice-and-action obligations, risk assessments, and transparency rules for services operating in the EU. A key feature is the role of trusted flaggers, organizations certified by national regulators to report potentially illegal content. Platforms must process reports from these organizations with priority.
In Germany, the Bundesnetzagentur certified REspect!, a reporting center run by the Youth Foundation of Baden-Württemberg (Jugendstiftung Baden-Württemberg), as the country’s first trusted flagger in 2024. The agency cited REspect!’s expertise, independence, and capacity to process reports in multiple languages. Platforms, however, retain final authority to decide whether content is removed.
Trusted flaggers are meant to support faster and more accurate identification of illegal content. In practice, this involves providing detailed reports, legal assessments, and evidence of violations. While flaggers do not remove content themselves, their recommendations carry significant weight with platforms seeking compliance with German and EU law.
Funding and Questions of Independence
A major theme of the liber-net report is the level of public funding many organizations in the network receive. REspect!, for instance, reportedly depends on government grants for approximately 95% of its budget, largely from the federal program “Demokratie leben!”. Observers suggest that such heavy dependence on state funding could influence priorities, even if unintentionally.
Supporters argue that public funding does not prevent organizations from acting impartially. Oversight mechanisms, reporting obligations, and public scrutiny can help ensure accountability. Nevertheless, the complex web of grants and institutional relationships highlighted in the report demonstrates that influence can be difficult to trace fully.
Some organizations have multiple funding sources, including EU programs, private foundations, and academic grants. These multiple streams can dilute potential bias but also make transparency challenging. The report stresses that clear documentation of funding and decision-making processes is essential for public trust.
Reactions from Civil Society and Politics
The report has generated debate among civil society, political actors, and legal experts. The Civil Society Alliances for Digital Empowerment (CADE) expressed concern that state-funded flaggers could blur the line between moderation and censorship. They emphasized the need for transparency, external oversight, and clear procedures for appeals.
Some German politicians have voiced concerns that government-backed organizations might inadvertently favor content aligned with official priorities. Others argue that trusted flaggers enhance online safety by helping platforms respond quickly to illegal content, improving compliance and protecting users from harmful material.
Flagger organizations themselves stress that they operate according to legal guidelines and maintain transparent processes. Regulators maintain that platforms retain ultimate authority over content removal and that users have avenues for appeal and dispute resolution.
These debates underscore the tension between freedom of expression and enforcement. While online safety is a priority, maintaining trust in moderation processes and avoiding overreach is equally critical.
Balancing Free Expression and Enforcement
Moderation systems face a persistent challenge: how to remove harmful content quickly without mistakenly removing lawful speech. Rapid enforcement can prevent harm but can also increase the risk of over-removal, especially if organizations rely heavily on public funding or have unclear decision-making protocols.
Transparency, independent oversight, and robust appeals mechanisms are essential. Users must understand why content is removed and have options to contest decisions. Similarly, public disclosure of funding and organizational structures can help prevent the perception that state influence is shaping moderation priorities.
The liber-net report suggests that without careful checks, networks of state-funded organizations could indirectly affect the visibility of political or social commentary online. These findings highlight the importance of accountability mechanisms that are both practical and accessible to ordinary users.
Germany’s Internet Freedom Landscape
Freedom House’s Freedom on the Net 2025 classifies Germany as “Free” but notes small declines in online freedom, citing rising self-censorship and prosecutions for content critical of politicians. Analysts observe that NetzDG’s compliance pressures encourage platforms to remove content preemptively, which can affect lawful expression.
The Digital Services Act, with its emphasis on transparency and trusted flaggers, introduces new dynamics. Civil society continues to monitor its implementation, particularly how it interacts with existing moderation networks and whether it introduces new risks or safeguards for free speech.
Institutional Responses
The Bundesnetzagentur has defended its approval process for trusted flaggers, noting checks on expertise, independence, and adherence to legal standards. Its president, Klaus Müller, emphasized that platforms retain final authority over content removal and that users can appeal moderation decisions.
Flagger organizations argue that their work strengthens law enforcement against illegal content and contributes to safer online environments. Civil society groups stress the importance of oversight, transparency, and public engagement to maintain trust and legitimacy.
Challenges and Recommendations
Observers and experts have suggested several measures to improve transparency and accountability. Publishing funding sources, governance structures, and operational procedures could clarify potential conflicts of interest. Independent review of flagger activity could help ensure fair and consistent moderation.
Safeguards to protect lawful political content remain important, as does diversifying funding sources to reduce reliance on a single stream. Accessible appeals processes, public education on moderation rules, and regular reporting of outcomes could further balance enforcement objectives with freedom of expression.
These measures do not eliminate challenges entirely but provide a framework for maintaining democratic values while enforcing laws against illegal online content.
Conclusion
Liber-net’s report highlights the scale and complexity of Germany’s state-funded content moderation network. By mapping organizations and funding flows, it illuminates the tension between ensuring online safety and protecting independence and free expression.
Transparency, oversight, and accountability remain crucial as Germany continues to implement the Digital Services Act and adapt its regulatory framework. The evolving debate around moderation networks underscores the ongoing challenge of balancing content enforcement with democratic principles and public trust.
References and Suggested Reading
Liber-net: The Censorship Network: Regulation and Repression in Germany Today (liber-net.org)
Bundesnetzagentur press release on first trusted flagger (bundesnetzagentur.de)
Euronews on state funding for REspect! (euronews.com)
CADE on trusted flaggers and free-speech concerns (cadeproject.org)
Freedom House: Freedom on the Net 2025 – Germany (freedomhouse.org)
ITIF analysis of Germany’s content moderation framework (itif.org)