Digital reporting points: This is what the debate about Trusted Flaggers is about

Trusted flaggers are reporting points for illegal online content. Since they were enshrined in law, they have been exposed to criticism from the right. The accusation is “censorship”.

At the beginning of October, the Federal Network Agency named the first German Trusted Flagger for online platforms: the “Respect” reporting center of the Baden-Württemberg Youth Foundation. A Trusted Flagger, or “trustworthy whistleblower” in translation, is meant to ensure that illegal content, hate speech and fake news on platforms such as Facebook and Instagram are reported and removed more quickly.

Criticism immediately poured in. A veritable wave of indignation over the alleged “censorship authority” built up on social media. Right-wing and conservative publications suspected a “digital Stasi”.

But what are Trusted Flaggers all about, and what exactly is the criticism? Do they really threaten our liberal democracy by restricting freedom of expression and reducing citizens to immature subjects, as some of the shrill criticism claims?

Why are Trusted Flaggers appointed now?

Quite simply: by naming Trusted Flaggers, the Federal Network Agency is following European law. In 2022, the EU passed the Digital Services Act (DSA), which applies in all member states. In February of this year it was transposed into German law (the “Digitale-Dienste-Gesetz”). The DSA is intended to provide better protection against illegal content on the internet and to push platforms such as Facebook, X, Tiktok and Instagram to take stronger action against it. This covers depictions of the sexual abuse of children, counterfeit products, and illegal content such as incitement to hatred, insults, threats of violence and anti-Semitism. The Federal Network Agency is responsible for implementing the law in Germany.

“Respect” was selected by the network agency because the organization has been offering people a contact point and expertise since 2017. In addition to “Respect”, there are numerous other organizations that work in a similar way.

What is crucial: platform operators such as Facebook or X are obliged by the DSA to review content reported by Trusted Flaggers. If they do not delete it, they must justify why the content remains online.

How does the “Respect” reporting center work?

According to Petra Densborn, chairwoman of the Baden-Württemberg Youth Foundation, four to five people work on the reports every day. They check whether the reported content could be criminal. If so, the report is passed on to the Federal Criminal Police Office (BKA), which investigates further.

If the BKA and the public prosecutor’s office conclude that there is initial suspicion of a crime, “Respect” contacts the platform operator with the reports. The new law then forces the platform to delete the content or to justify why the post can remain online. The organization receives around 85 reports every day. “In no case do we decide ourselves what is deleted,” Densborn told the “Süddeutsche Zeitung”. “We are not a sanctioning body.” By its own account, “Respect” has almost 40 percent of the reported content reviewed by the public prosecutor’s office.

How big is the state’s influence?

Trusted Flaggers must be independent of online platforms. The same does not apply to their relationship with politics. Although “Respect” is a private foundation project that in principle has to finance itself (through membership fees, foundation capital and third-party funds), it also receives state support. According to Klaus Müller, head of the supervising Federal Network Agency, the states of Bavaria and Baden-Württemberg and the Federal Ministry for Family Affairs provide money.

However, there are no signs of political influence aimed at prosecuting content that falls below the threshold of criminal liability. “Respect” is bound solely by the DSA. Critics do question Müller’s dual role, though: on the one hand, he heads an authority bound by the instructions of Robert Habeck’s Federal Ministry of Economics; on the other, he is also the national Digital Services Coordinator, who under EU rules is supposed to be completely independent.

The criticism of undue influence was reinforced by a clumsy press release from Müller’s agency. It said: “Platforms are obliged to respond immediately to reports from trusted flaggers. Illegal content, hate and fake news can be removed very quickly and without any bureaucratic hurdles.” This gave the impression that Trusted Flaggers, together with the Federal Network Agency, could determine at will what counts as illegal content. That is not the case. Fake news and hate speech do not have to be deleted across the board.

As a safeguard, the work of “Respect” and similar organizations is reviewed regularly. Trusted Flaggers are required to publish detailed annual reports on their work.

What are the allegations exactly?

Critics fear censorship. This fits the general right-wing topos that one can no longer express one’s opinion freely and that politically unpopular views are suppressed. The fact is: this is about illegal content that is (and always has been) criminally relevant. According to BKA statistics, 80 percent of the content reported each year by “Respect” and similar organizations in Germany gives rise to initial suspicion of a crime. Moreover, Trusted Flaggers are not a new phenomenon: online platforms worked with them to limit illegal posts even before the DSA came into force.

Important: any user whose content has been deleted can contest the decision. There are reporting points for this as well, along with the option of having the deletion reviewed in court. A so-called out-of-court dispute settlement body can also be called upon.

Another accusation: Trusted Flaggers encourage a culture of denunciation among the population. However, there is no evidence that reporting centers foster a culture of shaming. Experience so far shows instead that reporting offices themselves face massive attacks intended to hinder their work. The Amadeu Antonio Foundation’s portal, where anti-feminist posts and incidents can be viewed, was initially flooded with hate messages — which only underscored the need for such reporting points.

Another fear among critics is that online platforms will over-block preemptively, deleting legal content in order to avoid legal trouble. Meta, Facebook’s parent company, has in places made its “community standards,” which define permitted and prohibited content, stricter than German law requires.

The accusation has some merit, but it is doubtful whether a certain degree of self-policing amounts to a serious restriction of freedom of expression. Online platforms have their own guidelines and often remove content regardless of the law — simply because they can.

A weak point of the DSA is the term “hate speech,” which is vaguely defined. Hate speech can often be regarded as an expression of opinion, even if it is offensive or even inhumane. It is also problematic that platform managers are personally liable if their companies violate the DSA, for example by deleting significantly too little. This can have the opposite effect: cases may repeatedly end up in German courts because Facebook and other platforms have deleted legal opinion posts.

Sources: AFP

Source: Stern
