The EU's Digital Services Act (DSA) brings far-reaching changes to the internet: large platforms such as Facebook, Google and Amazon must, among other things, screen their own content for illegal material, comply with transparency requirements and grant researchers access to their data. Ranjana Achleitner from the Institute for European Law at Johannes Kepler University speaks to the OÖN about the opportunities and possible weak points of the regulation, which applies in full from February 2024.
OÖN: From the EU’s perspective, what is the basic intention behind the Digital Services Act?
Ranjana Achleitner: The premise is that online platforms can be harmful to society because of how they are structured. That is why the DSA stipulates, for example, that platforms themselves must monitor whether and which risks they pose. Critics point out that the platforms are operated by private companies: anyone is free to use them or, if they take issue with them, not to. To my mind, that argument falls short: these platforms now play an important role in our society.
Is the EU shifting responsibility onto the companies with this self-regulation?
Yes, although the question is whether that responsibility would actually lie with the EU or with the nation states. But the platforms essentially act like judges: when it comes to content, they have to weigh fundamental rights against each other, for instance personal rights against freedom of expression. The rationale is that legal enforcement on the internet would otherwise take too long. In addition, the results of the risk analyses the platforms must carry out are in turn evaluated by private audit bodies that are paid by the platforms. The EU monitors the process, but overall a large number of private actors are responsible for very serious decisions. And that is before we even mention that many of these tasks are carried out by algorithms rather than by humans.
Does the public have access to these processes?
Coordination centers will be set up in all Member States. Researchers can apply for access to data from the risk analyses. For now this right exists only on paper, and in practice the question will be how extensive that access really is. But it would be huge progress. Some scientists are already receiving data, though not all. Optimistically, I hope that research will even take on a monitoring function in practice. Companies must also publish transparency reports annually.
What role does the EU Commission play in this structure?
The EU Commission has created an immense position of power for itself. It is involved in drafting and implementing the regulation and can subsequently flesh it out with further legal acts. The question is whether it will be able to find the highly qualified staff to keep pace in the future: it is now competing directly with Meta and Google for them.
The DSA is often sold as "the end of hate on the internet".
I see that critically. The DSA does not define what content is illegal; that remains a matter of national or EU law. It only prescribes how users can take action against problematic content. I also see room for improvement in child protection: minors' data, for example, may no longer be used for targeted advertising in the future, but the question is how websites determine whether users are underage.
Finally: How do you rate the DSA?
Despite all the criticism, it breathes new life into the discussion. But for it to really become an effective tool, it needs rigorous implementation. Because of the many different actors involved in its oversight, it risks being weaker in practice than on paper. The role of the companies must be strictly supervised.
Source: Nachrichten