DIGITAL SERVICES ACT (“DSA”): What has already changed on digital platforms since 2024?

The Digital Services Act (“DSA”), adopted by Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022, marked a structural shift in the way digital services operate within the European Union. Far from being an abstract regulatory exercise, the DSA has begun to produce visible effects on platform architectures, interfaces and content moderation processes, concretely altering the experience of users, content creators and economic operators.

The Regulation applies to a wide range of intermediary services and online platforms, including marketplaces, social networks, content-sharing platforms, app stores and travel and accommodation platforms, regardless of their size or place of establishment, provided that they offer services to recipients located in the European Union.

For years, the governance of the digital space was based on a predominantly self-regulatory model, characterized by opaque technical decisions, unclear moderation criteria and limited possibilities for users to react.

It is important to note that the DSA does not seek to eliminate platform autonomy, but rather to transform that model into a legally framed process.

One of the most significant changes introduced by the DSA concerns the requirement for accessible and effective mechanisms to report illegal content. Platforms are now required to provide simple and intuitive tools enabling users to flag illegal goods, services or content, as well as to cooperate with qualified entities (“trusted flaggers”).

This requirement has already translated into visible changes on major platforms. On platform X, for example, illegal content can now be reported directly via the menu associated with each post, a solution similar to those already available on Facebook, Instagram, TikTok and Pinterest. Apple services have likewise integrated clearer and more intuitive mechanisms for reporting illegal content. Reporting illegal content has thus ceased to be a residual technical procedure and has become part of the normal user experience on digital platforms.

At the same time, the DSA significantly strengthens transparency in moderation decisions. Whenever content is removed, accounts are suspended or visibility is restricted, platforms are legally required to provide clear, specific and reasoned explanations for such decisions. Moderation decisions are no longer final and unchallengeable; they now form part of a genuine oversight framework that includes internal complaint-handling mechanisms and out-of-court dispute resolution pathways.

The impact of the DSA is also visible in recommendation systems and in the way content is presented to users. Platforms are now required to provide information on the main criteria used to rank feeds and, in the case of very large online platforms, to offer options allowing users to disable profile-based personalization. Algorithmic logic ceases to be an imposed rule and becomes an informed choice.

In the field of digital advertising, the Regulation imposes clear rules on ad identification, reinforces transparency regarding why specific advertising content is displayed and prohibits the use of sensitive data for targeting purposes.

Advertising based on profiling of minors is expressly prohibited, reflecting a paradigm shift in the protection of vulnerable audiences. The protection of minors assumes unprecedented centrality under the DSA, requiring platforms to adopt concrete measures to safeguard their privacy, security and physical and mental well-being. These requirements are already reflected in design changes, functional restrictions and new age-verification models, guided by the principles of proportionality and personal data protection.

At the institutional level, the DSA is based on a shared supervision model between the European Commission and national authorities designated as Digital Services Coordinators. The Commission plays a particularly active role with respect to very large online platforms and search engines, exercising investigatory powers, issuing information requests and ensuring continuous monitoring. Enforcement has moved beyond theory and has become part of European regulatory practice.

It may therefore be concluded that, in this new framework, self-regulation does not disappear but is no longer sufficient. The exercise of technological power now implies procedural transparency, the possibility of challenge and effective legal accountability.

The Digital Services Act does not merely redefine rules, it redefines behavior.

Failure to comply with the obligations laid down in the DSA may result in the imposition of significant financial penalties of up to 6% of the provider’s total worldwide annual turnover.

Belzuz Abogados, S.L.P. has experienced lawyers in Digital and Regulatory Law who can provide legal advice on this matter.
