The Digital Services Act (DSA), which came fully into force on 17 February 2024, is now the cornerstone of the regulation of online services in the European Union.
Proceedings opened, guidelines published, trusted flagger activity under way... what assessment can be made after two years of application?
What is the DSA?
The Digital Services Act (DSA) sets out common rules for online platforms, search engines and other intermediary services to govern the way they deal with illegal content as defined by current law.
The regulation does not create any new offences, but imposes enhanced transparency, reporting, content removal and risk management obligations on platforms, with stricter requirements for very large platforms.
As far as minors are concerned, the DSA lays down specific obligations designed to ensure a high level of protection: restrictions on certain forms of targeted advertising, more protective default settings and an assessment of the risks that services may pose to their rights, safety and development.
VLOPs face up to their responsibilities
The DSA's first real test concerns the very large online platforms (VLOPs), i.e. services with more than 45 million monthly active users in the European Union, which concentrate the bulk of the Regulation's most demanding obligations. The link between Article 28 (protection of minors) and Articles 34 and 35 (assessment and mitigation of systemic risks) is at the heart of the regulation.
These platforms must identify, analyse and mitigate the systemic risks associated with the operation of their services, including those affecting the fundamental rights of minors. These risks include exposure to pornographic or violent content, cyberbullying, child sexual abuse and the effects of certain design and recommendation logics.
Since 2024, the European Commission has initiated several formal proceedings on this basis.
- On social networks, the TikTok case illustrates the gradual escalation from monitoring to formal challenge. Proceedings were opened against TikTok in February 2024, notably concerning the protection of minors, advertising transparency and researchers' access to data. The Commission then adopted preliminary findings on 14 May 2025 concerning TikTok's advertisement repository, under the transparency obligations.
- On 5 February 2026, it issued new preliminary findings concerning what it described as addictive design. These include features such as infinite scrolling, autoplay, push notifications and a highly personalised recommendation system, which are linked to the obligations to manage and mitigate systemic risks set out in Articles 34 and 35.
- At the same time, in May 2024, the Commission opened proceedings against Meta (Facebook and Instagram), including a section on risks to minors and the management of certain content.
- On 23 October 2025, the Commission issued preliminary findings against both TikTok and Meta concerning researchers' access to data, and against Meta specifically concerning certain mechanisms for notifying illegal content and for user redress.
- In the case of X, proceedings were initiated in December 2023 and the Commission issued preliminary findings on 11 July 2024 concerning, among other things, misleading design practices (dark patterns), advertising transparency and access to data for researchers.
- On 5 December 2025, the European Commission fined X €120 million, the first large-scale penalty based on the DSA.
- In January 2026, the Commission opened a new formal investigation into Grok, the AI integrated into X, focusing on the risks of disseminating illegal content and manipulated content, including non-consensual sexual content.
- Marketplaces have also been targeted. Proceedings were opened against AliExpress in March 2024, and against Temu in October 2024, concerning risk management in relation to the sale of illegal products and recommendation systems. In February 2026, a formal investigation was opened against Shein.
Two years after it came into force, the DSA has well and truly entered the litigation phase. These conclusions do not yet constitute sanctions, but they do open up an adversarial phase between the Commission and the platforms.
Protection of minors: guidelines to clarify the framework
The guidelines published by the Commission in 2025 on the protection of minors have clarified expectations for service design: age verification for high-risk services, protective default settings for minors' accounts, supervision of recommendation systems and easier access to national assistance schemes.
The development of a European age verification prototype, based on a trusted third party and a double anonymity mechanism, illustrates this desire for operationalisation.
For the e-Enfance/3018 Association, age verification remains a decisive lever for making the requirements of Article 28 effective.
Trusted flagger status: from legal recognition to the test of real-world reports
Article 22 of the DSA introduced the status of trusted flagger, enabling entities with recognised expertise in the detection of illegal content to obtain priority treatment of their reports by the platforms.
The e-Enfance/3018 Association was the first entity designated as a trusted flagger in France by Arcom. What sets 3018 apart is its intervention model, based on human support provided by qualified counsellors specialised in the protection of minors.
This model makes it possible to qualify situations precisely in legal terms, to identify any related offences and to bring to light other forms of digital violence that were initially invisible. It guarantees support that goes beyond the isolated removal of content: advice, practical solutions and tailored follow-up to prevent digital violence from worsening or recurring. The 3018 system relies on human and legal expertise, which is essential for assessing situations accurately and detecting emerging trends.
Next week, the Association e-Enfance/3018 will publish its first transparency report as a trusted flagger. This document will provide a detailed overview of the reports submitted, processing times, removal rates and cooperation arrangements with the major platforms. It will also analyse the types of violence observed, shifts in online practices and the emerging trends identified through support for victims. These elements will help inform the assessment of systemic risks provided for in Articles 34 and 35 of the DSA, by providing an interpretation grounded in concrete situations encountered in the field.
This demand for quality implies resources commensurate with the responsibilities entrusted to us. The e-Enfance/3018 Association advocates the introduction of long-term funding for trusted flaggers. Such a system would ensure that a high level of legal expertise and human support is maintained, which is essential if reports are to be accurately qualified and systemic phenomena of digital violence are to be detected and prevented, beyond the mere technical removal of content.
Two years on: regulation in the consolidation phase
The next few months will be a key stage.
The European Commission is due to adopt guidelines relating to Article 22 in order to clarify the role of trusted flaggers in the fight against illegal content.
Two years after it came into force, the DSA has profoundly changed the legal framework applicable to platforms. Obligations relating to the protection of minors are now integrated into risk management mechanisms and are subject to effective controls.
The coming period will enable a more detailed assessment of the system's ability to produce measurable effects on the protection of minors online.



