Protection of minors online: when judges follow in the footsteps of European legislators

Making digital platforms more accountable

In less than forty-eight hours, on 24 and 25 March 2026, two American verdicts marked a turning point that we would be wrong to underestimate.

In New Mexico, Meta was ordered to pay $375 million in damages for 37,500 willful violations of the Unfair Trade Practices Act. The case rested on an undercover investigation in which state agents posed as children on Instagram, Facebook and WhatsApp, documenting the sexual solicitations they received and the inadequacy of the moderation responses. What persuaded the court was Meta's knowledge of the facts.

The following day, in California, Meta and Google were found liable for negligence in the very design of their services. For the first time, an American jury found that Instagram and YouTube had been designed to maximise engagement to the detriment of minors' mental health. On the stand, Mark Zuckerberg had to admit that Instagram had waited until 2022 to verify the age of its users.

At the same time, multidistrict federal litigation has consolidated thousands of cases against Meta, TikTok, Snap and Google, accused of building addictive mechanisms that deliberately target teenagers. The federal judge in charge refused to shield the platforms, ruling that claims based on deliberately addictive design could fall outside their statutory immunity.

Through case law, the American courts are constructing what the European legislator laid down in statute with the Digital Services Act (DSA).

The DSA: the European lever for holding platforms accountable

For its part, Europe is stepping up its investigations. After TikTok, Meta and X, Snapchat now falls within the scope of DSA scrutiny. On 26 March, the European Commission opened formal proceedings over a series of suspected shortcomings: age verification mechanisms, grooming and the recruitment of minors for criminal activities, default account settings deemed insufficient, the dissemination of information on the sale of prohibited products, and the mechanisms for reporting illegal content.

For a long time, the debate on platform responsibility was confined to content moderation. Who should remove what? By what deadline? According to what criteria? This framework, necessary though it is, has its limits.

The Snapchat investigation illustrates, once again, the DSA's logic as an instrument of ex ante regulation of systemic risks. The text imposes an ongoing obligation on very large online platforms to identify, analyse and mitigate risks, particularly where they affect the fundamental rights of minors. Two years after it came into force, its effects are beginning to be felt.

Convergence of regulators and judges on the liability of platforms

At the same time, the European authorities are separately investigating four major pornographic sites - Pornhub, Stripchat, XNXX and XVideos - for the lack of effective age verification mechanisms. These investigations echo the parallel actions of Arcom in France and Ofcom in the UK on the regulation of pornographic sites, forming a common front between national and European regulators on the issue of minors' access to adult content.

These battles are in line with those the Association has brought, jointly with La Voix de l'Enfant, before the courts to restrict minors' access to pornographic sites.

In Ireland, the Data Protection Commission has opened an investigation into X and its Grok artificial intelligence system over the creation and distribution of sexual deepfakes. Non-consensual intimate images are no longer treated simply as illegal content to be moderated after the fact, but as potentially unlawful processing of personal data from the moment they are created. Platforms could thus be held responsible not only for what they disseminate but also for what they allow to be produced.

In the Netherlands, a court ordered Meta to offer a non-algorithmic news feed, a decision upheld on appeal on 11 March 2026 with the penalty raised to €10 million. In the UK, the ICO sanctioned Reddit, enshrining a principle worth emphasising: the absence of an effective age verification mechanism is, in itself, an autonomous legal fault.

In France, litigation has also crossed a new threshold. In September 2025, Arthur Delaporte, a member of parliament, referred to the Paris public prosecutor's office the failings identified by the parliamentary commission of inquiry into TikTok. In November, the prosecutor's office opened a preliminary investigation, entrusted to the cybercrime unit. Then, on 26 March 2026, the Minister for Education, Édouard Geffray, sent a report to the public prosecutor under article 40 of the Code of Criminal Procedure. His ministry had created a TikTok account posing as a 14-year-old: in less than twenty minutes, without a single "like", the recommendation feed served up depressive videos, self-harm tutorials and incitements to suicide.

This alignment goes beyond the usual divisions between legal systems, national laws and sectoral regulations. It reflects a shared determination, built up case by case, investigation by investigation, to show the platforms that children's health, development and safety cannot be adjustment variables in an economic model based on capturing attention.

Twenty years in the field, now a recognised reality

As a trusted flagger appointed by Arcom, the Association e-Enfance / 3018 is confronted daily with the situations and mechanisms of violence suffered by minors online. These children are not only exposed to dangerous content; they grow up in environments designed to heighten their vulnerability.

Recommendation systems, interfaces and engagement mechanisms produce documented effects - and are now being sanctioned. When a service is designed to encourage exposure to risky content or solicitations, responsibility cannot be diluted into individual use or shifted onto families.

As long as the business model of digital platforms remains incompatible with the safety, health and healthy development of minors, no peripheral measure will suffice.

Let us work together to combat online harassment and violence!