OpenAI has announced the global rollout, starting at the end of September, of parental controls integrated directly into ChatGPT.
Available free of charge to all users with an account, this feature lets a parent link their account to that of their child (aged 13 and above).
The challenge: protecting children in their use of AI by providing greater supervision of interactions.
THE SPECIFIC FEATURES OF THIS PARENTAL CONTROL
This new parental control feature for ChatGPT includes several functionalities:
- An alert system: Parents are notified in the event of risky behaviour, particularly when the child repeatedly engages in conversations about depression, suicide or self-harm.
- Flexible configuration: Parents can disable certain features such as history or give ChatGPT specific instructions on how to respond to a teenager.
- A guarantee of confidentiality: Parents do not have access to the details of their child's exchanges with the AI, but are alerted in the event of any concerning signals.
- Study mode: Parents can activate a "study mode" in their child's account, limiting discussions between the teenager and the AI to school-related topics.
THE E-ENFANCE ASSOCIATION / 3018 CALLS FOR FURTHER ACTION
The e-Enfance/3018 Association welcomes this step forward in protecting minors online, but there is still room for improvement:
- Provide better supervision of the child–machine relationship, including the ability to set time limits on usage.
- Provide clear, appropriate guidance for adolescents in distress by redirecting them to specialised associations, helplines and counselling services such as 3018 or 3114 in France and the SAFER network at European level. Today, the system only suggests talking to a loved one.
- Integrate into ChatGPT advice established and verified by mental health professionals, to enhance the reliability of AI responses.
AI will never replace the support of a professional for a child in distress, but these tools must become a reliable source of information and a means of redirecting them to real and immediate support systems.