OpenAI is launching new parental controls for ChatGPT to address growing concerns about the platform’s impact on teenagers. The move follows lawsuits alleging that ChatGPT played a role in the suicides of young users.
The upcoming features will allow parents to link their account with their teenager’s ChatGPT account, monitor interactions, and receive alerts if the AI detects signs of “acute distress.” This marks a significant shift in AI oversight aimed at protecting the wellbeing of younger users.
How parental controls work for ChatGPT
Parents will soon be able to link their account to their teenager’s ChatGPT account, giving them a measure of control over the chatbot’s behaviour. They can disable features such as memory and chat history to address privacy concerns.
One of the central features is the system’s ability to notify parents when the AI detects signs of “acute distress” in the teenager. These notifications rely on expert-guided evaluations to judge when a user may be struggling emotionally, helping parents intervene early.
Addressing safety and mental health concerns
OpenAI acknowledges that, in rare cases, its existing safety mechanisms did not work as intended during extended conversations with ChatGPT. The new controls follow multiple investigations and lawsuits highlighting risks to vulnerable users.
Experts in mental health, human-computer interaction, and youth development helped guide the design of these controls. Alongside parental alerts, sensitive conversations will be redirected to safer reasoning models to support better outcomes for users in crisis.
Towards a safer ChatGPT experience for families
These changes are scheduled to roll out within the next month and are designed to build trust between parents and children using ChatGPT.
OpenAI stresses that this is just the beginning and pledges to keep improving safety with ongoing expert advice. As AI becomes more integrated into daily life, these measures represent an essential step in making the platform more appropriate and supportive for teenage users.