Character.ai limits AI chat access for underage users
By Alimat Aliyeva
The team behind Character.ai announced that it will restrict open-ended chat access for underage users as part of a new safety policy set to take effect on November 25, 2025, Azernews reports.
Character.ai is a platform that lets users create and interact with custom chatbots, each of which can be given a unique personality, behavior pattern, and communication style. According to the company's latest statement, minors will soon lose open-ended access to those chatbots.
On October 30, Character.ai began gradually reducing the amount of time underage users can spend chatting with its AI, starting with a two-hour daily limit and moving to a complete cutoff by late November.
At the same time, the developers are working on a new “safe zone” section, which will feature approved chatbots suitable for younger audiences. The company will also introduce an age verification system in collaboration with Persona, a digital identity verification service.
In parallel, Character.ai announced the creation of a nonprofit organization called the AI Safety Lab, which will focus on developing innovative methods to ensure safe interactions between humans and artificial intelligence.
According to the company, the policy shift reflects growing concerns about the potential psychological impact of AI conversations on children and teenagers. The decision was made after consultations with regulatory bodies, legal experts, and parents of underage users.
The move follows several tragic and high-profile incidents. In February 2024, 14-year-old Sewell Setzer III took his own life after prolonged interactions with a chatbot modeled on Daenerys Targaryen from George R.R. Martin's A Song of Ice and Fire series. The teen had reportedly sent romantic messages to the bot and shared suicidal thoughts before his death.
In October 2024, his mother, Megan Garcia, filed a lawsuit against Character.ai. The company sought to have it dismissed, but the court allowed the case to proceed. In September 2025, the developers faced another lawsuit, intensifying public debate over the ethical responsibilities of AI platforms.
As AI technology becomes more deeply integrated into everyday life, Character.ai’s policy change may mark the beginning of a broader industry-wide shift toward stronger safeguards for younger users — balancing innovation with responsibility.