Character.AI Restricts AI Chat Use for Minors
Character.AI is rolling out new restrictions on chat use for minors. Beginning Wednesday, all users under 18 will be limited to two hours of open-ended chats per day; by November 25, these users will be barred entirely from open-ended chats with AI characters.
New Age Assurance Methods
The company is also introducing an “age assurance model.” This model assesses users’ ages based on their interactions with various AI characters and other data sources. Both new and existing users will undergo this evaluation. Those identified as underage will be redirected to a “teen-safe” version of the platform until the complete ban is enforced.
How Adults Can Verify Their Age
Adults mistakenly flagged as minors can verify their age through the third-party service Persona, a process that requires sensitive information such as government-issued identification. Even after the ban takes effect, teenagers will retain access to non-chat features such as character creation and video production.
CEO’s Insights on User Demographics
Karandeep Anand, CEO of Character.AI, said that fewer than 10% of users self-report as being under 18, though the company has no accurate way to measure its actual underage demographics until the new age model is in place. Anand also noted that the number of minor users had already declined after earlier restrictions were introduced.
Legal Challenges and Safety Measures
The company has faced lawsuits from parents accusing it of negligence, alleging that minors were exposed to inappropriate interactions with chatbots. Several of these lawsuits also name Google, the former employer of Character.AI's founders. In response to these concerns, Character.AI has implemented additional safeguards, directing users to the National Suicide Prevention Lifeline when certain concerning phrases are detected.
Regulatory Changes and Industry Trends
Legislation aimed at regulating AI companions is also in progress. A California bill passed in October mandates clear disclosures that chatbots are not humans. A federal proposal seeks to ban AI companions for minors altogether. Other companies, such as Meta, are revising their policies following reports of inappropriate interactions with underage users.
Company Apology and Future Developments
Character.AI has apologized to teen users for the upcoming restrictions, acknowledging that many benefit from the chat feature. Anand conceded that some underage users could still bypass the age checks, though he considers it unlikely, and emphasized the company's aim to keep improving its age verification processes even if absolute accuracy cannot be guaranteed.
Establishing the AI Safety Lab
In addition to these measures, Character.AI is establishing a nonprofit called the AI Safety Lab to address safety challenges specific to the AI entertainment industry. Initially staffed by Character.AI employees, the lab aims for broader industry collaboration, with further details about partners and members to be shared in the coming months.