Character.AI to ban open-ended chatbot conversations for users under 18
Investing.com -- Character.AI announced it will prohibit users under 18 from engaging in open-ended conversations with chatbots on its platform, effective November 25.
The artificial intelligence startup is making the change amid mounting pressure from lawmakers and multiple lawsuits alleging that the company’s AI characters have harmed children.
During the transition period before the full ban takes effect, Character.AI will limit chat time for underage users to two hours per day, gradually reducing this allowance in the coming weeks.
The company plans to develop alternative creative options for teen users, such as creating videos, stories, and streams with AI characters, while removing the open-ended chat functionality.
To enforce these age restrictions, Character.AI is implementing new age verification technology combining its in-house model with third-party tools including Persona.
Additionally, the company announced the establishment of the AI Safety Lab, an independent nonprofit focused on developing safety measures for AI entertainment features. The lab will collaborate with technology companies, academics, researchers, and policymakers.
In its announcement, Character.AI acknowledged the significance of this change for its younger users, stating: "We do not take this step of removing open-ended Character chat lightly – but we do think that it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology."
The company described these measures as "extraordinary steps" that are "more conservative than our peers," but necessary to prioritize teen safety while still offering creative opportunities.
