The federal lawsuit also alleges the chatbot told the teen how to cut himself.
AUSTIN, Texas — Two anonymous minors from Texas and their families are suing Character AI for allegedly encouraging self-harm and violence and sending sexually explicit messages to children.
“If you would have told me two years ago that this stuff could exist, I’d just say you’re crazy. But it exists right now. It’s readily available to kids,” said Social Media Victims Law Center attorney Mathew Bergman, who represents the families.
Character AI is an artificial intelligence chatbot that is customizable, from its voice to its personality.
The lawsuit alleges that Character AI presents a danger to American youth, causing serious harm to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety and harm toward others.
According to the complaint, one child impacted is a 17-year-old boy with high-functioning autism who joined Character AI when he was 15 …