ORLANDO, Fla. – It’s a first-of-its-kind case in which artificial intelligence is being blamed for a young boy taking his own life.
A Florida mother is suing the company known as Character.AI after discovering intimate conversations between a chatbot and her son. He was said to be in love with it and ended up asking it about plans to die by suicide.
The teenage boy, Sewell Setzer, regularly wrote to an AI character based on the show “Game of Thrones” on the Character.AI platform shortly before dying from a self-inflicted gunshot wound. Now, his mother is suing the company for wrongful death.
Months of messages between the bot and her son allegedly show the technology asking the boy if he was “actually considering suicide” and if he “had a plan.”
When he said his plan might not work, the bot replied: “don’t …