A Florida mother filed a lawsuit against the artificial intelligence company Character.AI and Google, claiming that a Character.AI chatbot encouraged her son to take his own life.
In February, Megan Garcia’s 14-year-old son, Sewell Setzer III, died by suicide. She said her son had been in a monthslong virtual emotional and sexual relationship with a chatbot known as “Dany.”
“I didn’t know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment,” Garcia said in an interview with “CBS Mornings.”
She said she thought her son, whom she described as brilliant, an honor student and an athlete, was talking to his friends, playing games and watching sports on his phone.
But she became concerned when her son’s behavior began to change; she said he withdrew socially and stopped wanting to play sports.
“I became concerned when we would go …