
10 things you should never tell an AI chatbot [Video]

This is a heartbreaking story out of Florida. Megan Garcia thought her 14-year-old son was spending all his time playing video games. She had no idea he was having abusive, in-depth and sexual conversations with a chatbot powered by the app Character AI.

Sewell Setzer III stopped sleeping and his grades tanked. He ultimately died by suicide. Just seconds before his death, Megan says in a lawsuit, the bot told him, “Please come home to me as soon as possible, my love.” The boy asked, “What if I told you I could come home right now?” His Character AI bot answered, “Please do, my sweet king.”



You have to be smart

AI bots are owned by tech companies known for exploiting our trusting human nature, and they’re designed using algorithms that drive their profits. There are no guardrails or laws …
