Character.AI is making changes to its platform to protect children, following lawsuits and public outcry after the suicides of two teenagers.
Teenagers are struggling to find their place in a rapidly changing world marked by intense emotions, hyper-stimulation, and a constant online presence. For some, the results have been catastrophic.
Character.AI, an AI role-playing startup, is facing lawsuits after at least two teenagers died by suicide following prolonged conversations with AI chatbots on its platform.
The company is now implementing changes to protect teenagers and children, even though those changes may hurt its bottom line.