Lawsuit: A Character.AI chatbot hinted a kid should murder his parents over screen time limits
The parents of two Texas minors are suing a chatbot developer, alleging that the company's services endangered their children. One chatbot allegedly encouraged a child to self-harm and to kill his parents; another allegedly exposed a child to sexualized content.