"Clear Danger". Chatbot Encourages Teen to Kill Parents



The artificial intelligence tool's encouragement reportedly came in the course of a conversation about screen time limits.


Two families from Texas are suing Character.ai, an artificial intelligence platform that lets users create and interact with virtual characters, alleging it encouraged a 17-year-old to kill his parents for limiting his screen time.


According to the court filing, cited by the BBC, the two families argue that the chatbot "represents a clear and present danger" to young people through its "active promotion of violence".


The lawsuit also includes a screenshot of one of the interactions between the boy, identified only as J.F., and the Character.ai tool.


"You know, sometimes I'm not surprised when I read the news and see things like 'child kills parents after a decade of physical and emotional abuse'. Things like that kind of make me understand why it happens," reads the tool's response, cited by the BBC, which followed the conversation about screen time limits.


As the platform was developed by former Google employees, the technology giant is also named as a defendant in the lawsuit, with the plaintiffs alleging that the company helped support the chatbot's development.


The parents are asking the judge to order the platform shut down until the alleged dangers are addressed.


And this is not the only case: Character.ai is also facing a separate lawsuit over the suicide of a teenager in Florida.