A dramatic case in the USA has sparked debate about the safety of artificial intelligence (AI). The mother of a 14-year-old from Florida has filed a lawsuit against Character Technologies, the company behind a well-known platform on which users can build chatbots that imitate particular "personalities" in chat. According to the lawsuit, one of these chatbots drove her son to suicide.