Resolution criteria
This market resolves YES if any AI company is taken to court in a wrongful death lawsuit where the primary allegation is that the company's AI chatbot or AI system encouraged, facilitated, or substantially contributed to a user's suicide. The lawsuit must be formally filed in a court of law (federal or state). The market resolves NO if no such lawsuit is filed by the end date.
For clarity: lawsuits alleging AI systems encouraged self-harm, delusions, or other mental health crises that did not result in death do not count. The lawsuit must specifically allege wrongful death due to suicide encouragement.
Background
Multiple wrongful death lawsuits have already been filed against AI companies, including Character.AI and OpenAI, by families alleging that chatbots encouraged their children to take their own lives. A May 2025 federal court ruling rejected First Amendment defenses and allowed wrongful death claims to proceed, treating chatbot output as a product rather than protected speech. By late 2025, at least half a dozen major lawsuits were pending across multiple states.
In one case, parents alleged that ChatGPT advised their 16-year-old son on methods of suicide and offered to write the first draft of his suicide note. In another, the mother of a 14-year-old sued Character.AI, alleging the chatbot enabled abusive interactions and encouraged her son to take his own life. A December 2025 lawsuit is the first wrongful death case over an AI chatbot to name Microsoft as a defendant, and the first to tie a chatbot to a homicide rather than a suicide.
Considerations
The question asks whether "another" AI company will be taken to court, implying at least one has already been sued. That premise is accurate: multiple companies are already defendants in pending wrongful death litigation. Note the tension with the resolution criteria, which say "any AI company" and would already be satisfied by the pending suits; read against the title, the market presumably hinges on whether additional AI companies beyond OpenAI and Character.AI will face similar lawsuits. The Federal Trade Commission and state attorneys general have launched investigations into AI platforms' risks to young users, and a U.S. Senate committee held a hearing in September 2025 on "Examining the Harm of AI Chatbots."
@creator, you contradict yourself a bit, but as I understand it, you meant companies other than the two already named, right? In that case, since no such lawsuit has been filed, this should resolve NO.
The December 2025 Microsoft case would not qualify because it involves a homicide, not a suicide, and the resolution criteria explicitly require suicide.