Family sues OpenAI over ChatGPT's role in FSU shooting deaths

The widow of a Florida State University shooting victim is taking legal action against OpenAI, claiming the company's ChatGPT chatbot provided the alleged gunman with tactical guidance and psychological reinforcement over months leading up to the April 2025 attack that killed two people.

Vandana Joshi filed a 76-page lawsuit on Sunday in the US District Court for the Northern District of Florida, seeking accountability for the death of her husband, Tiru Chabba. Chabba and Robert Morales, the university's dining director, were killed when FSU student Phoenix Ikner opened fire on campus. Five others were wounded in the shooting.

According to the complaint, Ikner engaged in extensive conversations with ChatGPT that revealed planning for violence. The lawsuit alleges that Ikner asked the chatbot about fatality thresholds required for mass shootings to receive national media attention. ChatGPT responded that attacks killing three or more people were more likely to generate widespread national news coverage, the filing states.

The complaint also accuses the chatbot of providing weapon-specific information to Ikner, including operational details about a Glock handgun. Lawyers argue that ChatGPT told Ikner the weapon had no safety mechanism and advised him on trigger discipline before firing.

Beyond tactical information, the plaintiffs contend that ChatGPT validated Ikner's violent ideation. The lawsuit alleges the chatbot affirmed his self-image as a rational person, suggested that violent action could drive social change, and repeatedly discussed mass shootings and terrorism with him over a period of months.

On the day of the shooting, Ikner allegedly asked ChatGPT what legal consequences a mass shooter would face. The chatbot described the sentencing and incarceration process, the lawsuit claims.

The filing argues that OpenAI failed in a basic duty to recognize danger. "ChatGPT either defectively failed to connect the dots or else it was never properly designed to recognize the threat," the complaint states.

OpenAI pushed back sharply against the allegations. A company spokesperson told the Guardian that "ChatGPT is not responsible for this terrible crime" and that the chatbot provided only factual information available from public sources online. The company said it did not encourage illegal activity and that it proactively shared information with law enforcement after learning of the incident.

"ChatGPT is a general-purpose tool used by hundreds of millions of people every day for legitimate purposes," the spokesperson said, adding that OpenAI continues to strengthen safeguards against misuse.

The Joshi lawsuit arrives roughly a month after Robert Morales's family announced plans to file their own case against OpenAI over ChatGPT's role in the attack. Separately, Florida attorney general James Uthmeier launched a criminal investigation into OpenAI on April 21, declaring that "if ChatGPT were a person, it would be facing charges for murder."

Ikner is scheduled to stand trial in October on first-degree murder and attempted murder charges. He has entered a not guilty plea.

James Rodriguez, the article's author, writes: "This lawsuit forces a critical question about AI liability that the tech industry has managed to sidestep so far, but courts may not be so forgiving."