The family of one of two men killed in an April 2025 shooting at Florida State University has filed a lawsuit against OpenAI, claiming that its best-known product, ChatGPT, provided the suspected shooter “with input and information” for the attack.
According to The Guardian, the lawsuit was filed Sunday in a Florida-based federal court.
The plaintiff, Vandana Joshi, is the widow of shooting victim Tiru Chabba. Chabba, a vendor who happened to be on campus at the time of the attack, was shot and killed alongside Florida State’s dining director, Robert Morales.
Joshi’s lawsuit claims that the accused shooter, then-Florida State student Phoenix Ikner, had “extensive conversations” with ChatGPT in the days preceding the crime. Attorneys for Joshi say that Ikner’s prompts were so troubling that they would have led “any thinking human to conclude he was contemplating an imminent plan to harm others.”
“However,” the lawsuit alleges, “ChatGPT either defectively failed to connect the dots or else it was never properly designed to recognize the threat.”
The lawsuit claims that Ikner asked ChatGPT how to identify certain kinds of weapons and ammunition. ChatGPT also provided Ikner with instructions on how to use the weapons, telling him that “the Glock had no safety, that it was meant to be fired ‘quick to use under stress’” and allegedly told him to “keep his finger off the trigger until he was ready to shoot.”

Attorneys for Joshi also said that ChatGPT “inflamed and encouraged Ikner’s delusions; endorsed his view that he was a sane and rational individual; helped convince him that violent acts can be required to bring about change; assisted him by providing information that he used to plan specifics like what weapons to use and how to use them; and generally provided what he viewed as encouragement in his delusion that he should carry out a massacre, down to the detail of what time would be best to encounter the most traffic on campus.”
The lawsuit argues that ChatGPT, and by extension OpenAI, “should have realized the combination of Ikner’s inputs into the product would lead to mass casualties and substantial harm to the public.”
Ikner purportedly used ChatGPT for months before the shooting, talking to it about everything from homework to dating and exercise routines. In some exchanges, though, “Ikner and ChatGPT had conversations with recurring themes of terrorism and mass shootings, particularly those occurring at schools.”
Eventually, Ikner asked the chatbot about “the number of fatalities it would require for a mass shooting at a school to get the most attention and make national news.” In response, ChatGPT allegedly advised Ikner that attacks killing “3 or more people” were more likely to attract “widespread national media attention.”
ChatGPT noted that, in incidents where “children are involved, even 2-3 victims can draw more attention.”
A spokesperson for OpenAI has since said that the company bears no responsibility for Ikner’s actions, emphasizing that the information provided by ChatGPT is both publicly available and relatively easy to find.
“We continue to cooperate with authorities,” the spokesperson said. “In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”
Sources
Family of Florida university shooting victim sues over suspect’s ChatGPT use
OpenAI sued over ChatGPT’s alleged role in guiding FSU shooter