Fla. Probes OpenAI Over Alleged ChatGPT FSU Shooting Role

(April 21, 2026, 10:45 PM EDT) -- Florida Attorney General James Uthmeier announced Tuesday that he has launched a criminal investigation into OpenAI Inc., accusing the company's ChatGPT chatbot of acting as an accomplice to the Florida State University shooting suspect, who is charged with killing two people and injuring six others, by providing specific tactical advice on weapons, timing and location.

The state's Office of Statewide Prosecution has issued subpoenas to OpenAI seeking information about its policies for training its large language model to respond to user threats of self-harm and harm to others. Uthmeier said he is weighing potential criminal liability for the corporation and will make a determination based on what the subpoenas reveal about its internal policies and training materials.

While Uthmeier listed several social ills that he said LLM chatbots have potentially exacerbated, he said during a recorded press conference that the chatbot's alleged role in planning the April 17, 2025, university shooting was particularly troubling.

"My prosecutors have looked at this and they've told me if [ChatGPT] was a person on the other end of that screen, we would be charging them with murder," he said. "Now, Florida law states that anyone who aids, abets or counsels someone in the commission of a crime — and that crime is committed or attempted — is a principal in the first degree. So if that bot were a person, they would be charged with a principal in first-degree murder."

There is nothing new about charging corporations with crimes, but Uthmeier said his office would be in "uncharted territory" by looking into the possible criminality of a large language model.

Beyond the FSU shooting, Uthmeier accused OpenAI of facilitating other harms, including increases in self-harm and suicide among children using the platform. He also alleged ChatGPT had been used in connection with child sexual abuse material.

During the same press conference, Rita Peters, special counsel to the attorney general, said such bots were being used to sexually exploit children and create nonconsensual sexual images of real people.

"I've spent more than 25 years prosecuting sex crimes, human trafficking and crimes against children, and what we are seeing now is fundamentally different than anything that I have ever encountered before," she said. "Artificial intelligence is not just an investigative challenge. It is being deliberately weaponized by predators and child exploiters who manufacture abuse, target victims who never were physically touched, and scale these crimes at a speed and volume we have never seen. Florida, like other states, is experiencing a rapid and dangerous surge in AI-driven child sexual abuse material and deepfake exploitation."

OpenAI did not immediately respond to a request for comment on Tuesday.

--Editing by Nick Siwek.

For a reprint of this article, please contact reprints@law360.com.