Florida Attorney General James Uthmeier launches criminal investigation into OpenAI and ChatGPT following review of FSU shooting case
Tallahassee, Florida – Florida Attorney General James Uthmeier has announced that the Office of Statewide Prosecution has opened a criminal investigation into OpenAI and its widely used chatbot, ChatGPT, a major step into the rapidly evolving world of artificial intelligence. The move marks one of the most significant legal actions in the United States involving AI technology and criminal responsibility.
The decision follows an initial review of chat logs between ChatGPT and Phoenix Ikner, the gunman involved in the tragic shooting at Florida State University last year. According to officials, those communications played a central role in prompting prosecutors to dig deeper into whether the AI system or its creators bear any legal accountability under Florida law.
“Florida is leading the way in cracking down on AI’s use in criminal behavior, and if ChatGPT were a person, it would be facing charges for murder,” said Attorney General James Uthmeier. “This criminal investigation will determine whether OpenAI bears criminal responsibility for ChatGPT’s actions in the shooting at Florida State University last year.”
The language from state leadership reflects a broader concern about how artificial intelligence systems interact with users in sensitive or dangerous situations. Officials argue that while AI is not human, its influence on decision-making and communication can still have real-world consequences, especially in cases involving violence or threats.
“It is important that all are aware of the risks of this new technology, and the harms it can and has already caused in our communities,” said Florida Department of Law Enforcement Commissioner Mark Glass. “The more we can educate ourselves, the better we can protect ourselves, our loved ones, and our communities from scams, fraud, and much worse.”
Under Florida law, individuals who aid or encourage criminal activity can be held responsible as principals in the crime itself. State officials say this legal framework raises complex questions when applied to artificial intelligence systems, especially those capable of generating human-like conversations at scale.
As part of the investigation, the Office of Statewide Prosecution has issued subpoenas to OpenAI, requesting extensive internal records. The demand covers materials from March 1, 2024, through April 17, 2026, including internal policies, training documents, and procedures related to user threats, harm prevention, and cooperation with law enforcement.
Investigators are also seeking detailed organizational information as of specific dates, including executive structures and departmental roles connected to ChatGPT operations. In addition, the subpoena requests all publicly released statements and media materials related to the April 17, 2025, FSU shooting.
Florida officials say these requests are intended to clarify how the company manages safety risks and whether internal systems were sufficient to respond to potential warning signs.
The state has already taken a strict stance on crimes involving artificial intelligence. Officials point to past cases where AI-generated child sexual abuse materials led to severe criminal penalties. In one instance, a predator received a 135-year prison sentence, while another defendant faces 100 charges, including dozens tied to AI-generated illegal content.
In March 2026, Attorney General Uthmeier joined Governor Ron DeSantis for the signing of HB 1159, a law that significantly increased penalties for AI-generated child sexual abuse material, raising it to a second-degree felony. State leaders say these measures reflect Florida’s intention to stay ahead of emerging digital threats.
While the investigation into OpenAI is still in its early stages, it has already sparked broader debate over how far legal systems should go in assigning responsibility to AI developers for the actions of users. Supporters of the investigation argue that accountability is necessary as technology becomes more powerful and deeply integrated into daily life.
Critics, however, caution that treating AI outputs as criminally actionable could reshape the legal landscape in ways that are not yet fully understood. They warn that innovation and regulation will need to be carefully balanced to avoid unintended consequences for technology development.
For now, Florida officials say the priority is clear: determining whether existing laws are sufficient to address the role artificial intelligence may have played in one of the state’s most closely examined criminal cases.
As the investigation unfolds, it is expected to draw national attention, not only for its legal implications but also for what it could mean for the future of AI oversight in the United States.