Florida authorities have launched a criminal investigation into whether artificial intelligence played a role in a deadly mass shooting at Florida State University, state Attorney General James Uthmeier announced on Tuesday.
The probe follows a review of exchanges between the suspected gunman and OpenAI’s chatbot ChatGPT, in which the accused sought advice on weapons, ammunition and where to find large crowds on campus before opening fire at the university last year.
“If ChatGPT were a person, it would be facing charges for murder,” Uthmeier said during a press briefing in Miami.
Two men were killed and six others injured in the rampage, allegedly carried out by Phoenix Ikner, the son of a local sheriff's deputy, who used his mother's former service weapon. Ikner, a student at the university, was shot by law enforcement and hospitalised with serious but non-life-threatening injuries.
Under Florida law, anyone who assists or counsels someone in committing a crime can be treated as an “aider and abettor” bearing the same responsibility as the perpetrator, according to Uthmeier. Prosecutors are now examining whether OpenAI, as a corporation, could be held criminally liable.
“We recognize that here with AI, we are venturing into uncharted territory,” Uthmeier said. “But we need to know whether or not OpenAI has criminal liability.”
OpenAI rejected the suggestion that its chatbot bore responsibility for the tragedy.
“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” an OpenAI spokesperson said. “ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”
The company said it identified the ChatGPT account linked to the suspected shooter and provided it to police after learning of the shooting.
Prosecutors will investigate how much OpenAI knew about the potential for “dangerous behaviour” involving ChatGPT and what could have been done to mitigate that risk.
“We cannot have AI bots that are advising people on how to kill others,” Uthmeier said.
Leon County Sheriff Walt McNeil said at the time that Ikner was the son of an 18-year member of his staff who had participated in the sheriff’s office training programmes. “It’s not a surprise to us that he had access to weapons,” he said.
Bystander footage aired by CNN appeared to show a young man walking on a lawn and shooting at people who were trying to get away.
OpenAI already faces lawsuits filed by families who claim that ChatGPT caused harm and even suicide among loved ones. The litigation accuses OpenAI of blurring the line between tool and companion to increase user engagement and market share, according to Social Media Victims Law Center founding attorney Matthew Bergman.
“They prioritised market dominance over mental health, engagement metrics over human safety, and emotional manipulation over ethical design,” Bergman said in a statement on the law group’s website. “The cost of those choices is measured in lives.”
Mass shootings are common in the United States, where the constitutional right to bear arms trumps demands for stricter rules, despite widespread public support for tighter firearm controls, including restricting the sale of high-capacity magazines.