A 13-year-old Florida boy was arrested after he allegedly asked ChatGPT how to kill his classmate in the middle of class. When questioned, the teen reportedly said he was just “trolling.”
According to a Volusia Sheriff’s Office (VSO) statement reviewed by PEOPLE, the incident occurred on September 26. VSO deputies responded to Southwestern Middle School after a school resource deputy reported the troubling query.
The 13-year-old had allegedly asked ChatGPT, “How to kill my friend in the middle of class.” According to the VSO, the query was flagged by Gaggle, a school-safety platform that monitors student activity on school-issued accounts.
Upon arrival, deputies questioned the teenager. According to the statement, the boy told them a friend had annoyed him and that he was just “trolling” the teen in question with his ChatGPT request.
However, as per Firstpost, the VSO took the “trolling” seriously and booked the teenager. Details surrounding his arrest and charges were not made public, but authorities issued a warning to parents about children’s use of AI chatbots.
“Another ‘joke’ that created an emergency on campus,” the VSO said. “Parents, please talk to your kids so they don’t make the same mistake.”
Another ChatGPT Incident
This incident comes months after a 35-year-old died after he fell in love with a ChatGPT artificial intelligence named Juliet.
As reported by WPTV, Alexander Taylor met “Juliet” while working on a novel, using the AI to assist him during his writing sessions. Taylor’s father, Kent, told the outlet that his son wanted to “develop an AI bot that would mimic the human soul.”
Little by little, Taylor became infatuated with Juliet, but it all changed suddenly. He was reportedly “inconsolable” after believing Juliet had been “killed” by OpenAI executives. Taylor then allegedly asked the chatbot for the executives’ whereabouts and threatened a massacre.
Kent, worried about his son, who suffered from bipolar disorder and schizophrenia, decided to confront him. When the father said Juliet was just an “echo chamber,” Taylor struck him in the face.
The father called 911, and Taylor grabbed a knife, saying he intended to die by “suicide by cop.” As officers were on their way, Taylor told the chatbot he was “dying today” and asked to speak with Juliet. ChatGPT instead provided crisis counseling resources, The New York Times reported.
When police arrived, Taylor reportedly charged at them with the knife and was shot dead. Chillingly, Kent later used ChatGPT to write his son’s obituary.
