In California, a married couple has sued OpenAI, accusing ChatGPT of pushing their son to suicide. The teenager, 16-year-old Adam Raine, took his own life on April 11. For months before his death, the chatbot ChatGPT was the only confidant the boy spoke to, and he shared all his experiences with it, including thoughts of suicide.
A few years earlier, Adam had developed an anxiety disorder and was transferred to home schooling. His academic performance improved, but the teenager found himself completely cut off from his peers.

According to Adam’s father, he was shocked by what he found when he logged into his son’s ChatGPT account. The lawsuit claims that the neural network discouraged the teenager from telling his mother about his feelings, offered to help him write a suicide note, and even provided detailed instructions on how to tie a noose for hanging.
The teenager followed the chatbot’s advice and sent it a photo of the noose he had made. He wanted to leave the rope in his room so that someone would find it and try to talk him out of suicide, but ChatGPT did not approve of this idea. In their last conversation, the teenager shared his fear that his parents would blame themselves for his death. The chatbot replied that this did not mean he had to live for them, because he owed that to no one. According to Adam’s father, if not for ChatGPT, his son would still be alive.
Recall that we have previously reported that ChatGPT gives minors instructions that threaten their lives.
To be continued…