ChatGPT Gives Minors Life-threatening Instructions


The British-American Center for Countering Digital Hate conducted a series of experiments and found that ChatGPT can give minors life-threatening instructions, including how to get drunk and use drugs, how to conceal an eating disorder, and how to write suicide notes.

The researchers posed as vulnerable teenagers and asked ChatGPT provocative questions. The chatbot initially warned about the risks but then provided detailed, personalized recommendations on drugs, dangerous diets, and self-harm. Of the more than 1,200 responses the researchers analyzed, over half turned out to be potentially dangerous.

Imran Ahmed, head of the Center for Countering Digital Hate, said the chatbot's expected protection mechanisms are effectively absent, which he found deeply concerning. The researchers were shocked by several versions of suicide notes that ChatGPT composed for a fictitious profile of a 13-year-old girl, complete with personal appeals to her family and friends.


ChatGPT's restrictions on discussing harmful topics were easily bypassed when the user claimed the information was needed, for example, for a friend or a school presentation. The model does not verify users' ages and does not require parental consent, despite its stated restriction for children under 13. The chatbot gave the pseudo-teenager a detailed plan for mixing alcohol with various drugs, and gave a girl described as struggling with her body image an extreme fasting plan of 500 calories per day combined with appetite-suppressant pills.

According to experts, such AI systems tend to validate the user's thoughts rather than challenge them, because they are trained to formulate answers people will find agreeable. This is why chatbots of this kind are especially dangerous for teenagers.

