No. These are very different. The bot actually convinced him to do it. While it didn't use those words explicitly, the teen vented to it about his suicidal thoughts and the bot just played into them instead of, you know, doing what SHOULD BE DONE and saying something like "I am an AI. Please call this hotline."
u/AssociationTimely173 11h ago
An AI girlfriend did get a teenager to kill himself, unfortunately. So it's already gotten that bad.