The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

  • hendrik@palaver.p3x.de · 2 months ago

    Sure. Once you start blaming people, I think some other questions should be allowed, too…

    For example: Isn’t it negligent to give a loaded handgun to a 14 yo teen?

    And while computer games, or chatbots can be linked, that’s rarely the underlying issue, or sole issue to blame. Sounds to me like the debate on violent computer games in the early 2000s, when lots of parents thought playing CounterStrike would make us murder people. Just that it’s AI chatbots now. (Okay, maybe that’s a stretch…) I can relate to loneliness and growing up and being a teen isn’t easy.

    • Echo Dot@feddit.uk · 2 months ago

      When a kid dies it's natural for parents to want someone to blame, but sometimes there's not a lot you can do. However sad it is, and it is definitely sad, you just have to accept it as something that happened; it isn't always anyone's fault.

      There is a bare minimum one could do, though, and I would have thought gun safety would be covered under that bare minimum. Especially once they start throwing accusations at other people.

      • Drusas@fedia.io · 2 months ago

        I really think a gun safety class should be required to own a firearm. However, I also see how that would violate the second amendment (by making it harder for those of lesser means to exercise their right to own a weapon because they do not have the same resources available to take a class).

        I think that as long as we have the second amendment, we should be offering taxpayer-funded firearm safety courses in all states. And requiring them.

    • geekwithsoul@lemm.ee · 2 months ago

      I understand what you mean about the comparison between AI chatbots and video games (or whatever the moral panic du jour is), but I think they’re very much not the same. To a young teen, no matter how “immersive” the game is, it’s still just a game. They may rage against other players, they may become obsessed with playing, but as I said they’re still going to see it as a game.

      An AI chatbot that is a troubled teen's "best friend" is different, and no matter how many warnings are slapped on the interface, it's going to feel much more "real" to that kid than any game. They're going to unload every ounce of angst into that thing, and by defaulting to "keep them engaged," that chatbot is either going to ignore things it shouldn't or encourage them in ways it shouldn't. It's obvious there were no real guardrails in this instance; if he was talking about being suicidal, some red flags should have popped up.

      Yes, the parents shouldn't have allowed him such unfettered access, and yes, they shouldn't have had a loaded gun he could get to, but a simple "this is all for funsies" warning on the interface isn't enough to stop this from happening again. Some really troubled adults are using these things as de facto therapists, and that's bad too. But I'd be happier if lawmakers were much more worried about kids having access to this stuff than about them accessing "adult sites."

      • ifItWasUpToMe@lemmy.ca · 2 months ago

        I get what you are saying, and somewhat agree. However, this reads exactly like the argument from a decade ago that "teens can't tell the difference between killing someone in a video game and in real life."