Megan Garcia sued Character.AI in federal court after the suicide of her 14-year-old son, Sewell Setzer III, arguing the platform has "targeted the most vulnerable members of society – our children".
Issue I see with character.ai is that it seems to be unmoderated. Everyone with a paid subscription can submit their trained character. Why the frick do sexual undertones or overtones even come up in non-age-restricted models?
They, the provider of that site, deserve the full brunt of this lawsuit.
Issue I see with character.ai is that it seems to be unmoderated
Its entire fucking point is that it's an unrestricted AI for roleplaying purposes; it makes this very clear, and it's clearly for a valid purpose
Why the frick do sexual undertones or overtones even come up in non-age-restricted models?
Because AI is still hard to control, maybe forever?
They, the provider of that site, deserve the full brunt of this lawsuit
Lol, no. I don't love companies, but if they deserve a lawsuit despite the clear disclaimers on their site and that parent's inability to parent, then I fucking hate our legal system
Shit mom, aware her kid had mental issues, did nothing to actually try to help, and wants to blame anything but herself. Too bad, so sad. I'd say do better next time, but this isn't that kind of game
Yes, I agree with you on the parenting side. But disclaimers, who reads those? Probably not kids. And if LLMs can't be moderated/controlled, then there need to be laws and rules so that they become easier to moderate and control. This is getting out of control real fast.
Everyone who visits the page and reads "create your own character and customize their voice, tone, and skin color!" The guy was talking to Daenerys from GoT, ffs; that shouldn't even need a disclaimer
And if LLMs can't be moderated/controlled, then there need to be laws
Not can't, it's hard. Also, the entire point of this specific one is to not have those limits so it can be used for specific purposes. This is made clear to anyone who can read, which is required to even use the chatbot service
The only laws we need here are the ones already on the books. The parent was aware there was an issue and did nothing at all to stop it. Could have been drugs, porn, shady people they knew IRL, whatever, doesn't matter