Nucleo’s investigation identified accounts with thousands of followers engaging in illegal behavior that Meta’s security systems failed to detect; after being contacted, the company acknowledged the problem and removed the accounts

  • General_Effort@lemmy.world · 13 hours ago

    When I saw this, 2 questions came to mind: How come this isn’t immediately reported? Why would anyone upload illegal material to a platform that tracks as thoroughly as Meta’s does?

    The answer is:

    All of those accounts followed the same visual pattern: blonde characters with voluptuous bodies and ample breasts, blue eyes, and childlike faces.

    The one question that came to mind upon reading this is: What?

    • WeirdGoesPro@lemmy.dbzer0.com · 11 hours ago

      I’m a little confused as to how it can still be AI CSAM if the bodies are voluptuous and the breasts are ample. Childlike faces have been the bread and butter of face filters for years.

      Which parts specifically have to be childlike for it to be AI CSAM? This is why we need some laws ASAP.

      • sik0fewl@lemmy.ca · 10 hours ago

        Things that you want to understand but sure as fuck ain’t gonna Google.

    • mic_check_one_two@lemmy.dbzer0.com · 11 hours ago

      My guess is that the algorithm is really good at predicting who is likely to follow that kind of content rather than report it. Basically, it flies under the radar purely because the only people who see it are the ones who have a vested interest in it flying under the radar.