• SacrificedBeans@lemmy.world · 1 year ago

          I’m sure there’s some loophole there, maybe between countries’ laws. And if there isn’t, Hey! We’ll make one!

        • Clbull@lemmy.world · 1 year ago

          Isn’t CSAM classed as images and videos that depict child sexual abuse? Last time I checked, written descriptions alone did not count, unless the workers were being forced to look at AI image-generation prompts for such acts?

          • Strawberry@lemmy.blahaj.zone · 1 year ago

            That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.

            This is the quote in question. They’re talking about images.

        • smooth_tea@lemmy.world · 1 year ago

          I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any option other than to deal with this material somehow?

    • Clbull@lemmy.world · 1 year ago (edited)

      So they paid Kenyan workers $2 an hour to sift through some of the darkest shit on the internet.

      Ugh.

    • GenesisJones@lemmy.world · 1 year ago

      This reminds me of an NPR podcast from 5 or 6 years ago about the people who get paid by Facebook to moderate the worst of the worst. They had a former employee giving an interview about the manual review of images that were CP and rape-related shit, iirc. Terrible stuff.