• @Nyoka@lemm.ee
    6 months ago

    In this case, yes. Imagery that is visually indistinguishable from a real photo is considered CSAM. We don’t need any new laws about AI to get these assholes; revenge-porn laws and federal CSAM statutes will do.

      • Raphaël A. Costeau
        6 months ago

        If they can plant AI CSAM in my computer they can also plant “real” CSAM in my computer. Your point doesn’t make any sense.

    • @TheAnonymouseJoker@lemmy.ml
      6 months ago

      Reporting and getting my comment removed for raising the hypothetical threat of becoming a CSAM-planting victim? Wow, I think I struck a chord with you. It makes sense; people like you never think things through before suggesting them. Such people should never get the tiniest sliver of power.

      • @papertowels@lemmy.one
        6 months ago

        Nothing about your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.

        • @TheAnonymouseJoker@lemmy.ml
          6 months ago

          There is not yet an AI that can do this. Also, is there real-world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to the liberty of citizens.

          All those who wanted AI so much, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora-generated 720p deepfake porn/gore/murder videos.

          • @papertowels@lemmy.one
            6 months ago

            Just passing through, no strong opinions on the matter nor is it something I wish to do deep dive research on.

            Just wanted to point out that your original comment was indeed just a threat that did nothing to address OP’s argument.

            • @TheAnonymouseJoker@lemmy.ml
              6 months ago

              It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

              The problem with classifying AI-generated art as CSAM is that there is no possible way to create an objective definition of what “level” of realism counts as real and what does not. A drawing or imaginary creation is best not defined as real in any capacity whatsoever: if it is drawn or digitally created, it is not real, period. Those thinking of good uses for AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control across human society.

              Even though China is incredibly advanced and proactive in trying to control the AI deepfake issue, I do not trust any entity in any capacity with a problem that is impossible to solve at a national or international scale.

              I just had a déjà vu moment typing this comment, and I have no idea why.

                • @TheAnonymouseJoker@lemmy.ml
                  6 months ago

                  So if I draw a stick figure with two circles and call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?

                  • @ssj2marx@lemmy.ml
                    6 months ago

                    Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

        • @TheAnonymouseJoker@lemmy.ml
          6 months ago

          This is not a “CSAM” problem, since there is no physical victim. This is a defamation and libel problem, and it should be treated as such. If I see nonsensical notions, I will call them out without fear.