• @Kusimulkku@lemm.ee
    30
    3 months ago

    Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

    I get how fucking creepy and downright sickening this all feels, but I’m genuinely surprised that it’s illegal or criminal when there are no actual children involved.

    It mentions sexual extortion, and that’s definitely something that should be illegal, same as spreading AI-generated explicit stuff about real people without their consent, whether it involves children or adults, but idk about the case mentioned here.

    • @HappySkullsplitter@lemmy.world
      18
      3 months ago

      It’s certainly creepy and disgusting

      It also seems like we’re half a step away from thought police regulating any thought or expression that those in power do not like

    • @Korhaka@sopuli.xyz
      5
      edit-2
      3 months ago

      It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don’t count.

      • @Kusimulkku@lemm.ee
        13
        3 months ago

        It sounds like a very iffy thing to police. Since drawn characters don’t have an actual age, how do you determine it? Looks? That wouldn’t be great.

        • @JuxtaposedJaguar@lemmy.ml
          11
          3 months ago

          Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.

        • @jacksilver@lemmy.world
          3
          3 months ago

          I mean, that’s the same problem with AI-generated content. It’s all trained on a wide range of real people, so how do you know that what’s generated isn’t depicting an underage person? That’s why laws like this are really dangerous.

          • @sugar_in_your_tea@sh.itjust.works
            5
            3 months ago

            Exactly. Any time there’s subjectivity, it’s ripe for abuse.

            The law should punish:

            • creating images of actual underage people
            • creating images of actual non-consenting people of legal age
            • knowingly distributing one of the above

            Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn’t have any clearly identifiable victim.

            Don’t make laws to make prosecution easier; make laws that protect actual people from becoming victims, or at least punish those who victimize others.

  • @BrianTheeBiscuiteer@lemmy.world
    26
    3 months ago

    On one hand, I don’t think this kind of thing can be consequence-free (from a practical standpoint). On the other hand… how old were the subjects? You can’t look at a person to determine their age, and someone who looks like a child but is actually an adult wouldn’t be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.

    This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

    • ֆᎮ⊰◜◟⋎◞◝⊱ֆᎮ
      40
      3 months ago

      I think it’s pretty stupid. Borders on Thought Crime kind of stuff.

      I’d rather see that kind of enforcement and effort go towards actually finding people who are harming children.

      • @Inucune@lemmy.world
        11
        3 months ago

        This is also my take: anyone can set up an image generator and churn out whatever content they want. The focus should be on actual people being trafficked and abused.

    • @sugar_in_your_tea@sh.itjust.works
      7
      3 months ago

      I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

      So could I, but that doesn’t make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.

      Creating a work about a fictitious individual shouldn’t be illegal, regardless of how distasteful the work is.

    • @General_Effort@lemmy.worldOP
      -28
      3 months ago

      It’s not a gray area at all. There’s an EU directive on the matter. If an image appears to depict someone under the age of 18 then it’s child porn. It doesn’t matter if any minor was exploited. That’s simply not what these laws are about.

      Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It’s not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

      17-year-olds who exchange nude selfies are, legally, engaging in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help because their selfies were being passed around at school: the minors had sent the images in question to the teacher, and that’s possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

      Anyway, what I’m saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

      • @FauxLiving@lemmy.world
        32
        3 months ago

        There’s not an epidemic of child porn.

        There’s an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to “fight child porn”.

        So you’re going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.

        “You can’t possibly oppose these privacy destroying laws, after all you’re not on the side of child porn are you?”

          • @FauxLiving@lemmy.world
            5
            3 months ago

            It’s all part of ‘manufacturing consent’.

            There’s plenty of material about it out in academia (as always, check your sources) if you want to get into the weeds.

      • @BrianTheeBiscuiteer@lemmy.world
        9
        3 months ago

        It’s not a gray area at all. There’s an EU directive on the matter. If an image appears to depict someone under the age of 18 then it’s child porn.

        So a person who is 18 years old, depicted in the nude, is still a child pornographer if they don’t look their age? This gives judges and prosecutors too much leeway, and I’d guarantee there are right-wing judges who would charge a 25-year-old because it could be believed they were 17.

        In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

        Is it though? I don’t know about the penalties in Germany, but in the US a 17-year-old who takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I’m not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.

        • @barsoap@lemm.ee
          1
          2 months ago

          In Germany, if 14-18-year-olds take nude selfies, nothing happens; if they share them with their intimate partner(s), likewise nothing; if someone distributes (that’s the key word) the pictures on the schoolyard, then the law gets involved. For under-14s it technically works out similarly, except that criminal law won’t get involved because children under 14 can’t commit crimes; that’s all child protective services’ jurisdiction, and they will intervene as necessary. The general advice schools give to kids is “just don’t, it’s not worth the possible headache”. It’s a bullet point in biology (sex ed) and/or social studies (media competency); you’d have to dig into the state curricula.

          Not sure where that “majority of cases” thing comes from. It might very well be true, because when nudes leak on the schoolyard you suddenly have a whole school’s worth of suspects, many of whom (those who deleted the pictures) will not be followed up on, and another significant portion (those who didn’t send them on) might get to write an essay in exchange for terminating the proceedings. Yet another reason why you should never rely on police statistics. Ten people in an elevator, one farts, ten suspects.

          We do have a general criminal register, but it’s not public. Employers are generally not allowed to demand certificates of good conduct unless there’s a very good reason (say, for kindergarten teachers), and your neighbours definitely can’t.

      • @barsoap@lemm.ee
        4
        edit-2
        3 months ago

        That’s a directive, not a regulation, and the directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in the actually applicable law, which directives very much aren’t. Germany, for example, splits the whole thing into under-14 and 14-18.

        We certainly don’t arrest youth for sending each other nudes:

        (4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.

        …their own nudes, that is. Not those of classmates or whatnot.

  • @Xanza@lemm.ee
    18
    3 months ago

    I totally agree with these guys being arrested. I want to get that out of the way first.

    But what crime did they commit? They didn’t abuse children; the children are AI-generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it’s flat, but where’s the line here? If they draw pictures of non-existent children, is that also a crime?

    Does that open artists up to interpretation of the law when it comes to art? Can they be put in prison for doing a professional painting of a child? What if they did a painting of their own child in the bath or something? Sure, the content’s questionable, but it’s not exactly predatory. And if you add safeguards for these people, couldn’t predators just claim artistic expression?

    It just seems entirely unenforceable and an entire goddamn can of worms…

    • @sugar_in_your_tea@sh.itjust.works
      11
      3 months ago

      Exactly, which is why I’m against your first line: I don’t want them arrested, precisely because of artistic expression. I think they’re absolutely disgusting and should stop, but they’re not harming anyone, so they shouldn’t go to jail.

      In my opinion, you should only go to jail if there’s an actual victim. Who exactly is the victim here?

    • billwashere
      2
      2 months ago

      First off, I’ll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images, or possibly any AI-generated content. It makes me feel like those “online child protection” bills that seem like not-terrible ideas on the surface but turn out to be horrific dystopian ones when you think about them in detail.

  • @JuxtaposedJaguar@lemmy.ml
    7
    3 months ago

    Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because allowing it would make limiting actual CP impossible.

    As long as it’s clearly fictional, though, let people get off to whatever imaginary stuff they want. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting yet shouldn’t be illegal.

      • @sugar_in_your_tea@sh.itjust.works
        9
        3 months ago

        only way

        That’s just not true.

        That said, there’s a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability, because there’s a good chance they don’t understand how the models work, but the creators of a model should absolutely know where they’re getting their source data from.

        Prove that the models use illegal material and go after the model creators for that, because that’s an actual crime. Don’t go after people using the models who are providing alternatives to abusive material.

        • @DoPeopleLookHere@sh.itjust.works
          -3
          3 months ago

          I think all are unethical, and any service offering should be shut down yes.

          I never said prosecute the users.

          I said you can’t make it ethically, because at some point someone is using/creating original art, and the odds of human exploitation at some point in the chain are just too high.

          • @sugar_in_your_tea@sh.itjust.works
            5
            3 months ago

            the odds of human exploitation at some point in the chain are just too high

            We don’t punish people based on odds. At least in the US, the standard is that they’re guilty “beyond a reasonable doubt.” As in, there’s virtually no possibility that they didn’t commit the crime. If there’s a 90% chance someone is guilty, but a 10% chance they’re completely innocent, most would agree that there’s reasonable doubt, so they shouldn’t be convicted.

            If you can’t prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.

            Services should only be shut down if they’re doing something illegal. Prove that the images are generated using CSAM as source material, and then shut down any service that refuses to remove them, or that can be proven “beyond a reasonable doubt” to have known it was committing a crime. That’s how the law works: you only punish people you can prove “beyond a reasonable doubt” were committing a crime.

            • @DoPeopleLookHere@sh.itjust.works
              -3
              3 months ago

              How can it be made ethically?

              That’s my point.

              It can’t.

              Some human has to sit and make many, many, many models of genitals to produce an artificial one.

              And that, IMO, is not ethically possible.

              • @sugar_in_your_tea@sh.itjust.works
                7
                3 months ago

                How can it be made ethically?

                Let’s say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.

                I don’t know, I’m not an expert. But just because I don’t know of something doesn’t mean it doesn’t exist; it means I need to consult experts.

                It can’t.

                Then prove it. That’s how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.

                My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It’s not CSAM if it’s generated some other way, such as by hand-drawing (e.g. hentai) or by a model that doesn’t use CSAM in its training data.

                • @DoPeopleLookHere@sh.itjust.works
                  -3
                  3 months ago

                  You can’t prove a negative. That’s not how proving things works.

                  You also assume legal images exist. But that puts limits on what’s actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?

                  You assume it can; prove that it can.

      • @ifItWasUpToMe@lemmy.ca
        9
        3 months ago

        I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

  • @badbytes@lemmy.world
    7
    3 months ago

    If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.

    • @General_Effort@lemmy.worldOP
      2
      3 months ago

      There have been controversies about that sort of thing.

      The Oscar-winning movie The Tin Drum is the example I know. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature. It takes place around WW2. The protagonist has the mind of an adult in the body of a child. I guess the idea is that he is the other way around from most people?

      The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship

      With European societies shifting right, I doubt such a movie could be made today, but we aren’t at a point where it would be outright illegal.