Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Flying Squid

    I think the last two paragraphs in the body of this post are the real issue here, not that he was just using AI to create CSAM.

    • @Mango@lemmy.world

      Right? Feels like this is being tacked on as a shot at AI. Otherwise nobody is harmed except the guy. Pedos are ick, but if harmless then why punish? I don’t think anyone should have to take a fall because others think their desires are gross.

      • Jake Farm

        Because they are using images of real children.

        • Flying Squid

          I agree, but if there were some way to create CSAM without using real children (I’m not sure how you would train such an AI model), it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.

          Because my god, we need to figure out something.

          • @Zorque@lemmy.world

            I mean trying to help them get treatment instead of going all pod-people on anyone showing even the possibility of being attracted to kids would be helpful.

            • Flying Squid

              I’ve been saying that for ages. Obviously we don’t want to enable any pedophiles to do anything horrific to children, but we’re at a point right now where if you have those urges to begin with, you’re basically already told to accept that you’re an incurable monster. So why not act on the urges?

              Somehow we need to get through to such people that they need to get help before they do anything terrible. I’m not sure how to do that in the current climate though.

          • JohnEdwa

            The way AI models work, you don’t have to train them on the thing you want them to do; you can ask them to combine the things they already know about. Take any of the meme LoRAs for example, like pepe punch or patcha.

            So literally any model that can generate pictures of naked adults and clothed children - which is to say almost all of them - is going to be at least somewhat competent in creating CP unless those prompts are being actively censored and blocked.

            • @Danquebec@sh.itjust.works

              Wouldn’t that generate images of children with small-sized adult bodies?

              If it doesn’t know what a child’s body looks like, it can’t just figure it out.

              • JohnEdwa

                The datasets will have enough images of kids in bikinis and underwear from stock photos and clothes shop listings etc to figure that part out rather easily.

          • @otp@sh.itjust.works

            Train it to depict humans that look like anime characters that are definitely 18 or older immortal dragons that are taking on the bodies of young human beings

            Disclaimer

            I am not condoning, endorsing, or suggesting this

              • @Organichedgehog@lemmy.world

                “it would probably be worth seeing if that did anything to make pedophiles less likely to act out on their desires.”

                What’s the implication here? You’re saying we should look into placating child predators by creating AI CP for them to consume?

                • Flying Squid

                  That would be worth a scientific study, don’t you think? Isn’t it worth trying to find ways to stop child predators before they become predators?

                  You seem to think I’m suggesting that the UK government create childporn.gov.uk or something.

          • Jake Farm

            It’s a form of stalking, and it probably makes it more likely for them to rape that child. Even if they don’t wind up doing that, it would still qualify as a form of revenge porn.

                • @Mango@lemmy.world

                  Commissioning as in buying? I’m not sure how that changes it to stalking.

                  IMO, the worst part about it is that there’s someone else out there who thinks less of me because there’s some naked imagery of me.

                • @Mango@lemmy.world

                  I can buy photos of Robert Downey Junior from Marvel Studios and that’s not stalking.

      • @cygnus@lemmy.ca

        I think this was a crime because he modified images of actual kids. If the images were 100% AI (not of real people), I’m not sure on what basis that would be considered a crime, any more than a handmade drawing of a nude minor drawn from imagination.

        • @FourPacketsOfPeanuts@lemmy.world

          Any sexual representation of a child is illegal in the UK, whether it looks real or not. In fact, I believe it doesn’t even need to be a child; it’s illegal if a reasonable person would believe it was depicting a child. This came up when adults who were into age play got into trouble distributing their images because they looked convincingly underage.

          • Jake Farm

            Wait, so even if the subjects are adults in costume it’s illegal? Fuck man, school uniforms are a whole genre of porn.

            • @FourPacketsOfPeanuts@lemmy.world

              Relevant part of Coroners and Justice Act 2009 (UK)

              Section 65 (regarding what “child” means in the context of indecent images)

              (6) Where an image shows a person the image is to be treated as an image of a child if—

              (a) the impression conveyed by the image is that the person shown is a child, or

              (b) the predominant impression conveyed is that the person shown is a child despite the fact that some of the physical characteristics shown are not those of a child.

              (end quote)

              In other words, an image can be treated as an indecent image of a child if the “impression conveyed” is that the person is under 18, even if that person has older “physical characteristics”.

              This legislation is more directed at non-photographic imagery (so hentai / CGI etc), and the reference to physical characteristics is apparently a reference to large breasts or “1000 year old vampire teeth” not being viable as an excuse that the image doesn’t give the impression of a child.

              I can’t recall specifically what legislation was used regarding the age play couple I referenced. I can’t find a specific law that says it’s wrong for a photograph of an adult to appear underage. So it may just be that they were reported to police because they shared their images online without context. I don’t know if they were subsequently prosecuted.

          • @cygnus@lemmy.ca

            Thanks for clarifying, I didn’t know that. Seems like a bit of an overreach to me, but I suppose in this particular case it’s best to err on the side of caution.

        • @Mango@lemmy.world

          I don’t really think anything is 100% AI. I also don’t really believe in the concept of thought being a crime, and I extend that to personally kept data.

        • @NotMyOldRedditName@lemmy.world

          In the US, federally, you might be able to get away with creating the images for yourself if they are 100% fictional, but the guy was also doing commission work. The moment you start transmitting the images (and selling would involve that), it becomes very, very illegal.

    • @Zaktor@sopuli.xyz

      I have not personally explored AI porn, but as someone with experience in machine learning and accidental biases that’s not very surprising to me.

      On top of the general societal bias towards youth for “beauty”-related roles, smoother and less-featured faces (which generally look younger) are closer to an average face, so defaulting to that gets a bit of a training boost (when in doubt, target the mean). It’s probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of underage women that also have those keywords.

    • Flying Squid

      Most real porn has women who look like kids to me.

      Even the so-called MILFs look about 15 years younger than me and I’m 47.

      You have to get into “mature” and shit to see women my age.

      I’m not into young women. I’m just not. It looks like they’re fucking a high schooler and it’s icky to me.

      And then there’s all the schoolgirl and incest or incest-adjacent shit. “Playing with my stepdad.” No. Just no.

      • @Zorque@lemmy.world

        So… anyone who’s not your age looks like a child to you? That’s kind of fucked up.

        • Flying Squid

          No?

          The majority of the women in porn, who can’t be more than their very early 20s, look like children to me. And they infantilize them too. I’m not sure where you got anyone not my age from.

          • @Zorque@lemmy.world

            Fair, you didn’t explicitly state it. Just implied it with statements about how most people in porn (who should all be adults, unless you’re looking at questionable material) look like children to you, then made comments about how even the “milfs” are too young.

            Maybe it’s not about them being too young, maybe it’s time you accept that you’re old. You’re putting a lot of your own biases into your judgment instead of looking at it objectively.

            • Flying Squid

              Sorry… why should I look at what I personally want out of the porn I want to see objectively? It’s entirely subjective.

              I mean I’m not sure how I could have been clearer that this was about my personal preferences. I said “to me” twice.

  • Media Bias Fact Checker

    The Guardian - News Source Context

    Information for The Guardian:

    Wiki: reliable - There is consensus that The Guardian is generally reliable. The Guardian’s op-eds should be handled with WP:RSOPINION. Some editors believe The Guardian is biased or opinionated for politics. See also: The Guardian blogs.
    Wiki: mixed - Most editors say that The Guardian blogs should be treated as newspaper blogs or opinion pieces due to reduced editorial oversight. Check the bottom of the article for a “blogposts” tag to determine whether the page is a blog post or a non-blog article. See also: The Guardian.


    MBFC: Left-Center - Credibility: Medium - Factual Reporting: Mixed - United Kingdom


    https://www.theguardian.com/uk-news/2024/oct/28/man-who-used-ai-to-create-child-abuse-images-jailed-for-18-years
