The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the app presumably being used to create the images, which promotes its services with the slogan “Undress anybody with our free service!”

    • @taladar@feddit.de · 59 points · 2 years ago

      In the long term that might even lead to society stopping their freak-outs every time someone in some semi-sensitive position is discovered to have nude pictures online.

  • @rufus@discuss.tchncs.de · 49 points · edited · 2 years ago

    Interesting. Replika AI, ChatGPT, etc. crack down on me for doing erotic stories and roleplay text dialogues. And this Clothoff app happily draws child pornography of 14-year-olds? Shaking my head…

    I wonder why they have no address etc on their website and the app isn’t available in any of the proper app-stores.

    Obviously police should ask Instagram who is blackmailing all these girls… Teach them a proper lesson. And then stop this company. Fine them a few million for generating and spreading synthetic CP. At least write a letter to their hosting or payment providers.

      • @rufus@discuss.tchncs.de · 2 points · edited · 2 years ago

        I just hope they even try to catch these people. I’ve tried to look up who’s behind that and it’s a domain that’s with name.com and the server is behind Cloudflare. I’m not Anonymous, so that’s the point at which I’m at my wits’ end. Someone enraged could file a few reports at their abuse contacts… Just sayin…

        There’s always the possibility they just catch the boy and punish him, letting the even more disgusting people in the background keep doing what they want because it would be difficult to get hold of them. That would be the easiest route for the prosecutors and the least efficient way to deal with this issue as a whole.

      • @rufus@discuss.tchncs.de · 1 point · edited · 1 year ago

        I didn’t follow how the story turned out that closely. I think it was a schoolmate who did this. I kinda split up my answer because I think if a kid/minor is the offender, it’s not yet too late to learn how to behave (hopefully). But blackmailing people with nudes is a bit more than the usual bullying and occasional fight between boys we did back in the day. I trust some judge has a look at the individual case and comes up with a proper punishment that factors this in.

        What annoys me is the people who offer this service, advertise use-cases like this, and probably deliberately put no filters in place, not even for pictures of minors. I think they should be charged, fined and ultimately that business case should be banned. I (anonymously) filed a complaint after writing that comment in September. But they’re still online as of today.

        So in my opinion the kid should be taught a lesson and the company should pay for this and be closed for good.

  • @them@lemmy.world · 36 points · 2 years ago

    Yes, let’s name the tool in the article so everybody can participate in the abuse.

      • DarkThoughts · 9 points · 2 years ago

        Considering that AI services typically cost money, especially those advertising adult themes, it kinda does support the hosts of such services.

        • RaivoKulli · 8 points · 2 years ago

          Then again, naming and shaming puts pressure on them too. But in the end I doubt it matters. Those who want to use them will find them.

          • DarkThoughts · 2 points · 2 years ago

            Of course, which isn’t even the problem but rather people using the edited pictures for things like blackmail or whatever. From a technical standpoint it isn’t too dissimilar to the old photoshopping. Face swapping can probably even provide much higher quality results, especially if you have a lot of source material to pull from (you want like matching angles for an accurate looking result). Those AI drawn bodies often have severe anatomical issues that make them very obvious and look VERY different to their advertisement materials.

          • @30p87@feddit.de · 1 point · 2 years ago

            True. Especially as just googling ‘undress AI free’ yields tons of results which may be more or less legit.

    • @Rediphile@lemmy.ca · 5 points · 2 years ago

      You can literally Google ‘AI nude generation tool’ and get multiple results already. And I do sort of agree with you as I’m not sure how naming this specific tool was necessary or beneficial here. But I don’t think not naming it is going to prevent anyone interested in such a tool from finding one. The software/tool itself is (currently) not illegal.

  • rayyyy · 29 points · 2 years ago

    The shock value of a nude picture will become increasingly humdrum as they become more widespread. Nudes will become so common that no one will bat an eye. In fact, some less endowed, less perfect ladies will no doubt make AI-generated pictures or movies of themselves to sell on the internet. Think of it as Photoshop × 10.

      • @andrai@feddit.de · 38 points · 2 years ago

        I can already get a canvas and brush and draw what I think u/DessertStorms looks like naked and there is nothing you can do about it.

        • @ParsnipWitch@feddit.de · -1 point · 2 years ago

          The lack of empathy in your response is telling. People do not care for the effect this has on teenage girls. They don’t even try to be compassionate. I think this will just become the next thing girls and women will simply have to accept as part of their life and the sexism and objectification that is targeted at them. But “boys will be boys” right?

          • @Seudo@lemmy.world · -4 points · 2 years ago

            The number of people offering practical solutions instead of knee jerk feels… oh the humanity!

            • @ParsnipWitch@feddit.de · 1 point · edited · 2 years ago

              Demanding that people just accept that this will happen and shouldn’t feel bad about it is not a practical solution.

                • @ParsnipWitch@feddit.de · 1 point · 2 years ago

                  Because this is not a solution for the people who are actually victimized. It’s just a solution for the people around those who are victimized, so that they don’t need to change anything or talk (or listen) about it.

        • DessertStorms · -10 points · 2 years ago

          You’re not making the point you think you are, instead you’re just outing yourself as a creep. ¯\_(ツ)_/¯

      • @taladar@feddit.de · 24 points · 2 years ago

        Photoshopped nude pictures of celebrities (and people the photoshopper knew personally) have been around for at least 30 years at this point. This is not a new issue as far as the legal situation is concerned, just the ease of doing it changed a bit.

      • @Jax@sh.itjust.works · -19 points · 2 years ago

        Have you ever posted a photo on Facebook or Instagram?

        If the answer is yes, congratulations! You gave consent.

        • Black616Angel · 7 points · 2 years ago

          Please show me where exactly the terms and conditions mention the production and publication of ai generated nudes on those sites.

          Also eww, I would not want to be near you in real life.

          • @Jax@sh.itjust.works · -7 points · edited · 2 years ago

            You give them free rein to do literally whatever they want with your images the moment you post them. They OWN YOUR PHOTOS. The only reason you don’t know about it is because you’re fucking stupid and don’t read their terms of service.

            Signed: person who stopped using sites like Facebook and Instagram for this reason.

            Edit: Sorry, I realized that reading isn’t your strong suit which is why you demanded I sift through their ToS for you. It’s under the privacy section of Meta’s terms of service. Anything you post that is public immediately grants them the rights to your image.

            You ever put an image on Tinder through Facebook, congrats: consent achieved.

            I genuinely do not care if you are aware or otherwise. Your comment proves you’re fucking dumb, and deserve your images being used against you for not protecting yourself from predatory social media sites.

            • Black616Angel · 1 point · 2 years ago

              You are right, they own my photos. That of course doesn’t grant them the right to do anything with them, and it doesn’t give anyone else that right either. But what do you know? You are some lonely little shit harassing others online.

              Delete your CSAM collection and then yourself please. Do something for us all, thanks.

              • @Jax@sh.itjust.works · -2 points · 2 years ago

                Jesus christ, you’re a fucking idiot. Maybe if you went through English class without writing every report through sparknotes you’d have developed the critical thinking required to understand what a TERMS OF SERVICE agreement is.

                It’s not too late, you can always go back to school. Although, reading your replies, you’re still too fucking dumb to gain anything from it.

                • Black616Angel · 2 points · 2 years ago

                  Wow, you have to be one of the most stubborn, stupid, insolent, arrogant, self-absorbed assholes, I ever had the displeasure of exchanging words with.

                  Eat a dick!

            • @ParsnipWitch@feddit.de · 1 point · edited · 2 years ago

              Yep, women and girls should just stay away from social media. Also, do not appear in other types of photos. Best stay under a blanket at all times, since if some guy sees your face you gave him consent to do whatever he likes with it. You really are a pathetic human being if you don’t see the problem with your mindset.

              • @Jax@sh.itjust.works · -1 point · 2 years ago

                God you people are fucking dumb.

                It is in their TERMS OF SERVICE. IF YOU ACCEPT THEIR TERMS OF SERVICE WITHOUT READING THEM, YOU DO NOT GET TO COMPLAIN. THIS IS NOT NEW, YOU ARE JUST STUPID.

  • @negativeyoda@lemmy.world · 22 points · 2 years ago

    Can this come full circle so I can shirtcock it and later say, “dog, that’s AI” when people post pictures?

    • @benni@lemmy.world · 1 point · 2 years ago

      I don’t know about AI nudes. But with normal AI generated pics, they have a specific style and genericness to them. Don’t get me wrong, many AI generated pictures are hard to distinguish from real photographs. But on the other hand, many real photographs are easy to distinguish from AI generated pics. So you’d probably need to take the nudes in a specific way to have plausible deniability.

  • Margot Robbie · 22 points · 2 years ago

    Banning diffusion models doesn’t work; the tech is already out there and you can’t put it back in the box. Fake nudes used to be done with Photoshop; the current generative AI models just make them faster to produce.

    This can only be stopped on the distribution side, and any new laws should focus on that.

    But the silver lining of this whole thing is that nude scandals for celebs aren’t really possible any more if you can just say it’s probably a deepfake.

    • @GCostanzaStepOnMe@feddit.de · -5 points · 2 years ago

      Other than banning those websites and apps that offer such services, I think we also need to seriously rethink our overall exposure to the internet, and especially rethink how and how much children access it.

        • @GCostanzaStepOnMe@feddit.de · -8 points · edited · 2 years ago

          We’ll need an AI run police state to stop this technology.

          No? You really just need to ban websites that run ads for these apps.

  • @YurkshireLad@lemmy.ca · 21 points · 2 years ago

    Maybe something will change as soon as people start creating and distributing fake AI nudes of that country’s leaders.

  • @Sigmatics@lemmy.ca · 18 points · edited · 2 years ago

    The only thing new about this is that the photos are probably more realistic, but still fake. Apps to do this existed before GenAI was a thing.

  • @tetraodon@feddit.it · 17 points · 2 years ago

    I feel somewhat bad saying this, but the wo/man (it will be a man) who can make an Apple Vision Pro work with AI nudifiers will become rich.

    • @TheGreenGolem@lemm.ee · 13 points · 2 years ago

      You know the old joke: if we could do anything with just our eyes, the streets would be full of dead people and pregnant women.

    • @uxia@midwest.social · 7 points · 2 years ago

      Lol then people will probably start assuming anyone wearing that technology is a pedophile and/or disgusting creep.

        • @bitsplease@lemmy.ml · 2 points · 2 years ago

          I don’t see how it won’t, people are always going to be sketched out by the notion that the guy across from you could be recording you or taking pictures without your knowledge

          Yeah phones can kind of do the same, but it’s a lot harder to hide with a phone

          • @nicoweio@lemmy.world · 1 point · 2 years ago

            Assuming Apple locks down their device enough, it should make it pretty clear when it’s recording. Whether this notion becomes generally known and accepted, though, is a question in itself.

            • @bitsplease@lemmy.ml · 1 point · 2 years ago

              People already don’t trust the webcams on their own machines to not record them, even when they have hardwired indicator lights, I really doubt that they’ll suddenly trust tech that most people have no experience with to be frank.

              I don’t think it’ll be an issue with the Apple Vision Pro specifically though; unlike Google Glass, it’s not exactly convenient to wear when you go out on a regular basis. No one but an absolute weirdo is going to sit down at the bar wearing his Apple Vision Pro, it’d be like bringing your Quest 2 lol

  • @Aetherion@feddit.de · 5 points · 2 years ago

    Better don’t stop posting your life into the internet, this would push people to create more child porn! /s

    • @iegod@lemm.ee · 12 points · 2 years ago

      Definitely not down with banning. You can imagine nudity in your mind and redraw it. Do we ban thoughts and artists too? The AI isn’t the problem.

      • @pinkdrunkenelephants@sopuli.xyz · -3 points · 2 years ago

        No amount of false equivalencies will make me or anyone else accept something as stupid, dangerous and terrible as generative AI.

        It’s on you to accept you don’t have the right to have a robot think and be creative for you, and that poor girl is one of many reasons why.

        • @iegod@lemm.ee · 10 points · 2 years ago

          You’re riled up, I get it, but your statements are simply not factual, as much as you want them to be.

        • LoafyLemon · 0 points · 2 years ago

          It’s impossible to ban AI once it’s allowed for public use because technology spreads rapidly, and enforcing a ban becomes impractical due to its widespread adoption and the difficulty of regulating it effectively. But hey, if you want to make an ineffective ban that will only affect one small part of the world, irrelevant to the masses, be my guest.

          • @pinkdrunkenelephants@sopuli.xyz · 0 points · 2 years ago

            No it isn’t. We can and have banned awful, terrible shit that became widespread before, and we’ll do it again, including your precious AI that you’re dumb enough to let do your thinking for you. We’ll even jail you for using the things.

            That’s what laws are for and if we believe what you’re saying, then no law can exist.

            • LoafyLemon · 0 points · edited · 2 years ago

              Considering I’m not even a US resident, your government and laws cannot touch me, that’s how irrelevant your knee-jerk reactions are.

              Do you think China, India, or even members of the EU will stop developing AI because one country said so? Your expectations are highly unrealistic.

              • @pinkdrunkenelephants@sopuli.xyz · 2 points · 2 years ago

                Other countries can ban you, too.

                And I do think the EU is more likely even than us to ban you, or at least heavily regulate you.

                You’re living in a dream world if you think you can steal everyone else’s artwork en masse, use it to generate art for you and think you can get away with it. It’s going to happen. You’re going to get banned.

                • LoafyLemon · 0 points · 2 years ago

                  I’m fine making art on my own, without AI, but thanks for your concern.

  • @electrogamerman@lemmy.world · -2 points · 2 years ago

    This all would not be a problem if people appreciated nudism more. I’m not even talking about people being nudists, just about people accepting nudists. Once you take away the nudism taboo, all these photos won’t matter at all.

  • @duxbellorum@lemm.ee · -7 points · 2 years ago

    This seems like a pretty significant overreaction. Like yes, it’s gross and it feels personal, but it’s not like any of the subjects were willing participants…their reputation is not being damaged. Would they lose their shit about a kid gluing a cut out of their crush’s face over the face of a pornstar in a magazine? Is this really any different from that?

      • blargerer · 8 points · 2 years ago

        Obviously this is creepy, but the technology is out there, one of those can’t-put-the-genie-back-in-the-bottle techs. You can and should look at the people generating the images as creeps, but ultimately we as a society need to learn to not put as much veracity or identity in images now.

        With that said where the fuck did this model get its training data for 14 year olds. That sounds like a more serious issue.

          • @taladar@feddit.de · 4 points · 2 years ago

            Not that they aren’t both bad, but I hate this false equivalence between images that were created by literally raping a child and filming that rape, and images that were created purely from the imagination of the creator. Treating both identically in legal terms is what actually enables child abuse: to the person attracted to children you suddenly made the cost identical, while they probably prefer the real thing to a fake thing.

              • @taladar@feddit.de · 1 point · 2 years ago

                They’re both terrible and illegal to different degrees.

                But most people and most legal jurisdictions do not make that distinction, and that is my point. I am not saying either should be legal, but at the very least one should carry much lower punishments, in a similar way that possession of stolen goods and possession of murder weapons are both punished, but not with the same severity.

        • @barsoap@lemm.ee · 9 points · 2 years ago

          With that said where the fuck did this model get its training data for 14 year olds.

          Nowhere, at least for any model you could get your hands on in public places like civitai. Or, well, it’s not like they can tell whether someone trained on those kinds of pictures, but they’re rightly nuking any underage/loli example images, as well as anyone who posts them, from orbit.

          Generally speaking models can be very good at mixing concepts they have an understanding of, say a giraffe with zebra stripes, but that doesn’t mean that you can just combine anything – if you try to generate a nude human with zebra fur you’re bound to get body paint, random skimpy zebra-striped clothing, or at most a fursuit, not convincing fur, unless you use a model trained by furries, but at that point you’ll have trouble generating faces without muzzles: the AI just doesn’t know what actual zebrakin look like, so it’s either copping out or making stuff up.

          I’ve never tried, nor am I remotely attracted to that age range, but I wouldn’t be surprised if a paedophile would complain “these aren’t kids, they’re scaled-down adults”. Things like the difference between budding and small breasts: ask a biologist, I haven’t seen 14-year-old breasts in over two decades.

          On another note though I’d much rather have paedophiles jack off to generated images than doing anything involving actual children, including creeping around. Lesser of two evils and all that. Therapy, of course, is preferable to both.

          • @ParsnipWitch@feddit.de · 1 point · 2 years ago

            How do you know that these people replace harassment with these pictures? And not just do both, or even increase their fetishes?

            What about the girls whose pictures were used as material for these generated images?

            • @barsoap@lemm.ee · 0 points · 2 years ago

              How do you know that these people replace harassment with these pictures? And not just do both, or even increase their fetishes?

              AFAIK psychologists simply don’t know, and it might be a case by case thing.

              What about the girls whose pictures were used as material for these generated images?

              As I explained, it might not be necessary to have any underage material in the training data.

              Generally speaking I didn’t come here to have a deep discussion about a very difficult moral and legal issue, I’ll leave that up to the specialists. I wanted to say something about AI and somehow all answers I get are about the last tacked-on paragraph making a quick statement about me preferring keeping paedophiles away from kids.

            • @barsoap@lemm.ee · 10 points · 2 years ago

              I’ll leave the judgement of that to psychologists. What should not be controversial, however, is the amount of direct harm avoided if one can be replaced by the other.

              Don’t let the perfect be the enemy of the less shitty.

                • @barsoap@lemm.ee · 4 points · edited · 2 years ago

                  that you suggested

                  I did not suggest anything. I expressed a preference: that it’s better if a paedophile jacks off to generated pictures than if they molest actual children. What do you disagree with there? That both situations are equally bad, that an equal amount of harm is occurring? Have you ever asked a victim about that?

                  There are laws in place about sexualizing minors.

                  Just for the record: Not by a far stretch all countries outlaw drawings, fiction, etc., but only as the German term goes “documents of child abuse”.

                  You can’t just hand wave my response away

                  You mean your accusation, and I tend to do that for civility’s sake, as doing otherwise tends to result in shouting matches. It is AFAIK currently unknown whether, by and large, paedophiles having access to simulated material for their sexual gratification increases or decreases the incidence of child abuse happening. I have no idea either, you don’t know better either, and it may very well differ on a case-by-case basis. All I’m saying is that I’d rather have them fapping than molesting children. Is that so hard to understand, and why in the everloving fuck would you disagree with it? If anything it’s you who’s trivialising child abuse (and, look, see, I stopped ignoring your incitement and we’re in an accusatory shouting match).

        • krellor · 5 points · 2 years ago

          Right, the technology is out there so we as a society need to establish norms, customs, and yes, laws governing its use.

          I’m pretty firmly on the side of there being legal consequences for taking pictures of real minors, running them through a service to create nude replicas, and then circulating those pictures. That is wrong on so many levels and could constitute any number of crimes even without the AI component, such as harassment. I mean, intentionally using someone’s likeness to circulate embarrassing materials already has legal consequences. This is just a whole other level of ick on top.

          • @taladar@feddit.de · 2 points · 2 years ago

            Personally I don’t see a difference between using an AI service or plain old Photoshop to create a fake nude picture of someone. Both should be punished in the same way and if law makers haven’t caught up with the Photoshop version after 30 years they likely won’t handle the AI version in this century either.

              • @taladar@feddit.de · 1 point · 2 years ago

                I don’t see the difference of photoshopping a convincing nude of the same minor vs. using AI to generate a nude of the same minor.

            • krellor · 3 points · 2 years ago

              I would agree, though I wonder about the service mentioned that is dedicated to the process. My comment was in response to someone who seemed to think circulating fake nudes wasn’t a problem, regardless of how they were generated.

      • @duxbellorum@lemm.ee · 1 point · 2 years ago

        It’s not different, it’s all fake, cobbled together from images of other people’s bodies and will show zero authentic details about the subject except what are already known and visible about them.

        What the fuck are you talking about? Spreading nude photos of any provenance around at work is definitely an HR violation, and the use of my partner’s face in them (just like pasting their face on a pornstar’s photo) is sexual harassment. Nothing about it being AI generated changes any of that equation except to make it a little more uncanny.

        It’s a fad, and how would we deal with you sending your hand-drawn pictures around the neighborhood? Form a group of concerned moms and raid all of the local art shops to stop the sale of drawing materials?

        The genie is out of the bottle. We can shower these types of content with huge attention, which will ultimately extend and expand the fad; we can ignore them because they are pointless; or we can try a futile war on AI porn that, like the war on drugs, will ruin a lot of ultimately benign people’s lives in order to crack down on a few legitimately criminal creeps who probably can already be prosecuted under existing laws.

        • @ParsnipWitch@feddit.de · 4 points · edited · 2 years ago

          Why is it not sexual harassment if the target are teenage girls?

          In my opinion there should be really impactful punishment for the people who did this. Otherwise there will be more and more people like you who seem to think this is a funny little school prank.

      • @duxbellorum@lemm.ee · -3 points · edited · 2 years ago

        Why? They didn’t take or share any nudes, and nobody believes they did.

        This is only a nightmare if an ignorant adult tells them that it is.

        • @0x815@feddit.de (OP) · 3 points · 2 years ago

          @duxbellorum

          Why? They didn’t take or share any nudes, and nobody believes they did.

          This is only a nightmare if an ignorant adult tells them that it is.

          So you don’t have children, right?

        • @ParsnipWitch@feddit.de · -1 point · edited · 2 years ago

          Did your picture get taken and shared when you were a teenager? Were you heavily sexualised and harassed? Believe me, it feels like a nightmare even if no one is telling you that it should.

          Take your “sexual harassment is only bad to teenage girls if you tell them” shit elsewhere.