Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • @books@lemmy.world · 87 · 1 year ago

    I feel like I live on the Internet and I never see this shit. Either it doesn’t exist or I exist on a completely different plane of the net.

    • @GBU_28@lemm.ee · 13 · 1 year ago

      You ever somehow get invited to a party you’d usually never be at? With a crowd you never, ever see? This is that.

    • @schnurrito@discuss.tchncs.de · 12 · 1 year ago

      On the Internet, censorship happens not through too little information, but through so much information that it’s difficult to find what you want.

      We all have only so much time to spend on the Internet and so necessarily get a filtered experience of everything that happens on the Internet.

        • @schnurrito@discuss.tchncs.de · 1 · 1 year ago

          No, that is not what I’m saying, mostly because I don’t think it is true. I’m saying that nowadays nearly every kind of information one can think of is out there somewhere on the Internet; but if it is only in relatively obscure places and you don’t know where to look for it, then it is still de facto censored by the sheer volume of other information out there.

          • @eatthecake@lemmy.world · 1 · 1 year ago

            I don’t think that can be called censorship, as it is not deliberate suppression of info; it’s just being drowned out or ignored.

  • @BeefPiano@lemmy.world · 49 · 1 year ago

    I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

    Probably a lot of pain for women from mouth breathers before we get from here to there.

    • @thantik@lemmy.world · 15 · 1 year ago

      This has already been a thing in courts with people saying that audio of them was generated using AI. It’s here to stay, and almost nothing is going to be ‘real’ anymore unless you’ve seen it directly first-hand.

    • @TwilightVulpine@lemmy.world · 8 · 1 year ago

      Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it’s “fake” it’s not going to negatively affect them?

      The mouth breathers will never go away. They might even use the excuse the other way around: since anyone could claim just about anything is fake, then maybe it’s real and the victim is lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

      Hell, even if others believe it is fake, wouldn’t it still be humiliating?

      • Æther · 9 · 1 year ago

        I think you’re underestimating the potential effects of an entire society starting to distrust pictures/video. Yeah a blurry Bigfoot fooled an entire generation, but nowadays most people you talk to will say it’s doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have it be indistinguishable from real life? I think there’s a pretty good chance that “nope, that’s a fake picture of me” will be a believable, no question response to just about anything. It’s a problem

        • @TwilightVulpine@lemmy.world · -1 · 1 year ago

          There are still people who believe in Bigfoot and UFOs, and there are still people falling for hoaxes every day. To the extent that distrust is spreading, it’s not manifesting as widespread reasonable skepticism but as a tendency to double down on what people already believe. There are more flat earthers today than there were decades ago.

          We are heading to a point where, if anyone says deepfake porn is fake, regardless of reasons and arguments, people might just think it’s real because they feel like it might be. At this point, this isn’t even a new situation. Just like people skip reputable scientific and journalistic sources in favor of random blogs that validate what they already believe, they will treat images, deepfaked or not, much the same way.

          So, at best, some people might believe the victim regardless, but some won’t no matter what is said, and they will treat them as if those images are real.

          • @daltotron@lemmy.world · 2 · 1 year ago

            This strikes me as correct; it’s more complicated than the blanket statement of “oh, everyone will have too calloused a mind to believe anything ever again.” People will just try to intuit truth from the surrounding context, much like they already do with our current everyday reality, where I’m really just a brain in a vat or whatever.

        • @eatthecake@lemmy.world · -2 · 1 year ago

          I hope someone sends your mom a deepfake of you being dismembered with a rusty saw. I’m sure the horror will fade with time.

          • Æther · 2 · 1 year ago

            What a horrible thing to wish on a random person on the internet. Maybe take a break from being so reactionary, jesus.

      • @fine_sandy_bottom@discuss.tchncs.de · 1 · 1 year ago

        The default assumption will be that a video is fake. In the very near future you will be able to say, “voice assistant thing, show me a video of that cute girl from the cafe today getting double teamed by RoboCop and an Ewok wearing a tutu.” It will be so trivial to create this stuff that the question will be “why were you watching a naughty video of me” rather than “omg I can’t believe this naughty video of me exists.”

    • @Cheskaz@lemmy.world · 7 · 1 year ago

      Australia’s federal legislation making non-consensual sharing of intimate images an offense includes doctored or generated images because that’s still extremely harmful to the victim and their reputation.

  • FenrirIII · 32 · 1 year ago

    Went to Bing and found “Taylor Swift AI pictures” as a top search. LOTS of images of her being railed by Sesame Street characters.

  • @ObsidianZed@lemmy.world · 31 · 1 year ago (edited)

    People have been doing these for years even before AGI.

    Now, it’s just faster.

    Edit: Sorry, I suppose I should mean LLM AI

      • Joe · 11 · 1 year ago

        There is no point waiting for a response…the threat has been neutralized. Now repeat after me: There is no AGI.

  • @EatATaco@lemm.ee · 31 · 1 year ago

    God what a garbage article:

    “On X—which used to be called Twitter before it was bought by billionaire edgelord Elon Musk”

    I mean, really? The guy makes my skin crawl, but what a hypocritically edgy comment to put into an article.

    And then there’s zero comment from Taylor Swift in it at all. The author is basically just speaking for her. Not only that, but she anoints herself spokesperson for all women…while also pretty conspicuously ignoring that men can be victims of this too.

    Don’t get me wrong, I’m not defending non-consensual AI porn in the least, and I assume the author and I are mostly in agreement about the need for something to be done about it.

    But it’s trashy, politically charged, and biased articles like this that make people take sides on issues like this. IMO, the author is contributing to the very problems of society she probably wants to fix.

    • @Psythik@lemmy.world · 17 · 1 year ago

      On the contrary, I find it more ridiculous when news media pretends like nothing is wrong over at Twitter HQ. I wish more journalists would call Musk out like this every time they’re forced to mention Twitter.

      • @EatATaco@lemm.ee · 4 · 1 year ago

        Can you really see nothing other than “pretending nothing is wrong” and “calling Musk an edgelord”?

        I see the media calling out the faults regularly without needing to act like… well, an edgelord.

        • @Psythik@lemmy.world · -3 · 1 year ago

          Professionalism was thrown out the window the moment orange man became president. The Republicans play dirty, so everyone else has to as well, or else they’ll walk all over us. Taking the high ground is a dead concept.

    • @Powerpoint@lemmy.ca · 5 · 1 year ago

      I disagree. To pretend nothing is wrong is worse. The author was accurate in their description here.

      • @EatATaco@lemm.ee · 0 · 1 year ago

        This is the second poster here who can’t seem to understand that there is a whole world of things between “pretending nothing is wrong” and acting like a child by calling people “edgelord.”

        Last time I checked, on my front page there was an article from the NY Times about how X is spreading misinformation and Musk seems to be part of it. Yet they managed to point out this problem without using the term “edgelord.” Is this shocking to you?

  • @Stanwich@lemmy.world · 30 · 1 year ago

    WHAT?? DISGUSTING! WHERE WOULD THESE JERKS PUT THIS ? WHAT SPECIFIC WEBSITE DO I NEED TO BOYCOTT?

  • @DarkMessiah@lemmy.world · 17 · 1 year ago

    And this is why I don’t want to be famous. Being famous exposes your name to the crazies of the world, and leaves you blissfully unaware until the crazies snap.

  • @Neil@lemmy.ml · 16 · 1 year ago

    I’m not saying she shouldn’t have complained about this. She has every right to, but complaining about it definitely made the problem a lot worse.

      • @UnderpantsWeevil@lemmy.world · 7 · 1 year ago

        Except she’s the most famous woman in the country, a well-established sex symbol, and already the subject of innumerable erotic fantasies and fictions.

        It’s the same problem as “The Fappening” from forever ago. The fact that this exists is its own fuel, and whether she chooses to acknowledge it or not is a moot point. Someone is going to talk about it and the news will spread.

    • @SpaceCowboy@lemmy.ca · 4 · 1 year ago

      This is a problem that doesn’t just affect her, though. Not discussing it means it doesn’t get worse for her, but it does continue to happen to other people.

      Discussing it means it gets worse for her, but solutions could potentially be found. Solutions that would help her and the other people affected.

      Worst case scenario, no solution is found, but the people making AI porn make more Taylor Swift AI porn, which means fewer resources are devoted to making AI porn of other people. This makes things worse for Swift but better for other people.

      TLDR; Taylor Swift is a saint and is operating at a level that us petty sinners can’t comprehend.

    • @LeroyJenkins@lemmy.world · 2 · 1 year ago

      She drew so much attention to it, though, that there are more news stories than actual images at this point. If you look for the images, you’re gonna have to go through pages and pages of news articles. Not sure if it was intentional, but it kinda worked…

    • @dlok@lemmy.world · 0 · 1 year ago

      This is probably a good thing; even if real nudes leaked, nobody would know whether they’re real.

  • @Mango@lemmy.world · 15 · 1 year ago

    Nightmare? Doesn’t it simply give them the chance to just say any naked pic of them is fake now?

  • Dariusmiles2123 · 11 · 1 year ago

    At least now, if pictures are real, you can say they’re AI generated.

    Still, to be honest, I’ve never understood how some people can let one-night stands film them naked.

    If it’s a longtime girlfriend or boyfriend and they betray you, it’s different, but people aren’t acting in a clever way when it comes to sex.

    • @interdimensionalmeme@lemmy.ml · 8 · 1 year ago

      There’s nothing wrong with recording your naked body and it being seen online by willing persons.

      The people who would disrespect you for it, they’re the problem.

      • Dariusmiles2123 · 11 · 1 year ago

        That’s not what I’m talking about.

        I’m talking about not being careful about who you give these images to if you don’t want them to spread online. And, of course, the person sharing them on the web is the guilty one, not the naked victim.

        • @TwilightVulpine@lemmy.world · 3 · 1 year ago

          Well, this very situation shows that one can be as careful as possible and still have porn of themselves spread everywhere.

          • Dariusmiles2123 · 3 · 1 year ago

            Yeah it’s true.

            But at least it’s not your real intimacy this time.

            Still I understand how traumatic it can be, especially for young people.

  • bean · 10 · 1 year ago

    Well, targeting someone famous and going overboard with it likely results in legal responses. Perhaps this gets deepfakes the attention they need to be regulated or made legally punishable. Especially when they target underage children.

  • KᑌᔕᕼIᗩ · 10 · 1 year ago

    Some of them are really good too, in a realistic sense. You can tell they are AI though.

  • Nora · 3 · 1 year ago

    I feel like this is amazing. Everyone can get any porn they want and no one gets hurt? And if nudes of anyone are ever “leaked” you could just say they’re AI generated. It’s like a win-win-win.

    • @fidodo@lemmy.world · 13 · 1 year ago (edited)

      Spreading it is a shitty thing to do. If you want to make it in private, that’s your business, but don’t spread it around. It’s not like it’s hard to make.